Friday, January 7th 2022

AMD Radeon RX 6500 XT Limited To PCIe 4.0 x4 Interface

The recently announced AMD Radeon RX 6500 XT features only a PCIe 4.0 x4 interface, according to specifications and images of the card published on the ASRock site. This is equivalent in bandwidth to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection, and is a step down from the Radeon RX 6600 XT, which features a PCIe 4.0 x8 interface, and the Radeon RX 6700 XT with its PCIe 4.0 x16 interface. Only ASRock specifies this detail; AMD, Gigabyte, ASUS, and MSI do not mention the PCIe interface on their respective pages. The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards, omitting H264/HEVC encoding and AV1 decoding.
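As a rough illustration of that bandwidth equivalence, the sketch below (our own, using the published per-lane PCIe transfer rates and encoding overheads; the helper name is illustrative) computes approximate one-direction link bandwidth for each configuration:

```python
# Approximate per-lane PCIe throughput: raw transfer rate (GT/s) scaled by
# encoding efficiency (PCIe 2.0 uses 8b/10b, PCIe 3.0/4.0 use 128b/130b).
PCIE_LANE_GBITS = {
    2: 5.0 * 8 / 10,      # ~4.0 Gbit/s per lane
    3: 8.0 * 128 / 130,   # ~7.9 Gbit/s per lane
    4: 16.0 * 128 / 130,  # ~15.8 Gbit/s per lane
}

def link_bandwidth_gbytes(gen: int, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return PCIE_LANE_GBITS[gen] * lanes / 8  # bits -> bytes

for gen, lanes in [(4, 4), (3, 8), (2, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{link_bandwidth_gbytes(gen, lanes):.1f} GB/s")
# All three configurations land at roughly 7.9-8.0 GB/s, which is the
# equivalence described above.
```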
Sources: ASRock (via VideoCardz), 3DCenter

118 Comments on AMD Radeon RX 6500 XT Limited To PCIe 4.0 x4 Interface

#51
ShurikN
Jism: Yep.

Lots of engineers on these forums these days, whose skills obviously exceed those of AMD's professionals.
You mean the same AMD professionals that made the 5500 XT 4GB?
Posted on Reply
#52
Chrispy_
stimpy88: AMD are clearly taking the piss with this card. I thought nGreedia were bad, but this card is bordering on the unusable.
Unusable? No. It'll work just fine.
Even if the x4 interface limits it slightly, it'll be way faster than the integrated graphics or GTX 750Ti it's replacing, for example.

It's just that some people will complain that they were promised 60 fps in Fortnite on max settings and they're only getting 56 fps; they'll have a valid complaint.
Posted on Reply
#53
InVasMani
I don't think it'll have a major impact overall; the cost savings are probably more noticeable. It's probably worth the trade-off from AMD's perspective, and in the end worth it for the consumer as well, even if it looks less ideal on paper from a technical standpoint. I do wish they would've used the untapped PCIe slot bandwidth for M.2 devices rather than just leaving those lanes vacant. It would be worth the added cost to integrate 2-3 M.2 slots, especially for SFF ITX and Micro ATX builds. There is even more to be gained with PCIe 5.0 in terms of putting unused lanes to work, and it would be a shame if Intel didn't take advantage of that on its GPUs. The sad part is AMD has already demonstrated putting an M.2 slot on a GPU.
Posted on Reply
#54
Bongo_
Tomorrow: The point is that people who use it on a PCIe 3.0 or perhaps even 2.0 board will also be limited to an x4 link, but with much less bandwidth than 4.0 x4 would provide. Obviously 4.0 x4 is just fine for this card, but it may not be for 3.0 or 2.0 users.

Based on TPU's GPU database, and assuming the 6500 XT has roughly the performance of a GTX 980, it could lose up to 14% with 2.0 and up to 6% with 3.0: www.techpowerup.com/review/nvidia-gtx-980-pci-express-scaling/21.html
Chrispy_: The GTX 1080 loses 8% of its performance when run at PCIe 3.0 x4
The GTX 980 loses 5% of its performance when run at PCIe 3.0 x4

It's fair to say that the 6500 XT stands to lose at least 5% of its performance when used in a PCIe 3.0 system, as it's likely to fall somewhere between those two cards.

If you have a PCIe 3.0 system you plan to put a 6500 XT into, it's worth bearing in mind that you're not getting the advertised performance, but 92-95% of what you'd otherwise get is still good enough that it's not a deal-breaker.
From the article:
This is equivalent to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection.
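For reference, here is the arithmetic behind those figures as a small sketch (illustrative only; the 5% and 8% losses are the GTX 980 and GTX 1080 results quoted above from TPU's PCIe scaling reviews, not 6500 XT measurements):

```python
# Fraction of performance lost at PCIe 3.0 x4, per the figures quoted above.
loss_at_pcie3_x4 = {"GTX 980": 0.05, "GTX 1080": 0.08}

# If the 6500 XT lands somewhere between those two cards, the honest estimate
# is simply the range spanned by their losses.
low, high = min(loss_at_pcie3_x4.values()), max(loss_at_pcie3_x4.values())
print(f"Expected retained performance on PCIe 3.0: "
      f"{(1 - high) * 100:.0f}-{(1 - low) * 100:.0f}% of PCIe 4.0")
# -> Expected retained performance on PCIe 3.0: 92-95% of PCIe 4.0
```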
Posted on Reply
#55
trsttte
Bongo_: From the article:
This is equivalent to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection.
You misunderstood: the bandwidth of PCIe 4.0 x4 is in fact equivalent to 3.0 x8 and 2.0 x16, but this card will only be able to run at 4.0 x4, 3.0 x4, or 2.0 x4 (PCIe is backwards compatible, but lanes are lanes and can't be split; you can't split one PCIe 4.0 lane into two 3.0 lanes).
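A minimal sketch of that behavior, assuming only standard PCIe link training (the link runs at the lowest common generation and narrowest common width); the function name and per-lane figures here are illustrative:

```python
def negotiated_link(card_gen, card_lanes, slot_gen, slot_lanes):
    """A PCIe link trains to the lowest common generation and narrowest common width."""
    return min(card_gen, slot_gen), min(card_lanes, slot_lanes)

# Approximate usable bandwidth per lane, in GB/s (one direction).
GBYTES_PER_LANE = {2: 0.5, 3: 0.985, 4: 1.969}

for slot_gen in (4, 3, 2):
    gen, lanes = negotiated_link(4, 4, slot_gen, 16)  # RX 6500 XT: PCIe 4.0 x4
    print(f"In a PCIe {slot_gen}.0 x16 slot: runs at {gen}.0 x{lanes}, "
          f"~{GBYTES_PER_LANE[gen] * lanes:.1f} GB/s")
# The card never gets more than four lanes, so a 3.0 or 2.0 board drops it
# to roughly 3.9 GB/s or 2.0 GB/s instead of ~7.9 GB/s.
```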
Posted on Reply
#56
GoldenX
Jism: Yep.

Lots of engineers on these forums these days, whose skills obviously exceed those of AMD's professionals.
You mean the people that basically invented Vulkan and are still trying to figure it out? Yeah, sure.
Posted on Reply
#57
Fluffmeister
Ultimately it's a little thing, but those over the years wishing the "evil competition" would disappear are slowly learning the good guys can quickly become monsters.
Posted on Reply
#58
watzupken
In my opinion, the specs for this card are way too gimped for it to be good. But "good" really depends on the price. AMD's MSRP sounds about right for a budget card in times like this; the problem is whether it will turn up at MSRP or at 2x the MSRP. Anything more than MSRP is not worth it, at least for me. The thing that annoys me most is the removal of AV1 decode. How much does it cost to add that feature, when cheap ARM SoCs can support AV1 decode?
Bongo_: From the article:
This is equivalent to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection.
It is if you are using it in a PCI-E 4.0 slot. The problem is that budget gamers may want to just upgrade their GPU, and I believe most people are still using a system with PCI-E 3.0 support. Even AMD themselves are selling Zen 3 APUs with PCI-E 3.0-only support. I wonder: do they save a lot of money by cutting the PCI-E lane count in half? Are they that tight-fisted that x8 is not possible?
Posted on Reply
#59
Selaya
This is AMD intentionally being cheap. The PCIe 4.0 support is just a charade and a strawman; nobody who can afford a PCIe 4.0 platform will buy this card. The best budget options are the 10100F and 10400F, and will in all likelihood remain so for at least another year; the H610 boards are just too expensive. The 10100F and maybe the 10400F are what you'd want to pair with such a card, except, because fuck you that's why, now we're stuck on 3.0 x4 and losing ~4% of performance. Because AMD can.

Yeah.
Fuck you too AMD.
Posted on Reply
#60
RJARRRPCGP
WTF? AV1 is not the proprietary stuff that H.264 and HEVC are.
Posted on Reply
#61
kruk
Fluffmeister: Ultimately it's a little thing, but those over the years wishing the "evil competition" would disappear are slowly learning the good guys can quickly become monsters.
If they are monsters for you, what is the "evil competition" then? Morgoth? Even with stupid decisions like this they are still the least evil of the trio, and it seems it will stay like that for a long time ...
Posted on Reply
#62
ExcuseMeWtf
Selaya: The best budget options are the 10100F and 10400F, and will in all likelihood remain so for at least another year; the H610 boards are just too expensive.
Nah, I ordered an i3 12100F + Gigabyte B660M Gaming mATX for roughly the same price as I'd pay for a 10400F + similarly featured B560 mobo. I will have to wait an extra week for delivery, but that's not a problem for me.

That said, the missing codecs thing is def concerning for this card.
Posted on Reply
#63
Tartaros
Chrispy_: Don't be so sure; the 6600 XT clearly saturates the bus occasionally, with a measurable performance drop at PCIe 3.0 x8:

[PCIe scaling chart]

The 6500 XT is half the performance, but PCIe 3.0 x4 is also half the bandwidth, implying that the 6500 XT will very likely saturate the bus.

The performance drop caused by putting a 6500 XT into a PCIe 3.0 slot could easily be the difference between the 98% and 93% results in the chart above, if everything scales linearly (it doesn't, but the factors that scale non-linearly might cancel each other out; we'll have to wait for real-world PCIe scaling tests like the one above to know for sure).
Well, a 1080p card for Apex and Fortnite. They are banking on the likelihood that this card goes into a kid's PC, and a kid who doesn't know anything about computers won't even notice cuts here and there, 4-5 fps less maybe?
ExcuseMeWtf: That said, the missing codecs thing is def concerning for this card.
And the price point. I can live without a few fps in games; cutting video decoding is just a plain bad decision. These cards could have been reused as HTPC cards in the future when the kid's PC is rebuilt, and this just gimps their reusability. Precisely what low-end cards need is all the wacky non-gaming uses; that's how a lot of 9300 GTs survived as PhysX cards back then.
Posted on Reply
#65
Fluffmeister
kruk: If they are monsters for you, what is the "evil competition" then? Morgoth? Even with stupid decisions like this they are still the least evil of the trio, and it seems it will stay like that for a long time ...
Please don't defend them.
Posted on Reply
#66
stimpy88
Chrispy_: Unusable? No. It'll work just fine.
Even if the x4 interface limits it slightly, it'll be way faster than the integrated graphics or GTX 750Ti it's replacing, for example.

It's just that some people will complain that they were promised 60 fps in Fortnite on max settings and they're only getting 56 fps; they'll have a valid complaint.
A second-hand gaming laptop for double the price of this card would perform the same, and is a whole, portable system. This card has no future: the kids who will want it will be pissed at the performance, as it will be slower than their friends' consoles, and it's finished off by the fact that they'll also find out they can't stream video of their gameplay. That makes it a hard pass, an almost useless card. For once, I'd pay nGreedia the $50 more and get a real graphics card.
Posted on Reply
#67
seth1911
What a junk card: 64-bit and only x4.
But at $200 for this one, a 3050 8GB looks very cheap with its $249 MSRP against this 6500.

Primarily, the card would be a choice for users with older hardware, and then they only get PCIe 3.0 x4;
it's a slower connection than a GT 1030.
Posted on Reply
#68
kruk
Fluffmeister: Please don't defend them.
I'm not defending them, I just stated the facts. GPP, vendor lock-in, anticompetitive buyouts, etc. are far more damaging (and longer-lasting) to the ecosystem than selling a crippled card that should cost $100 for $200. Prove me wrong.

I really won't care too much if you blast the card, but please don't make it look like AMD's design decisions here are comparable with the super shady stuff the "evil competition" does. It just makes the "evil competition" look better and enables them to do more shady stuff.
Posted on Reply
#69
Fluffmeister
I fully expected you to do a "yeah, but Nvidia" post, and you delivered.
Posted on Reply
#70
Selaya
ExcuseMeWtf: Nah, I ordered an i3 12100F + Gigabyte B660M Gaming mATX for roughly the same price as I'd pay for a 10400F + similarly featured B560 mobo. I will have to wait an extra week for delivery, but that's not a problem for me.

That said, the missing codecs thing is def concerning for this card.
Well, you're an enthusiast (on a budget, but nonetheless).
Average people on a budget like this will just score the cheapest, bottom-of-the-barrel board and call it a day. At this price bracket w/ a 65W part, there's nothing wrong w/ that either. And it'll be quite a bit cheaper than a B660.
Posted on Reply
#71
TheoneandonlyMrK
Proper troll fest now, eh? Lol, looks like a hate fest in here; meanwhile, no reviews, no tests, just hyperbolic BS.

I await reviews. And, like most commenters, I already have a better GPU; I wouldn't be buying it, it wouldn't matter to me, and I would experience no butthurt.
Posted on Reply
#72
AusWolf
"This is equivalent to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection" - In bandwidth yes, but one needs to be careful with PCI-e 3.0 and 2.0 motherboards, as those will still run the card at x4.

"The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards including the exclusion of H264/HEVC encoding and AV1 decoding." - I can't understand or endorse AMD's decision on that. Even the HTPC market will look away now.
Posted on Reply
#73
ExcuseMeWtf
Selaya: Well, you're an enthusiast (on a budget, but nonetheless).
Average people on a budget like this will just score the cheapest, bottom-of-the-barrel board and call it a day. At this price bracket w/ a 65W part, there's nothing wrong w/ that either. And it'll be quite a bit cheaper than a B660.
Fair enough about myself.
But if someone is looking at bottom of the barrel stuff, they should not even be thinking about this card for a while to begin with.
Because, let's face it, it WILL be price inflated, even if it won't make a good miner. Demand WILL be crazy, and sellers WILL take advantage of it regardless. At best I expect it to match price/performance of those already inflated "miner approved" cards, and for some time after launch maybe even worse.
Posted on Reply
#74
windwhirl
ExcuseMeWtf: But if someone is looking at bottom of the barrel stuff, they should not even be thinking about this card for a while to begin with.
... I'd argue they wouldn't even look at graphics cards, but rather at whatever IGP is available in Intel's or AMD's CPUs.
Posted on Reply
#75
stimpy88
TheoneandonlyMrK: Proper troll fest now, eh? Lol, looks like a hate fest in here; meanwhile, no reviews, no tests, just hyperbolic BS.

I await reviews. And, like most commenters, I already have a better GPU; I wouldn't be buying it, it wouldn't matter to me, and I would experience no butthurt.
If you're unable to make decisions yourself and need a review to tell you it is crap, then I have plenty of other crap I'd like to sell you... I guess I'll start on some "reviews" first.

I guess some people are born to ridicule public dissent and hand their autonomy over to something with a logo, while sneering at those clever enough to have seen it coming. We have seen a lot of these types over the last few years.
Posted on Reply