Friday, January 7th 2022

AMD Radeon RX 6500 XT Limited To PCIe 4.0 x4 Interface

The recently announced AMD Radeon RX 6500 XT only features a PCIe 4.0 x4 interface, according to specifications and images of the card published on the ASRock site. In bandwidth terms this is equivalent to a PCIe 3.0 x8 link or a PCIe 2.0 x16 connection, and it is a step down from the Radeon RX 6600 XT, which features a PCIe 4.0 x8 interface, and the Radeon RX 6700 XT with its PCIe 4.0 x16 interface. This detail is only specified by ASRock; AMD, Gigabyte, ASUS, and MSI do not mention the PCIe interface on their respective product pages. The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards, omitting H.264/HEVC encoding and AV1 decoding.
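For readers who want to sanity-check the bandwidth equivalence above, here is a minimal sketch in Python using approximate per-lane throughput figures (after link encoding overhead); the numbers are ballpark values, not vendor specifications.

# Approximate one-direction PCIe throughput per lane, in GB/s,
# after 8b/10b (Gen2) and 128b/130b (Gen3/Gen4) encoding overhead.
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth("4.0", 4))   # ~7.9 GB/s: the RX 6500 XT in a Gen4 slot
print(link_bandwidth("3.0", 8))   # ~7.9 GB/s: roughly equivalent
print(link_bandwidth("2.0", 16))  # ~8.0 GB/s: roughly equivalent
print(link_bandwidth("3.0", 4))   # ~3.9 GB/s: what an x4 card gets in a Gen3 board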
Sources: ASRock (via VideoCardz), 3DCenter

118 Comments on AMD Radeon RX 6500 XT Limited To PCIe 4.0 x4 Interface

#76
TheoneandonlyMrK
stimpy88If you're unable to make decisions yourself, and need a review to tell you it is crap, then I have plenty of other crap I'd like to sell you... I guess I'll start on some "reviews" first.

I guess some people are born to ridicule public dissent, and hand their autonomy over to something with a logo, while sneering at those clever enough to have seen it coming. We have seen a lot of these types over the last few years.
Yeah, you're so smart bro, keep righting those wrongs eh.

As I said, I couldn't care any less how this performs; I'm not buying it.

I sneer at those who think they know me from a paragraph of text, or think they can derive greater meaning than I presented.

I laugh at fools who waste their time shit posting about things they'll never buy or want to.

And at those who choose to attack an individual over his opinion rather than the topic at hand, trying to sound smart.

You feeling defensive? Did you take the troll jab to heart perhaps, act on it, that'll prove you aren't?!
Posted on Reply
#77
RJARRRPCGP
AMD seems to have dropped the ball here, with no AV1 encoding or decoding! It's a truly open format, unlike the horse hockey that goes with H.264 and H.265.

It's now possible that with the RX 6500 XT, even 1080p recording lags with fewer than 8 CPU cores, maybe fewer than 10 CPU cores for all I know!

A higher-numbered card that looks inferior to the 5500 XT starkly reminds me of the AGP-era, pre-AMD Radeons, like the 9000 Pro. :(
Posted on Reply
#78
windwhirl
RJARRRPCGPIt's now possible that with the RX 6500 XT, even 1080p recording lags with fewer than 8 CPU cores, maybe fewer than 10 CPU cores for all I know!
The chart said "H264/4K", not "H264". To me that means 4K is blocked off, not H264. We'll have to wait until someone reviews the cards to make sure.


Though I do not understand why H265 isn't available, even if only for 1080p. The now long in the tooth RX 580 has the capability.
Posted on Reply
#79
trsttte
windwhirlThe chart said "H264/4K", not "H264". To me that means 4K is blocked off, not H264. We'll have to wait until someone reviews the cards to make sure.


Though I do not understand why H265 isn't available, even if only for 1080p. The now long in the tooth RX 580 has the capability.
I think we're going well into speculation and misunderstanding now. For one, any modern CPU will be able to power through software decoding. My old laptop with a Sandy Bridge (you read that right, Intel 2nd gen) quad core had "no problems" chugging along with H265 in software alone (it did put some load on the CPU, but it worked).

Then there's DirectX: you might not have HW decoders, but you can still use DirectX to push the workload to the GPU, which will still be faster; not as good as dedicated HW support, but still faster than CPU brute force.

Is it bad and silly that a 2022 GPU lacks basic enc/decode features? Yes, it's a damn disgrace, but it's not the end of the world.
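One rough way to see the software-vs-hardware decode difference in practice is to time ffmpeg with and without a hwaccel flag. A minimal sketch in Python, assuming ffmpeg is on the PATH, a Windows machine with a D3D11VA-capable GPU, and a local H.265 test clip named input.mp4 (hypothetical filename); it says nothing about which formats the 6500 XT itself exposes.

import subprocess
import time

def time_decode(extra_args):
    # Decode the clip and discard the frames (null muxer); return wall-clock seconds.
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-v", "error", *extra_args, "-i", "input.mp4", "-f", "null", "-"],
        check=True,
    )
    return time.perf_counter() - start

sw = time_decode([])                       # pure CPU (software) decode
hw = time_decode(["-hwaccel", "d3d11va"])  # offload decode to the GPU via D3D11VA
print(f"software: {sw:.1f}s  |  d3d11va: {hw:.1f}s")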
Posted on Reply
#80
Berfs1
FouquinIt won't be a problem on 3.0, and god help anyone still on 2.0. You're going to face UEFI issues on most 2.0 platforms before you ever have the opportunity to face bandwidth problems.
Not sure if god will help people on B450 and X470. I'd honestly rather have AMD force this card to run PCIe 3.0 x8 or PCIe 2.0 x16, because then the card can deliver the same bandwidth without needing the newer generation. HOWEVER, if you limit it to PCIe 4.0 x4, you now require a PCIe 4.0 x4 slot; otherwise you won't get the same bandwidth any other way unless you put it in a larger PCIe 4.0 slot. Which is why 99% of people don't like graphics cards intentionally limited to the latest generation and 1/4th the max PCIe lanes for that slot.
Posted on Reply
#81
trsttte
Berfs1I'd honestly rather have AMD force this card to run PCIe 3.0 x8 or PCIe 2.0 x16, because then the card can deliver the same bandwidth without needing the newer generation. HOWEVER, if you limit it to PCIe 4.0 x4, you now require a PCIe 4.0 x4 slot; otherwise you won't get the same bandwidth any other way unless you put it in a larger PCIe 4.0 slot
That's probably not an option. We don't really have a block diagram or anything, but the likely scenario is that they built only x4 lanes into the thing; if they had built in more lanes, we'd have access to them (i.e. x8 Gen4). This is most likely because the card uses a mobile chip, which is usually limited to x4.
Posted on Reply
#82
Berfs1
trsttteThat's probably not an option. We don't really have a block diagram or anything, but the likely scenario is that they built only x4 lanes into the thing; if they had built in more lanes, we'd have access to them (i.e. x8 Gen4). This is most likely because the card uses a mobile chip, which is usually limited to x4.
In that case, why couldn't AMD just make it a physical x4 card? That way it can fit in more motherboards.
TheoneandonlyMrKProper troll fest now eh, lol, looks like a hate fest in here; meanwhile no reviews, no tests, just hyperbolic bs.
I await reviews, and then like most commenters, I already have a better GPU, I wouldn't be buying it, it wouldn't matter to me, I would experience no butt-hurtness.
TheoneandonlyMrKI laugh at fools who waste their time shit posting about things they'll never buy or want to.
So, you are laughing at yourself? :confused::confused:
RJARRRPCGPAMD seems to have dropped the ball here, with no AV1 encoding or decoding! It's a truly open format, unlike the horse hockey that goes with H.264 and H.265.
AV1 encode/decode won't really matter right now. However, if AV1 encoding does kick off for Twitch (odds of that happening are pretty low; they have stated they are working on supporting AV1 encode), then it will basically kick this card off the relevant GPU list for many streamers. Then again, the 6500 XT is such a low-end card and AMD's GPU encoder is so horridly trash for streaming that I don't think it even matters; just get an NVIDIA GPU with Turing/Ampere (7th gen) NVENC if you plan on streaming.
Posted on Reply
#83
TheoneandonlyMrK
Berfs1In that case, why couldn't AMD just make it a physical x4 card? That way it can fit in more motherboards.



So, you are laughing at yourself? :confused::confused:


AV1 encode/decode won't really matter right now. However, if AV1 encoding does kick off for Twitch (odds of that happening are pretty low; they have stated they are working on supporting AV1 encode), then it will basically kick this card off the relevant GPU list for many streamers. Then again, the 6500 XT is such a low-end card and AMD's GPU encoder is so horridly trash for streaming that I don't think it even matters; just get an NVIDIA GPU with Turing/Ampere (7th gen) NVENC if you plan on streaming.
I am now looking back.

Sometimes you think more than others

But every day, I learn

Sorry @stimpy88, I overreacted.
Posted on Reply
#84
Berfs1
ExcuseMeWtfWe don't even get a hypothetical 6500 XT with x16, so it's not even possible to test how much this card would lose, if any, lol.
Technically you could just force the card into PCIe 2.0 x4 mode; that would be 1/4th the max bandwidth.
Posted on Reply
#85
trsttte
Berfs1In that case, why couldn't AMD just make it a physical x4 card? That way it can fit in more motherboards.
Marketing probably, just like x8 cards are still built with a full-length slot.

In terms of motherboard support, if you have a smaller slot that you plan to use for it, you can just cut the back part so the card fits. There are even slots that come with the back open for that exact reason, but it's also possible to DIY it with a Dremel or something

ExcuseMeWtfWe don't even get a hypothetical 6500 XT with x16, so it's not even possible to test how much this card would lose, if any, lol.
We can just test the performance on PCIe 4.0 x4 as the card was designed, then test on PCIe 3.0 and PCIe 2.0 x4, and we'll see how much it loses, just like any card.

I don't think anyone is arguing that the card will be bottlenecked on PCIe 4.0, which is very unlikely for a low-end card like this; with older gens that's a different story, and that's the discussion here
Posted on Reply
#86
Berfs1
trsttteif you have a smaller slot that you plan to use for it, you can just cut the back part so the card fits. There are even slots that come with the back open for that exact reason, but it's also possible to DIY it with a Dremel or something

That is true; however, on certain motherboards there are other components right next to the slot, so it wouldn't be feasible for those boards. Boy am I glad my motherboard has open-ended slots lol
Posted on Reply
#87
LabRat 891
Y'all say what you will. This is great! We finally have a GPU that's perfect for M.2 slot conversion, or use in the chipset PCIe x4 slot. Since X570 has bifurcation on the 4.0 x16 slot, you could now use that slot for other devices with the proper risers (like MaxCloudOn).
Posted on Reply
#88
trsttte
LabRat 891Y'all say what you will. This is great! We finally have a GPU that's perfect for M.2 slot conversion, or use in the chipset PCIe x4 slot. Since X570 has bifurcation on the 4.0 x16 slot, you could now use that slot for other devices with the proper risers (like MaxCloudOn).
Nothing stops you from doing that with x16 cards either; the situation is exactly the same, since this one also has the full-length connector on the PCB. The only difference is that this card is limited to x4, but there's nothing stopping you from using only x4 lanes of an x16 card
Posted on Reply
#89
noel_fs
Don't need more than that for 1080p
Posted on Reply
#90
eidairaman1
The Exiled Airman
Berfs1Not sure if god will help people on B450 and X470. I'd honestly rather have AMD force this card to run PCIe 3.0 x8 or PCIe 2.0 x16, because then the card can deliver the same bandwidth without needing the newer generation. HOWEVER, if you limit it to PCIe 4.0 x4, you now require a PCIe 4.0 x4 slot; otherwise you won't get the same bandwidth any other way unless you put it in a larger PCIe 4.0 slot. Which is why 99% of people don't like graphics cards intentionally limited to the latest generation and 1/4th the max PCIe lanes for that slot.
The card should have the full x16 lanes no matter the PCIe gen.

The ASRock B550 Steel Legend mobo allows selection of the PCIe gen, but I'm forcing it to run PCIe 4 despite using an XFX R7 250X GHOST. It boots fine on UEFI lol.
Posted on Reply
#91
trsttte
eidairaman1The card should have the full x16 lanes no matter the PCIe gen.

The ASRock B550 Steel Legend mobo allows selection of the PCIe gen, but I'm forcing it to run PCIe 4 despite using an XFX R7 250X GHOST. It boots fine on UEFI lol.
What you select in the BIOS doesn't matter; the card will only run at the specs it supports, Gen 3.0 in your case
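If you want to confirm what the link actually negotiated rather than what the BIOS menu claims, GPU-Z shows it on Windows in its Bus Interface field, and on Linux the kernel exposes it in sysfs. A minimal sketch in Python, assuming the GPU sits at PCI address 0000:01:00.0 (hypothetical; check lspci for your own address):

from pathlib import Path

# Negotiated vs. maximum PCIe link, as reported by the kernel for this device.
dev = Path("/sys/bus/pci/devices/0000:01:00.0")
for attr in ("current_link_speed", "current_link_width",
             "max_link_speed", "max_link_width"):
    print(attr, "=", (dev / attr).read_text().strip())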
Posted on Reply
#92
IceShroom
This could be one explanation for why Navi 24 has this weird design and limited PCIe lanes. Fewer lanes on a laptop means a more efficient PCB design.
Posted on Reply
#93
Assimilator
Oh the horror, a low-end card that is gimped in a way that won't make a difference to its overall performance. I guarantee that 99% of users who pay over-inflated scalper prices for these cards won't notice any PCIe performance drop.
Berfs1I'd honestly rather have AMD force this card to run PCIe 3.0 x8 or PCIe 2.0 x16
It would indeed be impressive if they could force the card to use lanes that it physically does not have.
Posted on Reply
#94
Mussels
Freshwater Moderator
IceShroomThis could be one explanation for why Navi 24 has this weird design and limited PCIe lanes. Fewer lanes on a laptop means a more efficient PCB design.
A repurposed laptop design makes a lot of sense.
Posted on Reply
#95
stimpy88
TheoneandonlyMrKYeah, you're so smart bro, keep righting those wrongs eh.

As I said, I couldn't care any less how this performs; I'm not buying it.

I sneer at those who think they know me from a paragraph of text, or think they can derive greater meaning than I presented.

I laugh at fools who waste their time shit posting about things they'll never buy or want to.

And at those who choose to attack an individual over his opinion rather than the topic at hand, trying to sound smart.

You feeling defensive? Did you take the troll jab to heart perhaps, act on it, that'll prove you aren't?!
Oh goodness. It's only a shitty graphics card...

But I just read your later post. No problems, and I'm sorry for the tone of my post too. Two reactions don't always make a good one!
Posted on Reply
#96
TheoneandonlyMrK
stimpy88Oh goodness. Read your own post dude, stop rage crying and take a chill pill. It's only a shitty graphics card.
I did, read further through the thread. :p

Have a go at being less insinuating and insulting yourself though, might help.
Posted on Reply
#97
stimpy88
TheoneandonlyMrKI did, read further through the thread. :p

Have a go at being less insinuating and insulting yourself though, might help.
I posted that before I saw your post apologizing. I then edited my rant to something a little different. Refresh the page...
Posted on Reply
#98
RJARRRPCGP
Is this literally another card that's only good for DirectX 9 games?
Posted on Reply
#99
windwhirl
RJARRRPCGPIs this literally another card that's only good for DirectX 9 games?
We'll know when the card goes through reviews. At the very least, AMD is pushing its core clocks really close to their limit, so that should help a bit.

AMD's numbers aren't bad, but they're AMD's.
Posted on Reply
#100
RJARRRPCGP
windwhirlWe'll know when the card goes through reviews.
I feel like this might just be "a newfangled version of an HD 5450"!
Posted on Reply