
AMD Radeon RX 6500 XT Limited To PCIe 4.0 x4 Interface

So instead of a PCIe 4.0 x16 link allowing near-RAM-speed access to system RAM (~30 GB/s link vs. >45 GB/s for the RAM itself), the PCIe 4.0 x4 link is <7.9 GB/s per direction. That's not much for streaming textures.
The PS5 assumes its NVMe drive can stream textures at >5 GB/s (stored as compressed files on the SSD).
This card is really designed for non-gamers or only simple game animation (not FPS, not flight sims).
Just enough to tick the feature boxes on the marketing brochure/packaging/advert.

This gives more OEMs a lower entry price to advertise. If some users move from the larger cards to this one, the larger cards could have greater availability.
If they increased the GPU RAM or the PCIe width then some crypto miners might be tempted; crippled like this, they're not a problem for this card.
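For anyone who wants to check the link numbers above: PCIe 3.0 runs at 8 GT/s per lane and PCIe 4.0 at 16 GT/s, both with 128b/130b encoding. A quick back-of-the-envelope sketch (theoretical per-direction figures only - real-world throughput is lower due to protocol overhead):

```python
# Theoretical per-direction PCIe bandwidth: lanes x transfer rate x encoding efficiency.
# PCIe 3.0 = 8 GT/s per lane, PCIe 4.0 = 16 GT/s per lane, both 128b/130b encoded.

def pcie_bandwidth_gbs(lanes: int, gt_per_s: float) -> float:
    """Theoretical one-way bandwidth in GB/s."""
    return lanes * gt_per_s * (128 / 130) / 8  # bits -> bytes

print(f"4.0 x16: {pcie_bandwidth_gbs(16, 16.0):.1f} GB/s")  # ~31.5
print(f"4.0 x4:  {pcie_bandwidth_gbs(4, 16.0):.2f} GB/s")   # ~7.88
print(f"3.0 x4:  {pcie_bandwidth_gbs(4, 8.0):.2f} GB/s")    # ~3.94
```

The ~7.88 GB/s figure matches the "<7.9 GB/s" quoted above, and a PCIe 3.0 slot halves it again.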
 
The GTX 1080 loses 8% of its performance when run at PCIe 3.0 x4
The GTX 980 loses 5% of its performance when run at PCIe 3.0 x4

It's fair to say that the 6500XT stands to lose at least 5% of its performance when used in a PCIe 3.0 system as it's likely to fall somewhere between the range of those two cards.

If you have a PCIe 3.0 system you plan to put a 6500XT into, it's worth bearing in mind that you're not getting the advertised performance, but 92-95% of what you'd otherwise get is still good enough that it's not a deal-breaker.
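The 92-95% range is just the two measured losses bracketed. A minimal sketch of that reasoning (assumption: the 6500XT's bus sensitivity falls between the GTX 980's and GTX 1080's, as argued above):

```python
# Bracket the 6500XT's likely PCIe 3.0 x4 penalty between two measured cards.
# Losses quoted above at PCIe 3.0 x4: GTX 980 -> 5%, GTX 1080 -> 8%.

measured_losses = {"GTX 980": 0.05, "GTX 1080": 0.08}

worst, best = max(measured_losses.values()), min(measured_losses.values())
retained = (1 - worst, 1 - best)
print(f"Estimated retained performance: {retained[0]:.0%}-{retained[1]:.0%}")
# -> Estimated retained performance: 92%-95%
```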
 
We don't even get a hypothetical 6500XT with x16, so it's not even possible to test how much this card would lose, if anything, lol.
I just trust that AMD's engineers know what they're doing and will base my decision on final performance numbers and of course pricing/availability (so almost certainly nope due to the latter :roll: )
 
I'd be more worried about the cut video processing capabilities than the number of PCIe lanes. This makes a huge difference for anyone planning to use this in an HTPC environment.
Agreed

"The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards including the exclusion of H264/HEVC encoding and AV1 decoding."

If so that's a major ding against it right there. Smaller cards like this should be strong with codecs to offset the mediocre gaming performance. I was considering this for my HTPC, but if it's true about the codecs, then nope, sorry, I will keep my money.

x16 is definitely not needed, but x4 is just straight-up insulting. Put it in a 3.0 system and you've got yourself a quarter of the PCIe bandwidth of an RX 470 - a card launched four and a half years ago.
And the memory interface is a quarter of the width too (64-bit vs the RX 470's 256-bit), though faster GDDR6 narrows the actual bandwidth gap. One thing is probably certain: miners won't be buying it, and probably neither will anyone else lol (except OEMs). And with the possible omission of hardware codecs, it's looking like a waste of silicon and a PCB.
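Worth a quick check on that comparison: the bus is a quarter of the width, but GDDR6's much higher data rate means the actual bandwidth gap is smaller than 4x. A rough calculation, assuming the commonly reported specs (18 Gbps GDDR6 for the 6500 XT, 6.6 Gbps GDDR5 for the RX 470):

```python
# Memory bandwidth (GB/s) = bus width in bits / 8 x effective data rate in Gbps.
# Assumed specs: RX 6500 XT = 64-bit GDDR6 @ 18 Gbps; RX 470 = 256-bit GDDR5 @ 6.6 Gbps.

def mem_bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

rx6500xt = mem_bandwidth_gbs(64, 18.0)   # 144.0 GB/s
rx470 = mem_bandwidth_gbs(256, 6.6)      # 211.2 GB/s
print(f"6500 XT: {rx6500xt:.1f} GB/s, RX 470: {rx470:.1f} GB/s, "
      f"ratio: {rx6500xt / rx470:.2f}")
```

So roughly two-thirds of the RX 470's bandwidth rather than a quarter - still a cut, but the on-die Infinity Cache is presumably there to soften it further.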
 
Agreed

"The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards including the exclusion of H264/HEVC encoding and AV1 decoding."

If so that's a major ding against it right there. Smaller cards like this should be strong with codecs to offset the mediocre gaming performance. I was considering this for my HTPC, but if it's true about the codecs, then nope, sorry, I will keep my money.


Yep, the codecs are a big bummer... Don't like it either :/.
 
"The RX 6500 XT also lacks some of the video processing capabilities of other RX 6000 series cards including the exclusion of H264/HEVC encoding and AV1 decoding."

If so that's a major ding against it right there. Smaller cards like this should be strong with codecs to offset the mediocre gaming performance. I was considering this for my HTPC, but if it's true about the codecs, then nope, sorry, I will keep my money.
HTPC options suck right now (I just dumped a full-fat 3060 in mine as I have two slots and plenty of cooling in my HTPC case) but the traditional 75W sector has been neglected for the last two generations.

I'd be surprised if Nvidia don't eventually release a desktop variant of GA107, which at its fully-enabled spec is called the "Laptop 3050 Ti". In laptops it's configurable from 35-80W, so presumably it would make a good candidate for a low-profile, slot-powered HTPC card.

To the best of my knowledge GA107 has all the features of the Ampere lineup with nothing cut.
 
x16 is definitely not needed, but x4 is just straight-up insulting. Put it in a 3.0 system and you've got yourself a quarter of the PCIe bandwidth of an RX 470 - a card launched four and a half years ago.
I doubt it will saturate the bus even with that downgrade.

HTPC options suck right now (I just dumped a full-fat 3060 in mine as I have two slots and plenty of cooling in my HTPC case) but the traditional 75W sector has been neglected for the last two generations.
A GT 3030 or 1630 or whatever they want to call it. But at this point we might have to wait for Intel to see good options in the $100 space; nowadays everything is overpriced and old.
 
I doubt it will saturate the bus even with that downgrade.
Don't be so sure, the 6600XT clearly saturates the bus occasionally with a measurable performance drop at PCIe 3.0 x8:

[chart: relative performance at 1920x1080, PCIe scaling]


The 6500XT is half the performance, but PCIe 3.0 x4 is also half the bandwidth, implying that the 6500XT will very likely saturate the bus.

The performance drop from putting a 6500XT into a PCIe 3.0 slot could easily match the 98%-to-93% gap in the chart above if everything scales linearly (it doesn't, but the factors that scale non-linearly might cancel each other out - we'll have to wait for real-world PCIe scaling tests like the one above to know for sure).
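The "half the performance, half the bandwidth" argument can be written down explicitly. A toy model, under the big assumption (already conceded above) that bus traffic scales linearly with card performance:

```python
# Toy model: relative bus pressure = (relative performance) / (relative bus bandwidth).
# If a card generates bus traffic proportional to its performance, equal pressure
# suggests a similar relative performance drop when the bus is the bottleneck.

bw_ratio = 4 / 8    # 6500XT's x4 link vs the 6600XT's x8, same PCIe generation
perf_ratio = 0.5    # assumption: 6500XT at roughly half the 6600XT's performance

pressure = perf_ratio / bw_ratio
print(pressure)  # 1.0 -> same bus pressure as the 6600XT at PCIe 3.0 x8
```

Equal pressure is what makes a similar ~5% drop at gen3 plausible; the non-linear factors are exactly what this model ignores.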
 
HTPC options suck right now (I just dumped a full-fat 3060 in mine as I have two slots and plenty of cooling in my HTPC case) but the traditional 75W sector has been neglected for the last two generations.

I'd be surprised if Nvidia don't eventually release a desktop variant of GA107, which at its fully-enabled spec is called the "Laptop 3050 Ti". In laptops it's configurable from 35-80W, so presumably it would make a good candidate for a low-profile, slot-powered HTPC card.

To the best of my knowledge GA107 has all the features of the Ampere lineup with nothing cut.
I fully agree with what you wrote. I'm currently using my old GTX 960 as an HTPC GPU because there's nothing I can replace it with. It works well for some gaming and for outputting a 4K video signal to the TV set.
I don't expect the 75W HTPC cards to return anymore. AMD's current APUs (and Intel's future ones) are good enough and have made that segment obsolete.
I tested a PC with an AMD Ryzen 5 PRO 4650G APU and it's really good for everything except hi-res AAA gaming. Anyone happy with FHD at medium-low settings can skip the old $100-$200 GPU range because the APUs are good enough.
 
I fully agree with what you wrote. I'm currently using my old GTX 960 as an HTPC GPU because there's nothing I can replace it with. It works well for some gaming and for outputting a 4K video signal to the TV set.
I don't expect the 75W HTPC cards to return anymore. AMD's current APUs (and Intel's future ones) are good enough and have made that segment obsolete.
I tested a PC with an AMD Ryzen 5 PRO 4650G APU and it's really good for everything except hi-res AAA gaming. Anyone happy with FHD at medium-low settings can skip the old $100-$200 GPU range because the APUs are good enough.
I hate to say it but waiting for an expensive, unicorn GPU to appear that may never appear is a far less sensible approach than just rebuilding your HTPC to accommodate dual-slot cards. There are now enough good HTPC cases that can take either an SFX or full ATX PSU and still provide enough ventilation for 150W GPUs or more. I got bored of waiting for a good low-end card to replace my passively-cooled 7600GT and just bought a new case (Silverstone GD04) that could accommodate bigger cards. I've used the Fractal Node 202 quite a lot for HTPC builds as it fits in an AV/media rack alongside consoles* and surround receivers.

My solution is to get a more expensive card than necessary and then massively reduce the TDP. My 3060 is restricted to around 120W, and whilst I'm obviously losing some performance, it's near-silent whilst still being moderately capable.

*yeah, I'm not sure what Microsoft was thinking this generation.
 
This here could make partners do graphics cards that has a physical PCI-E x4 and x8 slot on the cards I remember some partners made a lower end Nvidia GeForce GPU with physical PCI-E x1 port on their card.

I think one was Zotac back in the day for their GT 520 or 710.
Zotac actually made an x1 GT 730, and there are quite a few others as well.

Oh wait, how about some "modern" GPUs on the 133MHz PCI interface?
 

And that was with x8 - the 6500XT is x4.
Now I really want to see how this card performs in a 3.0 vs 4.0 comparison. I doubt Infinity Cache will help it that much.

If it's OEM-only, and only for systems with PCI-E 4.0 out of the box, that's more than enough bandwidth.
The 6400 is OEM-only. The 6500XT is not.
 
I doubt this card needs heaps of bandwidth anyways but at the same time, that's gross AMD.
 
Is PCIe 3 even still a thing?! :p

I jest, but of the four computers I have ATM (two Intel, two AMD), all released in the last two years, none have PCIe 3 main slots?!

So
 
6500XT is so bad that it's good, for people desperate enough :D

Please don't make fun of people that can't afford better hardware - it's disgusting (and it's basically Hardware Elitism). In these hard times when the prices of GPUs have skyrocketed, PC Gamers should stick together and encourage each other. We can make fun of the card, but don't mock people that can't afford anything better.
 
Please don't make fun of people that can't afford better hardware - it's disgusting (and it's basically Hardware Elitism). In these hard times when the prices of GPUs have skyrocketed, PC Gamers should stick together and encourage each other. We can make fun of the card, but don't mock people that can't afford anything better.
I think it's quite a good move by AMD in times like these, with such big shortages all around.
The GPU seems to be exceptionally frugal in all the important ways, lowering die area and even the SMD (surface-mounted device - think capacitors, resistors, power regulators) part count, many of which are in short supply as well.
This will allow them to make the absolute most of their resources and get as many cards into people's hands as possible. I'm quite certain most people in this segment would rather choose a card with its features slightly gimped over no card at all.

It's probably going to be quite a big success, even with its somewhat gimped features, especially since most of these will be sold with a new PC/laptop that will have PCI-E 4.0 anyway. Furthermore, the lack of encoding/decoding hardware can usually be worked around with software decoding, something desktop PCs should have no problem with, while laptops will have decoding hardware in the iGPU anyway.
 
I fully agree with what you wrote. I'm currently using my old GTX 960 as an HTPC GPU because there's nothing I can replace it with. It works well for some gaming and for outputting a 4K video signal to the TV set.
I don't expect the 75W HTPC cards to return anymore. AMD's current APUs (and Intel's future ones) are good enough and have made that segment obsolete.
I tested a PC with an AMD Ryzen 5 PRO 4650G APU and it's really good for everything except hi-res AAA gaming. Anyone happy with FHD at medium-low settings can skip the old $100-$200 GPU range because the APUs are good enough.


I was using the 4650G on its own before I got an RTX 3060 Ti. DayZ at full HD with low settings: 70-100 fps :). WoW vanilla is smooth as well. GTA V at full HD with low/medium settings: above 60 fps. Can't wait to get a CPU with RDNA2 :}.
 
Is PCIe 3 even still a thing?! :p

I jest, but of the four computers I have ATM (two Intel, two AMD), all released in the last two years, none have PCIe 3 main slots?!

So
Intel were stuck on PCIe 3.0 until Rocket Lake, which is only 9 months old and launched to pretty negative or lukewarm reviews. I don't think there are that many Rocket Lake chips out in the wild TBH; high-end Rocket Lake was a dumpster fire as the 11900K lost two cores and was a step back in most ways, and the cheaper Rocket Lake models offered far lower performance/$ because Comet Lake was selling for far less. I'd still recommend the 10400F today on a budget, and it's still readily available.

AMD has had PCIe 4.0 for over 2 years, but it existed only on high-end boards (X570) that make up a pretty small proportion of AMD's overall market. It only really arrived for mainstream buyers with Zen 3 and the B550, which is barely a year ago (Nov 2020), and even then the cheapest point of entry was the 5600X, which is a good $140 more expensive than the equivalent Zen 2 or Comet Lake configuration.

Given that there are a lot more Intel machines out there than AMD machines, I think it's fair to assume that there are a huge number of modern PCIe 3.0 motherboards that will be wanting a GPU update before they're retired.

Let's face it, if you're in the market for a 6500XT you're probably not rocking a recent flagship motherboard and CPU, making its PCIe 3.0 performance even more important!
 
Zen, zen+ and all 400 chipsets are 3.0. Those platforms are not that old, and a lot of people are still on them.
Come on, I said I jest. I.e., a joke.
 
Come on, I said I jest. I.e., a joke.
You jest, yet it did make me actually think about how many PCIe 4.0 machines were really out there in the wild, and it's not that many!
 
4GB and 4x at "200 usd".
What a joke.
 
AMD are clearly taking the piss with this card. I thought nGreedia were bad, but this card is bordering on the unusable.
 
You jest, yet it did make me actually think about how many PCIe 4.0 machines were really out there in the wild, and it's not that many!
Your points were well taken.

And after thinking about it, one of mine is PCIe 3 - I forgot pre-X570 boards were, for a moment there.

I think people should be told that most of these will end up in entry-level OEM gaming rigs, and for those, PCIe 4.0 x4 will be fine.
Others might have something to complain about, though some here are definitely being way too dramatic; tests show PCIe 4.0 x4 has enough bandwidth.
 