# Sapphire Radeon RX 6500 XT Pulse



## W1zzard (Jan 20, 2022)

The Sapphire Radeon RX 6500 XT Pulse comes with super impressive noise levels. Even when fully loaded, it runs whisper-quiet in an already quiet room. If you put it into a case, it'll be inaudible. Unfortunately, the card is held back by its small 4 GB VRAM size and the narrow PCIe x4 interface.



----------



## Lightofhonor (Jan 20, 2022)

@W1zzard the charts say 6600 XT, not 6500 XT


----------



## Blaazen (Jan 20, 2022)

Sapphire set the GPU clocks 150 MHz lower than Asus (2750 vs 2900 MHz), and power consumption in gaming is 12 watts lower (89 vs 101 W). The VRM seems to be the same.


----------



## Forza.Milan (Jan 20, 2022)

brace yourself, more are coming..


----------



## NC37 (Jan 20, 2022)

People willing to pay $350 for a 64 bit 4GB card deserve to be scammed by this trash. Budget sub $150 at best. Laughable how much the 5500 beats it.


----------



## AdmiralThrawn (Jan 20, 2022)

For this price you can buy a used 980 Ti that will completely outclass it in gaming performance. It is honestly pathetic that companies are now scalping their own products. If the only thing AMD can come up with is a cheaped-out compromise GPU using tech from 2016, then they need to make some serious changes moving forward.

Just noticed that it only pushes 72 fps at 1080p in Doom, one of the best-optimized games out there. Then it only gets 27 fps in Valhalla, at 1080p. My PS4 could run that game better.


----------



## oxrufiioxo (Jan 21, 2022)

For people who don't pay attention to gpu reviews this is the equivalent of getting scammed with a fake gpu on ebay....


----------



## Turmania (Jan 21, 2022)

NC37 said:


> People willing to pay $350 for a 64 bit 4GB card deserve to be scammed by this trash. Budget sub $150 at best. Laughable how much the 5500 beats it.


Well, when an average consumer goes into a shop to order PC parts ($250 for a CPU, $200 for a motherboard, $100 for memory and an SSD) and then sees $1000 for a mid-level GPU, he will choose this. It's a marketing trick; that's why the "world's first 6 nm GPU" angle will be used to promote it, and many will buy.


----------



## RedBear (Jan 21, 2022)

No mention of the missing H.264/H.265 hardware encoding in the conclusions? Isn't this an important limitation for people building entry-level eSports gaming PCs who would like to stream their games?

EDIT:



oxrufiioxo said:


> For people who don't pay attention to gpu reviews this is the equivalent of getting scammed with a fake gpu on ebay....


Actually, the review still recommends it for people who own a PCIe Gen 4 system, as long as it doesn't exceed the $300 mark. It's a bit of a stretch, but I guess it's a legit point of view, as long as one understands exactly what he's buying and what he's going to use it for. Among the outlets I follow, I guess only Tom's Hardware's review was similarly soft on AMD with this GPU (I mean, a lot of people simply said to not buy it, period).


----------



## watzupken (Jan 21, 2022)

RedBear said:


> No mention of the missing H.264/H.265 hardware encoding in the conclusions? Isn't this an important limitation for people building entry-level eSports gaming PCs who would like to stream their games?
> 
> EDIT:
> 
> ...


Because most reviewers likely don't use a card of this class day to day, so in a one-off review of the card those cuts may not look like a problem. Objectively, the card is good enough for 1080p gaming IF one owns a PC with a working PCIe 4.0 slot for it. If not, then look elsewhere. Secondly, if the price is higher than MSRP, I would also recommend people look away. The cheapest Sapphire Pulse is going for around 265 USD before taxes, and models from the likes of Asus and Gigabyte are going for close to 300 USD.


----------



## nguyen (Jan 21, 2022)

I hope scalpers gobble up all these 6500 XTs, only to take a huge loss and sell them for <$100 in a few months because no one is buying these turds.


----------



## W1zzard (Jan 21, 2022)

Lightofhonor said:


> @W1zzard the charts say 6600 XT, not 6500 XT


Whoops .. fail .. fixed now



RedBear said:


> No mention of the missing H.264/H.265 hardware encoding in the conclusions? Isn't this an important limitation for people building entry-level eSports gaming PCs who would like to stream their games?


Don't think it's a big deal, but definitely worth mentioning. Added



RedBear said:


> I mean, a lot of people simply said to not buy it, period


That is simply bad advice in my opinion. I think our job is to educate readers to come to their own conclusion. Yeah, times are changing, I know .. 
The card works fine and gets you somewhat decent 1080p gaming; it comes down to price and availability. IF you can find it for $200, it's a great option, because nothing exists in this market that's offering similar value.


----------



## Selaya (Jan 21, 2022)

W1zzard said:


> [ ... ]
> That is simply bad advice in my opinion. I think our job is to educate readers to come to their own conclusion. Yeah, times are changing, I know ..
> The card works fine and gets you somewhat decent 1080p gaming, it comes down to price and availability. IF you can find it for $200, it's a great option, because nothing exists in this market thats offering similar value.


agreed.

however, this card has a huge _asterisk_ attached: a PCIe 4.0 platform is essentially mandatory, which significantly drives up your costs (since the best budget platforms, Skylake or [AMD] B450/A520, are out).
honestly i would just go shop for something used like an RX 570, 580, GTX 1060 or GTX 980 Ti, because they all work properly w/o PCIe 4.0.


----------



## HD64G (Jan 21, 2022)

Agreed that this GPU has many shortcomings for a desktop card, which is a strong indication that its design was initially made for notebooks only. But for anyone in a rush to build a new PC, or with a faulty GPU to replace, it is the only sensible option if found close to MSRP. Sad to say, but this market is too harsh for any PC gamer on a budget. And once the initial disappointment about this GPU passes and you look at it with a clear mind, it is the only (if mediocre) solution for many people. Nothing else new and close to that price is coming in the next months, after all.


----------



## Readlight (Jan 21, 2022)

I have fan speed overshoot too.


----------



## thelawnet (Jan 21, 2022)

Selaya said:


> agreed.
> 
> however, this card has a huge _asterisk_ attached: a PCIe 4.0 platform is essentially mandatory, which significantly drives up your costs (since the best budget platforms, Skylake or [AMD] B450/A520, are out).
> honestly i would just go shop for something used like an RX 570, 580, GTX 1060 or GTX 980 Ti, because they all work properly w/o PCIe 4.0.



Well, I'm building a new Alder Lake system, and while I'm probably gonna skip the GPU entirely and make do with a 12500, otherwise this card would be a great choice. A used 570 4GB goes for $300 here with no guarantee at all, so it would be a far worse decision than this, brand new with a warranty.

As far as PCIe 4.0 goes, PCIe 3.0 makes games slower, but AFAICT it doesn't make the card unusable per se: it's still better than a 570 4GB, but worse than a 580 8GB, whereas otherwise it would be better than a 580 8GB.

I mean, if you have PCIe 4.0 then this card is objectively more valuable than if you only have 3.0, but the difference doesn't turn it into a GT 1030 DDR4 or something. It's slightly slower on average, in some games *significantly*. But it is what it is; if you need to buy a new card, there's no point telling people to buy a used one. And if this card is available at $250 and a 1650 Super at $450 or whatever, it wouldn't make much sense to spend the extra $200 for the 1650 Super.
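The PCIe 3.0 vs 4.0 gap being argued about here can be put in rough numbers. A minimal sketch (the helper name is mine; the per-lane rates and 128b/130b encoding are the standard PCIe figures):

```python
# Usable per-direction bandwidth of an x4 link, Gen 3 vs Gen 4.
# Per-lane raw rates: 8 GT/s (Gen 3), 16 GT/s (Gen 4); both generations
# use 128b/130b encoding, so ~1.5% of the raw rate is overhead.

def pcie_x4_bandwidth(gt_per_s: float, lanes: int = 4) -> float:
    """Approximate usable GB/s per direction after encoding overhead."""
    return gt_per_s * lanes * (128 / 130) / 8  # bits -> bytes

print(f"Gen 3 x4: {pcie_x4_bandwidth(8):.2f} GB/s")   # ~3.94 GB/s
print(f"Gen 4 x4: {pcie_x4_bandwidth(16):.2f} GB/s")  # ~7.88 GB/s
```

On a Gen 3 board the card gets exactly half the link bandwidth, which is why any traffic that spills out of the 4 GB VRAM hurts so much more there.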


----------



## Selaya (Jan 21, 2022)

obviously.
even Gen 4 x4 isn't enough though; some games still get bottlenecked
given equal pricing i'd always skip right past this and buy something used instead (be smart: buy good AIBs, rather spend an extra buck than go bottom-of-the-barrel, and you should minimise your chance of buying bricks)


----------



## dalekdukesboy (Jan 21, 2022)

This thing is an utter piece of crap for the price it goes for... This is barely a $150 card at best in a sane world.


----------



## Footman (Jan 21, 2022)

I see this card being used in laptops and disappearing as a discrete card soon.


----------



## HD64G (Jan 21, 2022)

dalekdukesboy said:


> This thing is an utter piece of crap for the price it goes for... This is barely a $150 card at best in a sane world.


Agreed, but in today's crazy market most used GPUs go for twice their price when new...


----------



## RedBear (Jan 21, 2022)

W1zzard said:


> Don't think it's a big deal, but definitely worth mentioning. Added
> 
> 
> That is simply bad advice in my opinion. I think our job is to educate readers to come to their own conclusion. Yeah, times are changing, I know ..
> The card works fine and gets you somewhat decent 1080p gaming, it comes down to price and availability. IF you can find it for $200, it's a great option, because nothing exists in this market thats offering similar value.


Thanks. Another thing perhaps worth mentioning is the limited video outputs (two; the direct predecessor, the Sapphire Pulse RX 5500 XT, had four, for instance). It's mentioned earlier in the review, but a lot of people probably just skip to the conclusions when reading a review.

Agreed on the purpose of reviews, but I suppose you don't get a lot of attention if you just say that something is not very good. As a generational upgrade it's pretty bad, actually, but it's still acceptable at the right price and for the right usage.


----------



## sith'ari (Jan 22, 2022)

> We benchmarked ray tracing too in this review and saw massive performance drops. For example, enabling ray tracing effects in Resident Evil Village 1080p dropped the FPS from an enjoyable 80 FPS to 7.5 FPS, a 91% loss. Other titles are similarly affected: Watch Dogs Legion: -64%, Deathloop: -68%, Cyberpunk -75%. Two games, Control and Doom Eternal, simply refused to enable ray tracing, probably because of the 4 GB VRAM size. The Radeon RX 6500 XT really doesn't have the horsepower for ray tracing, *but that's no big deal, I think.*



Excuse me W1zzard, but for *historical reasons* I'll have to disagree.

Personally, I never forgot the statement that AMD's senior VP of engineering for RTG, David Wang, made three years ago, curiously enough exactly when NVIDIA was introducing their first ray tracing implementations with Turing.
Back then, responding to the fact that NVIDIA's RTX features were only supported by the mid/upper range of their lineup, Wang stated:
_"""*Utilisation of ray tracing games will not proceed unless we can offer ray tracing in all product ranges from low end to high end*"""_
( https://www.game-debate.com/news/26...until-even-low-end-radeon-gpus-can-support-it )
*I had to wait for three years*, but I never forgot to check what would eventually happen with Wang's statement.
And now, after seeing those RT numbers, it's finally time for me to wish "good luck" to those who took Wang's statement seriously back then and are brave enough to enable ray tracing on these lower-tier products.
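For what it's worth, the drop quoted from the review checks out arithmetically; a quick sketch (helper name is mine, figures are the review's):

```python
# Percentage FPS loss when enabling ray tracing, per the quoted figures.

def percent_loss(before_fps: float, after_fps: float) -> float:
    """FPS drop expressed as a percentage of the original frame rate."""
    return (1 - after_fps / before_fps) * 100

# Resident Evil Village at 1080p: 80 FPS -> 7.5 FPS
print(f"{percent_loss(80, 7.5):.0f}% loss")  # 91% loss
```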


----------



## arni-gx (Jan 23, 2022)

wow, Radeon RX 6500 XT 4GB with a 64-bit bus, not 128-bit... nice new Radeon GPU...


----------



## Valantar (Jan 23, 2022)

@W1zzard Article suggestion: investigating low-power performance on the 6500 XT, with underclocking (and possibly undervolting). Why? Because this GPU is pretty clearly built to be a 25-50W mobile GPU, judging by both the narrow PCIe bus and the narrow VRAM bus. (That's the power range of the 6400M (25W) and 6500M (35-50W).) This desktop SKU is clearly pushed stupidly high to sell as an x5xx SKU rather than where it belongs, as a (~$100) x4xx SKU. Hence why an article like this would be interesting: how does it perform at various power levels (assuming it can be made to run at them), especially 75W and 50W, but lower too if possible. And what clocks can it maintain at those power levels? I would expect it to hit >2GHz at pretty low power, without losing _that_ much performance overall.

I get that this would be a noticeable amount of extra work for a niche article, but it could be a great investigative piece shedding light on how this missed the mark in balancing product segmentation, marketing and performance expectations.


----------



## W1zzard (Jan 23, 2022)

Valantar said:


> Article suggestion


Good suggestion. Maybe once I'm finished with 2x RTX 3080 12 GB, 4x RTX 3050, RTX 3090 Ti (unknown # of samples yet), 3x Alder Lake non-K (might buy 2 more) and 4x SSD


----------



## Valantar (Jan 23, 2022)

W1zzard said:


> Good suggestion. Maybe once I'm finished with 2x RTX 3080 12 GB, 4x RTX 3050, RTX 3090 Ti (unknown # of samples yet), 3x Alder Lake non-K (might buy 2 more) and 4x SSD


... so, some time in 2023?


----------



## W1zzard (Jan 23, 2022)

Valantar said:


> ... so, some time in 2023? XD


Rather February, because then: more launches


----------



## Roph (Jan 23, 2022)

A $200 card that's not worth even $70, that can't beat *6 year old* $200 cards and yet will sell for $400+. 

What sad times.


----------



## wolf (Jan 24, 2022)

@W1zzard how do you feel about the 6500 XT's name as the XT variant? imo this is a key area where the card's critical reception might have been at least partly better, because the name carried an expectation. The two other like-for-like RDNA 2 vs RDNA 1 cards offered quite a performance leap, which I think people would reasonably expect the 6500 XT to follow.

But instead we get a card that's pretty much equal to the 5500 XT, sometimes worse, and certainly worse on PCIe Gen 3.

Even if, for whatever reasons, the card had to be $199 MSRP to be viable to produce and put on shelves, I think a name like RX 6500 or even RX 6500 LE would make more sense and would have aided reception, as it sets an expectation about the card's performance that much more closely matches how it actually performs in the real world.


----------



## Roph (Jan 24, 2022)

This thing is a 6300 at absolute best.


----------



## chrcoluk (Jan 24, 2022)

89 W of power isn't impressive to me, although the author did note that when looking at performance per watt; I would expect circa 50 W for the spec of the card. I can get my 3080 below 100 W by undervolting and capping to 60 fps, and I consider it a power-hungry card.

Priced way too high.


----------



## kanecvr (Jan 24, 2022)

This 6500xt clearly needed a 128bit memory bus. With a 64bit bus and resulting performance levels it's more suited to being called a 6300XT like @Roph said above. Shame.


----------



## mechtech (Jan 25, 2022)

I wonder what the 6400 is going to look like after this?  Same speed as integrated graphics??


----------



## dalekdukesboy (Jan 25, 2022)

mechtech said:


> I wonder what the 6400 is going to look like after this?  Same speed as integrated graphics??


Wow...so the 6400 is going to be significantly faster???


----------



## Darmok N Jalad (Jan 26, 2022)

Currently, the 6500 XT is going for between $240 and $300 at Microcenter (limited stock), while ye ol' RX 560 is selling for that lofty $200 price point. At least the OEMs are slapping massive coolers on them so you can still tell yourself that you're getting what you paid for. Go for the triple fan 5600 XT!


----------



## mb194dc (Jan 27, 2022)

A card so bad there is actually stock of it in the UK for £199.99... 

Could we see this as the start of things moving back towards the pre-Covid normal?


----------



## kruk (Jan 28, 2022)

Valantar said:


> @W1zzard Article suggestion: investigating low-power performance on the 6500 XT, with underclocking (and possibly undervolting). Why? Because this GPU is pretty clearly built to be a 25-50W mobile GPU, judging by both the narrow PCIe bus and the narrow VRAM bus. (That's the power range of the 6400M (25W) and 6500M (35-50W).) This desktop SKU is clearly pushed stupidly high to sell as an x5xx SKU rather than where it belongs, as a (~$100) x4xx SKU. Hence why an article like this would be interesting: how does it perform at various power levels (assuming it can be made to run at them), especially 75W and 50W, but lower too if possible. And what clocks can it maintain at those power levels? I would expect it to hit >2GHz at pretty low power, without losing _that_ much performance overall.
> 
> I get that this would be a noticeable amount of extra work for a niche article, but it could be a great investigative piece shedding light on how this missed the mark in balancing product segmentation, marketing and performance expectations.



CapFrameX has already done some work. Basically, they wasted a lot of efficiency for minor performance gains.

https://twitter.com/i/web/status/1485211874182447107

@W1zzard: if you still have a Ryzen APU lying around, could you please check whether ReLive recording works when this GPU is used with an APU? Currently there is no official way to turn on ReLive when using a Ryzen APU alone (the GUI is missing), and the 6500 XT obviously lacks the encoders, but the combination of both might work. Thanks.


----------



## Valantar (Jan 28, 2022)

kruk said:


> CapFrameX has already done some work. Basically, they wasted a lot of efficiency for minor performance gains.
> 
> https://twitter.com/i/web/status/1485211874182447107
> @W1zzard: if you still have a Ryzen APU lying around, could you please check whether ReLive recording works when this GPU is used with an APU? Currently there is no official way to turn on ReLive when using a Ryzen APU alone (the GUI is missing), and the 6500 XT obviously lacks the encoders, but the combination of both might work. Thanks.


Thanks for the tip! Looks like this will perform quite well in its intended 25-50 W range in mobile. Couple it with an R5 6600H(S) or a 28 W 6x00U and you'll have a pretty potent thin-and-light low-end gaming laptop. And those APUs at least deliver all the encode/decode that this card misses (which is likely why it was omitted in the first place: for the intended use it's just a duplicate feature and a waste of die area). Sadly I have no idea about the ReLive question, hope it gets enabled though.


----------



## TheinsanegamerN (Feb 1, 2022)

chrcoluk said:


> 89 W of power isn't impressive to me, although the author did note that when looking at performance per watt; I would expect circa 50 W for the spec of the card. I can get my 3080 below 100 W by undervolting and capping to 60 fps, and I consider it a power-hungry card.
> 
> Priced way too high.


It's really bad, considering you could buy 560s that ran on just 75 W of bus power (see also MSI's low-profile 560). Despite the move from 14 nm to 7 nm, from Polaris to RDNA 2, AMD's perf/watt has barely moved. Just... utter garbage.


kanecvr said:


> This 6500xt clearly needed a 128bit memory bus. With a 64bit bus and resulting performance levels it's more suited to being called a 6300XT like @Roph said above. Shame.


Ideally, with the 18 Gbps GDDR6 they wasted on this atrocity, a 6 GB 96-bit bus would have both provided significantly more bandwidth and alleviated the 4 GB limit that plagues the slower 560. One of these cards with a 96-bit 6 GB bus (maybe with 16 Gbps memory to save some money and get the price down), clocked at a more sane 2.4-2.5 GHz with a 75-watt TDP, would have made for a much better low-power upgrade card. As it stands, this thing is an abomination of silicon.
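The bandwidth math behind that suggestion is simple enough to sketch (the helper name is mine; the formula is the standard per-pin rate times bus width):

```python
# Peak GDDR6 bandwidth = per-pin data rate (Gb/s) * bus width (bits) / 8.

def gddr6_bandwidth(rate_gbps: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a given data rate and bus width."""
    return rate_gbps * bus_bits / 8

print(gddr6_bandwidth(18, 64))  # 6500 XT as shipped: 144.0 GB/s
print(gddr6_bandwidth(18, 96))  # hypothetical 96-bit card: 216.0 GB/s
print(gddr6_bandwidth(16, 96))  # same card on cheaper 16 Gbps chips: 192.0 GB/s
```

Even with the cheaper 16 Gbps chips, the hypothetical 96-bit card would have a third more bandwidth than what shipped.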


----------



## Anymal (Feb 1, 2022)

€316 at a German e-retailer


----------



## LabRat 891 (Feb 4, 2022)

Any info or way to gather info on the PCIe Slot Power Draw alone? I want to throw one of these in an m.2 adapter, but the slot current draw will be the difference between fiery failure and success.


----------



## Kanan (Feb 5, 2022)

I don't think the power consumption in this review reflects the card's true efficiency. Cyberpunk 2077 at Ultra doesn't run well on this card for obvious reasons; it's a worst-case scenario. In games this card runs well, or with settings it likes, it should be very efficient, comparable to the 6600 XT. It's a halved 6600 XT in every regard, and this includes power consumption.


----------



## omerfak (Feb 6, 2022)

Wait, it's slower than 5500 XT?????


----------



## Sirleirbag (May 2, 2022)

I had to choose between this card and a 970 or 570 at the same price, so I chose this card, even though I still haven't upgraded to PCIe Gen 4.
A used five-year-old card could be a lot more of a gamble than this one. I will flip it for a 1650 if I can.


----------



## HD64G (Jul 25, 2022)

Sapphire just announced with a tweet that the 6500 XT is also available with 8 GB of VRAM. This imho will diminish the performance deficit when paired with a PCIe 3 board. The same happened with the 5500 XT 8GB back then.


----------



## LabRat 891 (Jul 27, 2022)

HD64G said:


> Sapphire just announced with a tweet that the 6500 XT is also available with 8 GB of VRAM. This imho will diminish the performance deficit when paired with a PCIe 3 board. The same happened with the 5500 XT 8GB back then.


Wonder how much the limited bus width will cripple it?
Somehow, my 6500 XT manages superior performance to my RX 580, until it hits a VRAM wall. CP'77 is the only game I've tried at 1080p where it hits that wall regularly (and *hard*, at that).
Very interested, if only to have an oddball card in my collection.


----------

