# AMD Radeon RX 6500 XT PCI-Express Scaling



## W1zzard (Jan 19, 2022)

The AMD Radeon RX 6500 XT comes with only a narrow PCI-Express x4 interface. In this article, we took a closer look at how performance is affected when running at PCI-Express 3.0; also included is a full set of data for the academically interesting setting of PCI-Express 2.0.



----------



## Chaitanya (Jan 19, 2022)

Yikes, thanks but no thanks. It seems like the Nvidia option is looking like the better choice, even with its added premium.


----------



## gridracedriver (Jan 19, 2022)

A good fail.


----------



## kapone32 (Jan 19, 2022)

If this card cannot even get 100 FPS in Doom Eternal at 1080p, it is a waste. This reminds me of the R7 250.


----------



## 80-watt Hamster (Jan 19, 2022)

This design is a bit baffling to me.  There's no way AMD isn't aware of the factors you mentioned regarding the hardware that a good chunk of potential customers will be pairing a card like this with.  Beyond that, the fact that the relative performance loss from 4.0 to 3.0 is fairly consistent across tests (outside of RT) suggests, as you mentioned, that performance may be bus-limited even at 4.0x4.  Why would AMD intentionally limit performance like that?  That is not a rhetorical question.


----------



## ShurikN (Jan 19, 2022)

> As if it could use the bandwidth even if they wired it up to x16. People just need another thing to complain about.





> I don't think they purposely gimp it unless it doesn't matter anyway.





> The 6500 XT also has a larger L3 cache buffer like all other desktop RDNA 2 cards, thus isn't susceptible to bus bandwidth.





> It's probably enough bandwidth for that performance anyways, so would be much ado about nothing.





> I just trust AMD engineers know what they are doing there





> I doubt it will saturate the bus even with that downgrade





> Lots of engineers on these forums these days, whose skills obviously exceed those of AMD's best professionals out there.


----------



## kapone32 (Jan 19, 2022)

The only issue is it's slower than the RX570 in most Games.


----------



## TheinsanegamerN (Jan 19, 2022)

kapone32 said:


> The only issue is it's slower than the RX570 in most Games.


That's not the only issue by a long shot. How about "it's not only slower than the 570, but it's also $200 AND is gimped on non-AMD platforms"?

Because even if it WAS faster than a 570, it would still offer worse price/performance than a $200 RX 480 from *6 YEARS AGO*. And it would still scale very poorly on anything that isn't Rocket Lake, Alder Lake, or an AMD 500-series chipset, AKA the majority of buyers looking for a low-end card like this. Few are going to be buying a $200 GPU to go with a new $500 CPU, after all, and anyone with a PCIe 3.0 platform (AMD from K10 to Zen 2, Intel from Ivy Bridge to Comet Lake) is going to lose even more performance.

At $200 this thing would need to be consistently outperforming the 1660 Super, and even then it wouldn't be a very good value. Add on that pathetic 4GB VRAM buffer, and this thing should be a sub-$100 GT 1030 competitor. And don't forget, this thing draws 100 watts of power when gaming, compared to the ~140 watts pulled by the 6-year-old 14nm RX 480, the ~70 watts pulled by the 12nm 1650, or the ~95-100 watts pulled by the 1650 Super, which occasionally outperforms this 6nm RDNA 2 card.

This thing is an atrocious GPU.


----------



## Mathragh (Jan 19, 2022)

Does anyone have an educated guess as to how much is being saved with this x4 bus compared to, let's say, an x8 bus? I'm specifically interested in raw materials (PCB), SMDs on the PCB, and perhaps die space.

I wonder whether the shortage of loads of devices and materials also had something to do with this decision.


----------



## TheLostSwede (Jan 19, 2022)

Mathragh said:


> Does anyone have an educated guess as to how much is being saved with this 4x bus compared to let's say a x8 bus? I'm specifically interested in raw materials (PCB), SMD's on the PCB and perhaps die space.
> 
> I wonder whether the shortage of loads of devices and materials also had something to do with this decision.


PCB/SMD only, 20-30 cents? Maybe $1 with the current component shortage.
No idea about die space.


----------



## HD64G (Jan 19, 2022)

Gamers should not buy that GPU for even $200 with hw encoding absent and 4GB of VRAM, even if they own a PCIE 4.0 board. AMD should price it closer to $150. But we see people buying $2000 GPUs so...


----------



## Mathragh (Jan 19, 2022)

HD64G said:


> Gamers should not buy that GPU for even $200 with hw encoding absent and 4GB of VRAM, even if they own a PCIE 4.0 board. AMD should price it closer to $150. But we see people buying $2000 GPUs so...


Desktop PCs will usually have enough oomph to just do encoding and decoding in software, and laptops will all have the dedicated hardware integrated into the iGPU. It's not such a big deal imho.


----------



## TheinsanegamerN (Jan 19, 2022)

HD64G said:


> Gamers should not buy that GPU for even $200 with hw encoding absent and 4GB of VRAM, even if they own a PCIE 4.0 board. AMD should price it closer to *$50*. But we see people buying $2000 GPUs so...


FTFY. The GT 1030 was a $79 MSRP product with GDDR5 memory when it came out, and that had media decoders on board. This... doesn't.



Mathragh said:


> Desktop PC's will usually have enough oomph/power to just do encoding and decoding in software, and laptops will all have the dedicated hardware integrated into the iGPU. It's not such a big deal imho.


That doesn't justify wasting power on it; decoding hardware especially has been standard in GPUs for over a decade at this point. The last AMD GPU lineup that didn't have a dedicated decoder was Evergreen. From 2009.


----------



## Mathragh (Jan 19, 2022)

TheinsanegamerN said:


> FTFY. The GT 1030 was a $79 MSRP product with GDDR5 memory when it came out, and that had media decoders on board. This... doesn't.
> 
> 
> That doesn't justify wasting power on it; decoding hardware especially has been standard in GPUs for over a decade at this point. The last AMD GPU lineup that didn't have a dedicated decoder was Evergreen. From 2009.


But it does cost additional die space, something there is an undeniable shortage of.
If the absence of encoding hardware means more people will be able to get a card at all instead of no card, then there is something to be said for that, imho.


----------



## Mistral (Jan 19, 2022)

What the hell is happening in The Witcher 3 and Control's RT test?


----------



## Forza.Milan (Jan 19, 2022)

Everyone at AMD who designed this should be put in jail; it's a total crime...


----------



## damric (Jan 19, 2022)

Well, if it were only crippled at 1440p or above, it wouldn't be a big deal, but seeing a significant penalty at 1080p for a card designed for that segment is alarming. I see that @W1zzard's test rig used 4000 MT/s RAM. I am willing to bet that performance drastically tanks with more mainstream system memory speeds.


----------



## TheinsanegamerN (Jan 19, 2022)

Mathragh said:


> but it does cost additional die space; something there is an undeniable shortage of.
> If the absence of encoding hardware means more people will be able to get a card at all instead of no card, then there is something to be said for that imho.


The media decoder is TINY. Encoder you have a point on, but a decoder? Really? That's not going to make a substantial enough difference in die space to affect availability.

There's also the issue of the RX 560, built on the 14nm node, which has a media decoder, a media *ENCODER* with 4K H.264 support, the same number of cores (1024), and is only 137 mm² in size. The 6500 is 107 mm². If they wanted to save "die space" as you say, they could have simply dropped Infinity Cache altogether, given it a 128-bit memory bus plus a media encoder and decoder, and had a smaller, better GPU.

There really is zero defending AMD on this one. A 560 on 6nm likely would not only have performed just as well, given the atrocious performance of the 6500 XT, but would have been nearly the same size die-wise.

EDIT: The claim that "it gets more cards to people" is also blatantly wrong. The 6500 XT at $200 was already sold out a minute after the review dropped, and the ones in stock are now going for $350. Bet you good money they'll be gone within an hour and then will be unobtainium like the rest of the 6000 line.


----------



## Mathragh (Jan 19, 2022)

TheinsanegamerN said:


> The media decoder is TINY. Encoder you have a point on, but a decoder? Really? That's not going to make a substantial enough difference in die space to affect availability.
> 
> There's also the issue of the RX 560, built on the 14nm node, which has a media decoder, a media *ENCODER* with 4K H.264 support, the same number of cores (1024), and is only 137 mm² in size. The 6500 is 107 mm². If they wanted to save "die space" as you say, they could have simply dropped Infinity Cache altogether, given it a 128-bit memory bus plus a media encoder and decoder, and had a smaller, better GPU.
> 
> ...


Well, okay. I'm just trying to figure out why certain choices were made in the context of a company which is full of smart people and which is trying to sell as many GPUs as possible.
I don't believe this GPU is what it is because "people are dumb", or that they're intentionally building something worse than what they're capable of within the confines of their design goals.


----------



## Zareek (Jan 19, 2022)

You've got to love e-waste! And, livin' in the twilight zone that is 2020 thru ???? (not ending soon enough!)


----------



## 80-watt Hamster (Jan 19, 2022)

Mathragh said:


> Well, okay. I'm just trying to figure out why certain choices were made in the context of a company which is full of smart people and which is trying to sell as many GPU's as possible.
> I don't believe this GPU is what it is because "people are dumb", or that they're intentionally building something worse than what they're capable of given the confines of their design goals.



There are a couple of explanations that spring to mind.  The first is that project management set the design constraints to intentionally limit the card to a specific performance envelope, though why that limit would be so far behind the 6600 XT is anyone's guess.  The other is that, as others have hypothesized, this chip was primarily designed as a mobile component and is being repurposed for desktop parts.  That may not be true, but it could help explain some of the weird design decisions, like the 64-bit memory bus and x4 lane count.

As an aside, it's easy to blame the engineers and designers, but they're not always the reason a product is bad.


----------



## TheinsanegamerN (Jan 19, 2022)

Mathragh said:


> Well, okay. I'm just trying to figure out why certain choices were made in the context of a company which is full of smart people and which is trying to sell as many GPU's as possible.
> I don't believe this GPU is what it is because "people are dumb", or that they're intentionally building something worse than what they're capable of given the confines of their design goals.


I feel like this is very optimistic thinking. Over the last few years we have seen the proverbial mask slip numerous times as companies treat their consumers like cash bags on two legs, insulting them and treating them like garbage. There is not much reason to believe AMD is excluded from this mindset: this is the company that jacked up the price of Vega after launch and said "oh well, it was an introductory price, didn't you know?", sold the AM4 platform on "support till 2020" then changed its tune to "when we said support, we meant we supported you buying a new motherboard before 2020 if you want our new CPU", raised the price of its 6-core offering from $179 to $300, and left the entire budget market to rot for over a year.

Then again, this is AMD, a company that has never failed to mismanage itself out of success. The examples are numerous over the years; to keep it blunt: Polaris was a fluke, and so was the 5700 XT. The nonsense over the 5500/5600 XT/Vega launches is normal for them. The 6500 XT is likely engineered to be as absolutely cheap as possible, since everything sells right now regardless of how good it is.



80-watt Hamster said:


> There are a couple of explanations that spring to mind.  The first is that project management set the design constraints to intentionally limit the card to a specific performance envelope, though why that limit would be so far behind the 6600XT is anyone's guess.  The other is that, as others have hypothesized, that this chip was primarily designed as a mobile component, and is being repurposed for desktop parts.  That may not be true, but it could help explain some of the weird design decisions, like the 64-bit memory bus and x4 lane count.
> 
> As an aside, it's easy to blame the engineers and designers, but they're not always the reason a product is bad.


Agreed, 64-bit may have been a command from high management that MUST be adhered to to maintain product segmentation, regardless of whether a 6GB 96-bit bus would have worked better.


----------



## Mathragh (Jan 19, 2022)

TheinsanegamerN said:


> I feel like this is very optimistic thinking. Over the last few years we have seen the proverbial mask slip numerous times as companies treat their consumers like cash bags on two legs, insulting them and treating them like garbage. There is not much reason to believe AMD is excluded from this mindset: this is the company that jacked up the price of Vega after launch and said "oh well, it was an introductory price, didn't you know?", sold the AM4 platform on "support till 2020" then changed its tune to "when we said support, we meant we supported you buying a new motherboard before 2020 if you want our new CPU", raised the price of its 6-core offering from $179 to $300, and left the entire budget market to rot for over a year.
> 
> Then again, this is AMD, a company that has never failed to mismanage itself out of success. The examples are numerous over the years; to keep it blunt: Polaris was a fluke, and so was the 5700 XT. The nonsense over the 5500/5600 XT/Vega launches is normal for them. The 6500 XT is likely engineered to be as absolutely cheap as possible, since everything sells right now regardless of how good it is.
> 
> ...


It's certainly not the first time I've been called an optimist. Thanks all for the answers regardless.

Regarding the 64-bit vs 96-bit bus: I wonder whether they even have a GDDR "design module" (or whatever you call the things they build their chip designs out of) in a smaller than 64-bit size.


----------



## TheinsanegamerN (Jan 19, 2022)

Mathragh said:


> It's certainly not the first time I've been called an optimist  Thanks all for the answers regardless.
> 
> Regarding the 64bit vs 96bit bus: I wonder whether they even have a GDDR "design module" (or whatever you call the things they build their chip designs out of) in a smaller than 64bit size.


There's nothing technically stopping you; their memory chips use 32-bit channels each, so you could put a single memory chip on a 32-bit bus and it would technically work. I doubt the standards body actually wrote one out though, given 32-bit cards... wait, did those ever exist in the first place?
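Since the chips present 32-bit channels, bus widths come in 32-bit steps, and each step has a fixed bandwidth cost. A quick back-of-the-envelope sketch (18 Gbps is the 6500 XT's rated GDDR6 speed; this is just the standard width × data-rate product, ignoring real-world efficiency):

```python
def gddr6_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# The 6500 XT pairs a 64-bit bus with 18 Gbps GDDR6 (144 GB/s peak).
for width in (32, 64, 96, 128):
    print(f"{width:>3}-bit @ 18 Gbps: {gddr6_bandwidth_gbps(width, 18.0):.0f} GB/s")
```

The 64-bit row lands on the card's quoted 144 GB/s, and each extra 32-bit channel would have added another 72 GB/s.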


----------



## Mathragh (Jan 19, 2022)

TheinsanegamerN said:


> ...wait, did those ever exist in the first place?


haha exactly


----------



## Chrispy_ (Jan 19, 2022)

Ouch. 

I was guessing 5-8% loss from PCIe 3.0 in threads last week.

I didn't expect 12-18%. That's _bad._


----------



## Adam Krazispeed (Jan 19, 2022)

HD64G said:


> Gamers should not buy that GPU for even $200 with hw encoding absent and 4GB of VRAM, even if they own a PCIE 4.0 board. AMD should price it closer to $150. But we see people buying $2000 GPUs so...


Nah!!!! The 6500 XT should only cost $99 USD MSRP (IF MSRP EVEN EXISTED). NO ENCODING!!! And a Gen 4 x4 link (Gen 3 will suffer; GOOD LUCK, CHINA LGA 2011 XEON USERS). THIS GPU IS NOT FOR YOU!!! Keep the RX 570 4GB? LOL. NICE JOB, AMD... NOT!!!!!!!!!!!!!!!


----------



## BSim500 (Jan 19, 2022)

Mathragh said:


> Does anyone have an educated guess as to how much is being saved with this 4x bus compared to let's say a x8 bus? I'm specifically interested in raw materials (PCB), SMD's on the PCB and perhaps die space. I wonder whether the shortage of loads of devices and materials also had something to do with this decision.


Hardly anything. The price of copper has increased from $2.5/lb (Jan 2019) to $4.5/lb (Jan 2022). Even if this GPU came with a huge 1 kg solid-copper heatsink, that still would have increased prices by barely $5 per GPU vs. where we were pre-lockdown. A cheap aluminium heatsink with copper used only on PCB traces/interconnects = not even a $0.10 saving on copper. As for die costs, I don't have the maths, but as a reality check, even the 2016-era $99 GTX 1050 2GB (non-Ti) had both a 3.0 x16 bus and the Shadowplay encoder, so _"We did this to this $300 GPU to save money"_ = _"We love to insult your intelligence"_...
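The copper arithmetic above checks out. A quick sketch using the price figures quoted in the post (the 1 kg heatsink is the post's deliberately oversized worst case, not a real BOM line):

```python
KG_PER_LB = 0.45359237  # kilograms per pound

price_2019 = 2.5  # USD/lb, Jan 2019 figure from the post
price_2022 = 4.5  # USD/lb, Jan 2022 figure from the post

heatsink_lb = 1.0 / KG_PER_LB  # 1 kg of copper expressed in pounds (~2.2 lb)
extra_cost = (price_2022 - price_2019) * heatsink_lb

print(f"Extra copper cost per GPU since 2019: ${extra_cost:.2f}")
```

Even the worst case comes out around $4.40 per card, well under the $5 the post allows for.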


----------



## WhoDecidedThat (Jan 19, 2022)

TheinsanegamerN said:


> given 32 bit cards.....wait did those every exist in the first place?


they do now


----------



## mechtech (Jan 19, 2022)

Well, the only games I play on that massive list are The Witcher 3 and BL3. In The Witcher 3 the PCIe version makes no difference, and in BL3 it's minimal, which is surprising in a good way.

Now, having said that, there is absolutely no reason to replace my 8GB RX 480 with that card.

And if it doesn't support hardware encode/decode of all modern video codecs, then, as I said before, it's a waste of RAM and PCB and silicon.

With 4GB of RAM this should have been priced in the RX 460/560 bracket at $120.

Edit: Lol, ya, nope.


----------



## DemonicRyzen666 (Jan 19, 2022)

They should have put 3D V-Cache on that Infinity Cache, lol, and given it at least a 4.0 x8 link.
So the only features you really get are FSR and RSR; no decoder or encoder. Hmmm, I think you're better off with a 5500 XT 8GB at that point. Don't FSR and RSR work on those 5500 XTs too?


----------



## ARF (Jan 19, 2022)

And the winner of the worst graphics card launch ever: THIS!

An absolute shit show from AMD 


It is DBE.

don't-buy-edition


----------



## erek (Jan 19, 2022)

https://www.kitguru.net/components/...ming-4gb-vram-is-not-enough-for-todays-games/


----------



## Mistral (Jan 19, 2022)

ARF said:


> And the winner for the worst graphics card launch ever gets: THIS!
> 
> Absolute shit show from AMD
> 
> ...


Now watch it sell out to the very last unit....


----------



## erek (Jan 19, 2022)

Mistral said:


> Now watch it sell out to the very last unit....


you considering getting one?


----------



## Lew Zealand (Jan 19, 2022)

x8 would have fixed the biggest of the criticisms.  Sure, 8GB wouldn't have hurt either and most gamers aren't using the encode/decode blocks, but x4 makes this well over the line to DOA.

But how are they out of stock??


----------



## erek (Jan 19, 2022)

Lew Zealand said:


> x8 would have fixed the biggest of the criticisms.  Sure, 8GB wouldn't have hurt either and most gamers aren't using the encode/decode blocks, but x4 makes this well over the line to DOA.
> 
> But how are they out of stock??











PowerColor AMD Radeon RX 6500XT ITX Gaming Graphics Card 4GB GPU | eBay

PowerColor AMD Radeon RX 6500XT ITX Gaming Graphics Card 4GB GDDR6. This is a confirmed order. Comes to me next week (today's date is the 19th of January).

www.ebay.com

----------



## Raendor (Jan 19, 2022)

What an utter piece of garbage!


----------



## Lew Zealand (Jan 19, 2022)

erek said:


> PowerColor AMD Radeon RX 6500XT ITX Gaming Graphics Card 4GB GPU | eBay
> 
> PowerColor AMD Radeon RX 6500XT ITX Gaming Graphics Card 4GB GDDR6. This is a confirmed order. Comes to me next week (today's date is the 19th of January).
> ...



$550.  And sold.

Seriously.  This is just money laundering, right?


----------



## Fourstaff (Jan 19, 2022)

Looks like some of AMD's engineers are worse than I thought. I gave them the benefit of the doubt, and they did not deliver. It doesn't even conclusively beat the 5500 XT.


----------



## 80-watt Hamster (Jan 19, 2022)

Fourstaff said:


> Looks like some of AMD's engineers are worse than I thought. I gave them the benefit of the doubt, and they did not deliver. It doesn't even conclusively beat the 5500 XT.



Making a silk purse from a sow's ear is even more difficult than it sounds.


----------



## erek (Jan 19, 2022)

80-watt Hamster said:


> Making a silk purse from a sow's ear is even more difficult than it sounds.


People just seem to be looking at this product in the wrong way, it seems; I was waiting for this hot take. AMD might just be misunderstood.


----------



## trsttte (Jan 19, 2022)

erek said:


> People just seem to be looking at this product in the wrong way, it seems; I was waiting for this hot take. AMD might just be misunderstood.



TL;DW: "It's a turd, but it's a new turd available in stores and with a warranty." Lol, what a ridiculously bad take.


----------



## InVasMani (Jan 19, 2022)

The 1440p relative results outperforming both 1080p and 4K is a bit of an oddity. The card is weak overall, though. Even beyond the PCIe matter, other parts of the card are really cut down from the 5500 XT. The cache and GDDR are better, but other parts of the hardware are more vital. If I had to guess, AMD will sell a lot of these in bulk to OEMs, though. It's not a total dud of a card, but it is underwhelming for the DIY enthusiast market and below expectations.


----------



## watzupken (Jan 19, 2022)

TheinsanegamerN said:


> That's not the only issue by a long shot. How about "it's not only slower than the 570, but it's also $200 AND is gimped on non-AMD platforms"?
> 
> Because even if it WAS faster than a 570, it would still offer worse price/performance than a $200 RX 480 from *6 YEARS AGO*. And it would still scale very poorly on anything that isn't Rocket Lake, Alder Lake, or an AMD 500-series chipset, AKA the majority of buyers looking for a low-end card like this. Few are going to be buying a $200 GPU to go with a new $500 CPU, after all, and anyone with a PCIe 3.0 platform (AMD from K10 to Zen 2, Intel from Ivy Bridge to Comet Lake) is going to lose even more performance.
> 
> ...


It is not just on Intel platforms that it's gimped. If you are using a Ryzen APU, i.e., the 4000G and 5000G series, you are limited to PCIe 3.0 and will face the same issue. There are rumors that AMD may release 4000G APUs to compete with Intel in the low-end segment. So budget systems get bad performance due to a bad design decision.
While this is a good review for understanding the impact of moving from PCIe 4.0 to 3.0, the use of the average performance loss in the conclusion is, to me, wrong. Instead, the worst-case scenario should be painted, because some games will lose significant performance. A potential buyer should not read that the average performance loss is 13%, decide to buy it, and plan to upgrade to PCIe 4.0 later, because they may be leaving a lot of performance untapped. And average numbers can be skewed based on the games tested.
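The point about averages hiding the games that matter can be shown with a toy sketch (these loss figures are made up for illustration, NOT taken from the review):

```python
# Hypothetical per-game PCIe 4.0 -> 3.0 performance losses, in percent.
losses_pct = [2, 4, 6, 9, 13, 15, 18, 25, 30]

average = sum(losses_pct) / len(losses_pct)
worst = max(losses_pct)

print(f"average loss: {average:.1f}%")  # the headline number a conclusion quotes
print(f"worst case:   {worst}%")        # what a PCIe 3.0 buyer should plan around
```

A ~13% average and a 30% worst case can come from the same data, which is exactly why the worst case deserves top billing.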


----------



## Mussels (Jan 20, 2022)

Me: Hell yeah, gonna use the timezone advantage and get in early.

Also me: 50 comments here and 150 on the card review before I start. Popular topic, much?


Final thoughts:

Ouch. Yeah, this is a repurposed laptop card that shouldn't exist outside budget prebuilts.

Being equal to an RX 470/480 from 500 years ago, but requiring PCIe 4.0 to reach those levels, makes it a terrible choice.
The only upside is if they have a lot of stock during the GPU shortage.


----------



## watzupken (Jan 20, 2022)

Mussels said:


> Me: Hell yeah, gunna use the timezone advantage and get in early
> 
> also me: 50 comments here and 150 on the card review, before i start. popular topic, much?
> 
> ...


I feel that, going forward, the chances of seeing PCIe 3.0 on new systems are slim. PCIe 5.0 is the current standard, so I expect even budget builds to come with at least PCIe 4.0, mostly. I am aware that Intel's latest H610 still supports PCIe 3.0. Overall, if one is getting this card to run on PCIe 4.0, it may not be a deal breaker, provided you don't need things like support for more than 2 monitors and can live with the outdated video decoder and the lack of any video encoder.

In any case, I feel AMD's GPU range from the 6700 XT down is very lacklustre. Each step down from the RX 6800 series brought a very hefty 50% cut somewhere. The RX 6700 XT lost 50% of the CUs, down to 40, and considering the RX 6600 XT has 32 CUs, I can't help but think AMD cheaped out too much and hampered performance. The next step down is the RX 6600 XT, which I feel is not as bad; its biggest downgrade is the Infinity Cache at 32 MB, which is more than a 50% cut. In addition, the PCIe link also got axed by half to x8, where the performance degradation on PCIe 3.0 is not as severe, but again, some titles suffer more than others. Lastly comes the disastrous RX 6500/6400 series, where the axe fell hard and almost everything got halved, with the exception of features like the video encoder/decoder, which are as good as completely axed. In my opinion, AMD should have given this card a 96-bit memory bus so that it could accommodate 6GB of VRAM. But to save cost they cheaped out once again and went for a meagre 64-bit. The cache is not "infinite" as the name suggests, and the silly decision to couple low VRAM with an x4 connection is a deal breaker.

Sure, it will sell well to OEMs, but people will buy it either because they do not know about this card's limitations or because they are desperate to get their hands on a GPU. Neither is a good reason, and it's a poor showing from AMD. I don't believe there will be a stock issue, because if you look at Amazon you can easily find the RX 6600 available for sale. The only problem is that the price is not attractive, which I believe will be the case for this card as well. If the RX 6600 is not popular, you can imagine this one is probably 3x worse.


----------



## mechtech (Jan 20, 2022)

Lew Zealand said:


> x8 would have fixed the biggest of the criticisms.  Sure, 8GB wouldn't have hurt either and most gamers aren't using the encode/decode blocks, but x4 makes this well over the line to DOA.
> 
> But how are they out of stock??


From a modder or just-for-fun standpoint, I wonder how it would perform with a 256-bit memory bus, x16 PCIe lanes, and 8GB of top-speed GDDR6, then OC'd to the nuts, just to see what this itty-bitty piece of silicon could actually have done. Maybe it would tie the 6600 XT, lol.


----------



## RJARRRPCGP (Jan 20, 2022)

Yikes! "Radeon RX 6500 XT was killed by Radeon RX 580"->Me.

Looks like I'm more likely to OC my RX 580! It's on my A320 system with a Pinnacle Ridge.

For new cards, with possibly a few exceptions, it looks like Nvidia for me. Even the GTX 1650 Super, looks better than this!

And for the raytracing being ruled a fail, in that department, it feels more like a fake card! It at least has a "fake card"-like vibe!  WTF! LOL!


----------



## watzupken (Jan 20, 2022)

RJARRRPCGP said:


> Yikes! "Radeon RX 6500 XT was killed by Radeon RX 580"->Me.
> 
> Looks like I'm more likely to OC my RX 580! It's on my A320 system with a Pinnacle Ridge.
> 
> ...


To me, ray tracing is not viable for cards in the mid to low range. I would have preferred higher frame rates as opposed to having RT on. We can claw some performance back using DLSS or FSR, for sure, but having to game at a lower resolution, further upscaled from an even lower resolution, doesn't sound great to me.


----------



## RJARRRPCGP (Jan 20, 2022)

watzupken said:


> To me, ray tracing is not something viable for cards in the mid to low range. At least I would have preferred higher frame rates as opposed to having RT on. We can claw some performance back using DLSS or FSR for sure, but still having to game at a lower resolution which is further upscaled from an even lower resolution don't sound great to me.


This reminds me of the AGP era, before Radeon was AMD, when they released the Radeon 9600 SE! This is like AMD ripping a page out of ATI's AGP-era budget video card playbook! (when the 9000 Pro was beaten by a Radeon 8500, and later on a Radeon 9550 got beaten by a Radeon 9500!) (It at least reminds me of the tricky naming schemes of the AGP era)

This could be the card that gets the Radeon division into financial trouble. Is the RX 6500 XT the "Atari E.T." of cards?


----------



## Jism (Jan 20, 2022)

Ok, I know what I'm going to write is stupid, but:

If you have a reasonable board, you should be able to increase the PCIe frequency by 4 to 12 MHz. Do you have any idea how much extra bandwidth that provides for a limited card like this? I suggest you try it out. Other than that: even though it's backwards compatible, it's obviously designed for use with PCIe 4.0, not 3.0.


----------



## watzupken (Jan 20, 2022)

Anyway, to sum it up, I don't know if AMD is desperately trying to cut cost or trying to help gamers get a GPU. My take is more the former than the latter. But assuming the latter, I feel they have completely missed the point. Even from a cost-cutting perspective, they should be more thoughtful about what they cut out, as opposed to throwing everything out just so that it fits the sub-$200 price range. The RX 6500/6400 series is like a plate of food that has been cost-cut so badly that, while it can still fill the stomach, it is terrible to the taste buds. Would people have paid a little more for better taste? I believe so.


----------



## RJARRRPCGP (Jan 20, 2022)

watzupken said:


> Anyway to sum it up, I don't know if AMD is desperately trying to cut cost or trying to help gamers get a GPU. My take is more of the former than latter. But assuming the latter, I feel they have completely missed the point. Even from a cost cutting perspective, they should be more thoughtful about what they are cutting out, as oppose to throwing everything out just so that it fits the sub 200 price range. The RX 6500/ 6400 series is like a plate of food that have gone through a cost cut that badly, that while it can still fill the stomach, it is terrible to the taste buds. Would people have paid a little more for better taste, I believe so.


The RX 6500 XT, reminds me of the dreaded RX 5300!


----------



## watzupken (Jan 20, 2022)

Jism said:


> Ok, I know what I'm going to write is stupid, but:
> 
> If you have a reasonable board, you should be able to increase the PCIe frequency by 4 to 12 MHz. Do you have any idea how much extra bandwidth that provides for a limited card like this? I suggest you try it out. Other than that: even though it's backwards compatible, it's obviously designed for use with PCIe 4.0, not 3.0.


To me, a product should make sense out of the box and not require advanced tinkering to run better. And truth be told, 4 MHz is not going to make a big difference, and it may cause other components on your board to run out of sync or become unstable. It is true that it's obviously made for PCIe 4.0, but the target market is mostly on PCIe 3.0. It is a bad mismatch, and a bad decision to make a product that is not suitable for the target market.



RJARRRPCGP said:


> The RX 6500 XT, reminds me of the dreaded RX 5300!


Objectively, the performance is not terrible (on PCIe 4.0, at least), but it's completely unexciting. The missing features and limited PCIe lanes are the kickers. Anyway, I am done bashing it. Time to move on.


----------



## RJARRRPCGP (Jan 20, 2022)

watzupken said:


> To me, a product should make sense out of the box and not require some advanced tinkering to make it run better. And truth be told, 4 MHz is not going to make a big difference, and it may potentially cause other components on your board to run out of sync or become unstable. It is true that it is obviously made for PCI-E 4.0, but the target market is mostly on PCI-E 3.0. So it is a bad mismatch and a bad decision to make a product that is not suitable for its target market.


I haven't been this disgusted with Radeon in a while!


----------



## Jism (Jan 20, 2022)

True,

I'm just saying that some extra performance can always be extracted from hardware, whether you OC the GPU or the memory. With proper stability testing it can last for years.

And BTW, even 4 MHz on the PCI-E 4.0 reference clock works out to a few hundred extra megabytes per second.
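As a rough sketch of that claim (a back-of-the-envelope estimate only, assuming theoretical link rates and linear scaling with the 100 MHz reference clock; real-world throughput is lower):

```python
# Rough, hypothetical estimate of the bandwidth gained by raising the
# PCIe reference clock from 100 MHz to 104 MHz on a PCIe 4.0 x4 link.

def pcie_bandwidth_gbs(transfer_rate_gts: float, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s (gen 3+ use 128b/130b encoding)."""
    return transfer_rate_gts * lanes * (128 / 130) / 8

baseline = pcie_bandwidth_gbs(16, 4)   # PCIe 4.0 x4 at the stock 100 MHz clock
overclocked = baseline * 104 / 100     # +4 MHz, assuming linear scaling

print(f"stock: {baseline:.2f} GB/s")                          # ~7.88 GB/s
print(f"extra: {(overclocked - baseline) * 1000:.0f} MB/s")   # ~315 MB/s
```

So a 4 MHz bump would indeed be on the order of 300 MB/s of theoretical headroom, though whether the card can use it is another matter.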


----------



## nguyen (Jan 20, 2022)

AMD could have just made it PCIe x8 and halved the number of complaints, but no... as the old saying goes, "save a penny, lose a dollar".

Here the 6500 XT is selling for a slightly higher price than the 1650 Super, though the 1650 Super is the superior GPU for people without a PCIe Gen 4 platform.


----------



## Mistral (Jan 20, 2022)

erek said:


> you considering getting one?


I do have one PC with an HD 6850 hooked to an old TV in the basement - but that system is on PCIe 2.0 - so probably not. How about you?


----------



## seth1911 (Jan 20, 2022)

In memory-intensive games it's mostly behind an RX 570 from 2018 that cost $200, and the 570 is just a relabeled RX 470.


----------



## Jism (Jan 20, 2022)

AMD Radeon RX 6500 XT Review: A Bad, Really Bad Graphics Card

"How to start this review... this thing is so bad, it's really spoiled the mood for me. In short, this is the worst GPU release since I..."

www.techspot.com

Interesting. SAM was enabled here, offering roughly another 5% performance improvement on PCI-E 4.0.


----------



## erek (Jan 20, 2022)

Mistral said:


> I do have one PC with an HD 6850 hooked to an old TV in the basement - but that system is on PCIe 2.0 - so probably not. How about you?


I do like collecting high-profile failure cards, but maybe they'd have to cancel or recall this one before there are too many of them for it to be that interesting. It is funny how the PC enthusiast market is now tech drama and tech soap opera. This launch and the review coverage are hilarious and possibly memorable. All this press is good for AMD, to be honest.


----------



## RJARRRPCGP (Jan 20, 2022)

It feels like the Radeon team flipped me off again! Maybe I should wait for those Intel discrete video cards!


----------



## Arkz (Jan 20, 2022)

What in the hell... So even on a PCI-E 4.0 board it's limited. And far more people will be on 3.0.
It averages around RX 580 performance, but in 2022 and at £225-£300+.

I got a Sapphire Nitro+ RX 580 4GB about 4 years ago for £170. And that was one of the more expensive versions.


----------



## ARF (Jan 20, 2022)

Mistral said:


> Now watch it sell out to the very last unit....



Oh, well, if the overall quantity of units is still two or three, I get what you tried to tell us 


The reality is that this is a very evil launch from AMD. Can't believe it is actually real..


----------



## W1zzard (Jan 20, 2022)

Jism said:


> AMD Radeon RX 6500 XT Review: A Bad, Really Bad Graphics Card
> 
> 
> How to start this review... this thing is so bad, it's really spoiled the mood for me. In short, this is the worst GPU release since I...
> ...


I've been using SAM for all my testing since forever


----------



## ARF (Jan 20, 2022)

This should cost no more than $79 and be the lowest entry-level card of the generation. Anything above that is the manufacturer ripping customers off..


----------



## BSim500 (Jan 20, 2022)

Jism said:


> Ok i know what i'm going to write is stupid now, but,
> 
> If you do have a reasonable board, you should be able or capable to increase the PCI-E frequency by either 4 to 12Mhz. You have any idea how much extra bandwidth that provides for a limited card like that? I suggest you try out. Other then that: even tho it's backwards compatible, it's obviously designed for use with PCI-E 4.0 and not 3.0.


OCing the PCIe bus is generally a bad idea. I've seen "silent errors" on storage devices from doing that, and OCing the PCIe bus could easily cause such problems with NVMe drives that use the same bus.


watzupken said:


> Anyway to sum it up, I don't know if AMD is desperately trying to cut cost or trying to help gamers get a GPU. My take is more of the former than latter. But assuming the latter, I feel they have completely missed the point. Even from a cost cutting perspective, they should be more thoughtful about what they are cutting out, as oppose to throwing everything out just so that it fits the sub 200 price range. The RX 6500/ 6400 series is like a plate of food that have gone through a cost cut that badly, that while it can still fill the stomach, it is terrible to the taste buds. Would people have paid a little more for better taste, I believe so.


The problem with gamers being unable to get GPUs isn't specs but distribution. If AMD/nVidia really wanted gamers to have them, they'd have long ago set up some way of selling them directly - one GPU per address per year, domestic addresses only. AMD also wouldn't have blocked the 4000 APUs / 5300G from retail during the period of the worst shortages. And I'm pretty sure AMD could have designed an "all-in-one" MATX board with a 75-120 W dGPU onboard in place of the PCIe slots, underneath an "MSI Aero ITX style" short heatsink + single 92mm fan (like how dGPUs are mounted in laptops / some custom OEM motherboards, but in a standard MATX board for the retail market). Lots of things could have been done to help budget PC gaming, but at this point they aren't even trying to hide the fact that they aren't even trying to fix the market.


----------



## ARF (Jan 20, 2022)

BSim500 said:


> The problem with gamers being unable to get GPUs isn't specs but distribution. If AMD/nVidia really wanted gamers to have them, they'd have long ago set up some way of selling them directly - one GPU per address per year, domestic addresses only. AMD also wouldn't have blocked the 4000 APUs / 5300G from retail during the period of the worst shortages. And I'm pretty sure AMD could have designed an "all-in-one" MATX board with a 75-120 W dGPU onboard in place of the PCIe slots, underneath an "MSI Aero ITX style" short heatsink + single 92mm fan (like how dGPUs are mounted in laptops / some custom OEM motherboards, but in a standard MATX board for the retail market). Lots of things could have been done to help budget PC gaming, but at this point they aren't even trying to hide the fact that they aren't even trying to fix the market.



I agree with your very good points.
The situation is only getting worse, and it will come back to bite them. You know the universal law - what goes around comes around. Hopefully it will be as painful for them as it is for us now..


----------



## Mathragh (Jan 20, 2022)

Risking further derailment of the thread,...
Perhaps this is one of the reasons for the odd design choices made with this card: 



https://twitter.com/i/web/status/1483830238333509632
AMD might be combining the Navi 24 chip, on which the 6500XT is based, with a "Core Complex Die" and an "IO Die" in order to build a chiplet-based APU. Who knows..


----------



## ARF (Jan 20, 2022)

Mathragh said:


> Risking further derailment of the thread,...
> Perhaps this is one of the reasons for the odd design choices made with this card:
> 
> 
> ...



So, it is an integrated-class die which, for some unknown reason, is being marketed and sold as a fully competent desktop graphics card.

Very misleading and ugly of AMD!


----------



## InVasMani (Jan 20, 2022)

Gigabyte is trying to out-compete AMD's bad design with an equally bad cooler that makes zero sense on this card. Why stop there, Gigabyte? Water-cool it or throw a TEG on it.


----------



## trsttte (Jan 20, 2022)

Mathragh said:


> Risking further derailment of the thread,...
> Perhaps this is one of the reasons for the odd design choices made with this card:
> 
> 
> ...



Wish that were true!

I doubt that's anything more than speculation though; they've been playing with chiplets long enough that they could already have done that if they wanted to.


----------



## Jism (Jan 20, 2022)

Arkz said:


> What in the hell... So even on a PCI-E 4.0 board it's limited. And far more people will be on 3.0.
> Average around RX580 perf but in 2022 and £225-£300+
> 
> I got a Sapphire Nitro+ RX580 4GB like 4 years ago for £170. And that was one of the more expensive versions.



Yes,

but the RX 580 is pretty much discontinued. You can only buy it second-hand, and system builders or OEM integrators aren't going to reuse "old" hardware.

If you were a system builder you wouldn't be pairing this with a board that's only capable of PCI-E 3.0. You'd select a Ryzen with a modest 5x0 chipset and a 3x00 or 5x00 series CPU.

I agree that this is a missed chance. I thought that with the whole Navi thing AMD was about to set a new standard, like twice the RX 580's performance (or 1080 Ti levels) for the same price.

Scalping is another thing, however, and the growing number of scalpers is contributing to this shortage as well, reselling the same card on eBay for $600.



BSim500 said:


> OCing the PCIe bus is generally a bad idea. I've seen "silent errors" on storage devices from doing that, and OCing the PCIe bus could easily cause such problems with NVMe drives that directly use same bus.



Well, it's said you should test any of your overclocks if you're planning to run your system 24 hours a day. I've done quite some overclocking in the past, and just a slight increase of the PCI-E bus could give slightly faster graphics. However, NVMe SSDs that are hooked up to the PCI-E bus are more sensitive, just like NICs, and couldn't cope with speeds like 112 MHz.


----------



## InVasMani (Jan 20, 2022)

Mathragh said:


> Risking further derailment of the thread,...
> Perhaps this is one of the reasons for the odd design choices made with this card:
> 
> 
> ...



Could be intended for an MCM GPU and just the starting stages of it. Could be meant for Rembrandt, but DDR5 memory production postponed that and they're actively sitting on a stockpile of these dies and want to make use of them. Or it could just be a really questionable and bad design. Why was the Infinity Cache cut down so much, and the memory bus, and the PCIe bus, and the TMUs and the ROPs and the shaders? The list goes on.


----------



## WhoDecidedThat (Jan 20, 2022)

Mathragh said:


> Perhaps this is one of the reasons for the odd design choices made with this card:
> 
> 
> 
> ...


They would have to build a whole new IO die to support the bandwidth requirements of the CPU + GPU.



ARF said:


> The situation is only getting worse, and it will bite them back. You know the universal law - what goes around, comes around. It will bite them back in their backsides.. Hopefully, it will be painful for them, as is painful for us now..


Oh, it will be painful for them when the crypto market dumps millions of cards onto the second-hand market for cheap. Nvidia's consumer division is especially going to suffer, since they don't have an iGPU/console market to fall back on. AMD RTG at least knows that if a PlayStation 6 and whatever the next Xbox is called happen, they are going to get the contract to design the chip using RDNA 4/5/whatever.


----------



## Prima.Vera (Jan 20, 2022)

Sorry, this card is worth no more than $50 including all taxes.
People who are going to buy this junk are probably on a low budget, running mobos with PCI-E 3.0, so there's no way the card is worth more than $50.
The $200 MSRP is beyond callous for this silicon waste dump.


----------



## 80-watt Hamster (Jan 20, 2022)

Prima.Vera said:


> Sorry, this card is worth no more than 50$ including all taxes.
> People who are going to buy this junk are probably on a low budget running mobos with PCI-E 3.0,so no way the card is worth more than 50$.
> The 200$ MSRP price is beyond callous for this silicon waste dump.



Realistic price floor for any newly-designed card is probably about $100, which I'd argue is fair.  And margins will be slim at that price, if they're even positive.


----------



## SamuelL (Jan 20, 2022)

There is no reason for this card to exist outside of providing additional display outputs for office PCs. There will be quite a few disappointed kids who paid $300+ for one of these, only to discover it isn't as good as entry-level cards from 5+ years ago.

I suppose the one upside may be scalpers finally getting burned on their habits after demand for these dries up and no takers are found on eBay at $300-400...


----------



## Mussels (Jan 21, 2022)

Jism said:


> Ok i know what i'm going to write is stupid now, but,
> 
> If you do have a reasonable board, you should be able or capable to increase the PCI-E frequency by either 4 to 12Mhz. You have any idea how much extra bandwidth that provides for a limited card like that? I suggest you try out. Other then that: even tho it's backwards compatible, it's obviously designed for use with PCI-E 4.0 and not 3.0.


Oh i missed this question:

On most systems you can get a max of 103 MHz on the PCI-E bus before devices stop working.
You can sometimes go higher, at the cost of NVMe drives, links dropping to a previous gen (3.0/2.0), or various onboard devices stopping working.


----------



## RJARRRPCGP (Jan 21, 2022)

Oh, and about raytracing being ruled a fail, that reminds me of a YouTube video I came across a while ago, with the Nvidia T400:
Gaming With The New Nvidia T400 - An Entry Level RTX Pro GPU - YouTube


----------



## ratirt (Jan 21, 2022)

Damn, the RX 5500 XT is faster than this thing. Not by much, but still. AMD, you are gonna get smacked for this for sure.


----------



## RJARRRPCGP (Jan 21, 2022)

ratirt said:


> Damn rx 5500 XT is faster than this thing. Not by much but still. AMD you are gonna get smacked for this for sure.


Reminds me of the AGP-era with Radeon 9500 and 9550!


----------



## BlackSwan (Jan 21, 2022)

I've never seen a new card beaten by its equivalent from the previous generation.  That's ridiculous.


----------



## Dwarden (Jan 21, 2022)

The shoddy AMD marketing of the 6500 XT brand:
it's 'supposedly' a replacement for the 5500 XT, but in reality it's more like the 5300 series.

Let's point out that the 5500 and 5300 have 20% more transistors than the 6500 XT!
Then point out that the 5500 has a 128-bit and the 5300 a 96-bit memory bus, versus the 64-bit bus of the 6500 XT.
Note that the 5500 came with 8 GB VRAM; only the mobile 5500M and the 5300 came with 4 GB VRAM.
Then realize that the 5500 and 5300 are equipped with PCIe 4.0 x8 lanes.
Then realize that video encode was tossed out of the 6500 XT.
Add insult to injury: the 6500 XT has half the outputs of the 5500 XT.
And the 5500 and 5300 came out 28 and 26 months ago.

Now you wonder how a graphics card
with a cheaper GPU chip (fewer transistors, on a 6nm node),
with a simpler PCB,
with cheaper and smaller memory,
costs more than $99 MSRP?

Speaking of which, it is shocking that it needs more than 75 W...
AMD missed the chance to get rid of that extra connector.

If it had been named 6400 XT and priced lower at MSRP,
then maybe there would be far fewer negative reactions.


----------



## RJARRRPCGP (Jan 21, 2022)

BlackSwan said:


> I've never seen a new card beaten by its equivalent from the previous generation.  That's ridiculous.


Thus, I'm now waiting to see what the specs are going to be for Nvidia's 3050!


----------



## trsttte (Jan 21, 2022)

RJARRRPCGP said:


> Thus, I'm now waiting to see what the specs are going to be for Nvidia's '3050!



There was no 2050, so at least nvidia is safe there. Also, according to nvidia's own benchmarks, the 3050 is infinitely better than the previous-generation 1650/1050 (RTX on, which is still a pretty stupid comparison to begin with).


----------



## MikeMurphy (Jan 23, 2022)

It's a mobile card adapted for desktop use due to GPU shortage, folks.


----------



## arni-gx (Jan 23, 2022)

Although the PCIe 4.0 x4 vs PCIe 3.0 scaling here is very bad compared with how the RTX 3000 cards scale from PCIe 4.0 to 3.0, this card should be enough for any desktop PC used for daily office work and light gaming.....


----------



## RJARRRPCGP (Jan 23, 2022)

The bad thing is it's acting like it's the Radeon HD days and earlier, with no video encoding, which means you'd better prepare to OC your CPU cores just to record a game video.
You may need an "early-grave" amount of CPU Vcore just to record without lag, for all I know! Better get those CPU cores ready for x264!

We may see a lot of Ryzen owners do a manual all-core OC as a result, and we may see a lot of Ryzen CPU failures in the following months. I'm not joking or trolling here.


----------



## mama (Jan 23, 2022)

It's a mobile chip.  AMD thought they could take a shortcut and make it a low-end GPU.  They obviously ran into problems.


----------



## Lew Zealand (Jan 24, 2022)

RJARRRPCGP said:


> The bad thing is acting like it's the Radeon HD days and earlier, with no video encoding, so that means you better prepare to OC your CPU cores just to record a game video.
> You may need an "early-grave" CPU Vcore amount, just to record without lag, for all I know! Better get those CPU cores ready for x.264!
> 
> We may see a lot of Ryzen owners who do a manual all-core OC, as a result and we may see a lot of Ryzen CPU failures in the following months. I'm not joking or trolling there.



Ryzens have had 16 cores in them for over 2 years now and 8 cores for almost 5.  So they're set for CPU encoding.

I'm _totally sure_ that's what AMD was counting on when releasing the 6500XT for desktop...


----------



## chrcoluk (Jan 24, 2022)

How is your bus bandwidth usage so high? On a gen 3 x16 slot my 3080 has never used more than 19%. So not even 4 lanes of gen 3, and it's a more powerful GPU.

On the principle that a low-end card is being released, the target market almost certainly will not have gen 4 boards, so on that basis it's an odd choice; but on the other hand, the results don't make much sense to me, as PCIe slots have typically been massively over-provisioned for GPUs.

Was it verified that the slot wasn't stuck in 1.1 power-saving mode?
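For what it's worth, the 19% figure is consistent with that reading; a quick sanity check using theoretical link rates only (and assuming the utilization number is a peak over the whole x16 link):

```python
# Sanity check: 19% peak utilization of a gen 3 x16 link is less data
# than a gen 3 x4 link can carry, so in theory an x4 link would fit it.

def pcie_bandwidth_gbs(transfer_rate_gts: float, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s (128b/130b encoding, gen 3+)."""
    return transfer_rate_gts * lanes * (128 / 130) / 8

gen3_x16 = pcie_bandwidth_gbs(8, 16)   # ~15.75 GB/s
gen3_x4 = pcie_bandwidth_gbs(8, 4)     # ~3.94 GB/s
peak_observed = 0.19 * gen3_x16        # ~2.99 GB/s

print(peak_observed < gen3_x4)         # True
```

The caveat is that averages and peaks hide bursts: a card that spills out of VRAM hammers the bus in short spikes, which is where a narrow link hurts even when average utilization looks low.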


----------



## RJARRRPCGP (Jan 24, 2022)

Lew Zealand said:


> Ryzens have had 16 cores in them for over 2 years now and 8 cores for almost 5.  So they're set for CPU encoding.
> 
> I'm _totally sure_ that's what AMD was counting on when releasing the 6500XT for desktop...


Either that, or ramp up the RAM controller and RAM speeds for a fast single pass, but then I wouldn't be surprised if the files are bigger than what I get with the on-GPU encoder in ReLive.

Looks like more gamers will be using x264, and wondering what settings to use for it.


----------



## chrcoluk (Jan 24, 2022)

RJARRRPCGP said:


> Either that, or ramp up the RAM controller and RAM speeds for a fast single-pass, but then I wouldn't be surprised if the files are bigger than what I get with the on-GPU encoder, with ReLive.
> 
> Looks like more gamers will be using x.264, wondering what settings to use for x.264.


I think GPU encoding is OK if you're streaming, but the file sizes are intolerable for recording. They're already huge with x264, but with GPU encoding the increase is absolutely huge on top of what is already big. That, to me, is the real bottleneck: storage space. Hence I still use x264 recording when I can, but of course for demanding games played locally on the PC I switch over to the GPU encoder because I have to, and I suppose I'll need to do some post-recording x264 encoding to get the sizes down.

I use the following for x264 recording:

CRF 24
keyframe 4
cpu preset faster
profile none
tune film
options level=4.2 vbv-bufsize=30000 vbv-maxrate=30000 profile=high422 bframes=5 rc-lookahead=20 threads=15 keyint_min=29

The quality will be similar to the medium cpu preset but without medium's cpu usage. My CRF is higher than many use; I did CRF 18 at one point but couldn't tolerate the file sizes, and I honestly cannot see a difference just from watching the videos. I did get a visible difference from the options I set, though.


----------



## Wasteland (Jan 24, 2022)

chrcoluk said:


> How is your bus bandwidth usage so high? On a gen 3 x16 slot my 3080 has never used more than 19%.  So not even 4 lanes of gen 3 and its a more powerful GPU.
> 
> On the principle a low end card is released, the target market almost certainly will not have gen 4 boards, so on that basis its an odd choice, but on the other hand, the results dont make much sense to me as pcie slots have typically been massively over provisioned for gpu's.
> 
> Was it verified if the slot wasnt stuck in 1.1 power saving mode?


The 3080 has 10 GB of VRAM, compared with 4 GB on the 6500 XT.  You can get away with low VRAM, or you can get away with low PCI-e bandwidth, but you can't get away with both.  That's the common theme of all the reviews I've seen.  And yes, other reviewers have noted the PCI-e 3.0 issue; it isn't just a quirk with W1zzard's testing. 

One of the morbidly amusing things about this release is that AMD erased their "4 GB is not enough" marketing page, just in time for their release of what is not just a 4 GB card--no, that would just be banal hypocrisy; AMD went a step further and released a 4 GB card that is _uniquely constrained_ by its low VRAM, at least when you pair it with a PCI-e 3.0 rig. It's as if AMD went out of their way to prove their freshly disavowed marketing campaign correct.

Other 4 GB cards that appear in review benchmarks for this thing, many of them sporting extremely dated tech, actually tend to come out looking ok, lol.
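A crude way to see why the two constraints compound (theoretical one-way link rates only; real transfers add latency and protocol overhead, and gen 2 uses 8b/10b encoding rather than 128b/130b):

```python
# When a 4 GB card runs out of VRAM, assets spill over the PCIe bus.
# A rough, theoretical look at how long moving 1 GB takes per link speed.

def pcie_bandwidth_gbs(transfer_rate_gts, lanes, encoding=128 / 130):
    """Theoretical one-way bandwidth in GB/s for the given link."""
    return transfer_rate_gts * lanes * encoding / 8

links = {
    "PCIe 4.0 x4": pcie_bandwidth_gbs(16, 4),
    "PCIe 3.0 x4": pcie_bandwidth_gbs(8, 4),
    "PCIe 2.0 x4": pcie_bandwidth_gbs(5, 4, encoding=8 / 10),  # 8b/10b
}
for name, bw in links.items():
    print(f"{name}: {bw:.2f} GB/s, {1000 / bw:.0f} ms per GB")
```

A card with enough VRAM rarely spills in the first place, which is why older 4 GB cards on wide x16 links (or bigger cards on narrow links) get away with what this one can't.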


----------



## LabRat 891 (Jan 24, 2022)

MikeMurphy said:


> It's a mobile card adapted for desktop use due to GPU shortage, folks.


Yup. It's not often we get access to mobile-optimized silicon in desktop.


mama said:


> It's a mobile chip.  AMD thought they could short cut and make it a low end GPU.  They obviously ran into problems.


Problems in marketing. Their marketing team dropped the ball BIG TIME. There are a half-dozen+ 'angles' to sell this card on. Yet many reviewers didn't even get one for testing, and few touched on any of the card's use cases or strengths. Instead, the publicity is either VERY negative or dishonestly positive.
The fact it's 'available' is worth celebrating; 6nm die yields must be bretty gud. The fact the 6500 XT isn't a further cut-down Navi 23 implies a few conjectural possibilities. Otherwise, the 6400 (adopted, OEM only) and 6400 XT monikers would've been more fitting, as others have mentioned. The model number alone is overselling it, especially if you're comparing to a 5500 XT. They're hit-and-miss 'equals', when we expect at least some perf/W + raw perf uplift.


----------



## 80-watt Hamster (Jan 24, 2022)

LabRat 891 said:


> The model number alone is overselling it; especially if you're comparing to a 5500XT. They're hit and miss 'equals', when we expect at least some perf/W + raw perf. uplift.



It's not even the first time this has happened.  What was it, 5850 --> 6850 that also went backward?  Something in the upper end of the HD 6XXX range, anyway.


----------



## RJARRRPCGP (Jan 24, 2022)

LabRat 891 said:


> Yup. It's not often we get access to mobile-optimized silicon in desktop.


Unfortunately, this isn't good news like it was in the late-socket-462-days!


----------



## Cutechri (Jan 24, 2022)

This product is just depressing.


----------



## Mussels (Jan 25, 2022)

Thought this might interest people: the 6500XT has some *very* erratic pricing


----------



## 80-watt Hamster (Jan 25, 2022)

Mussels said:


> Thought this might interest people: the 6500XT has some *very* erratic pricing
> 
> View attachment 233845



That's in AUD, presumably?


----------



## defaultluser (Jan 25, 2022)

Mussels, this is pretty normal for launch week. As of today, all resellers have run away screaming:



			https://www.techpowerup.com/forums/threads/gpu-stock-status-microcenter-newegg.283761/page-13#post-4690581


----------



## 80-watt Hamster (Jan 25, 2022)

defaultluser said:


> Mussels,​This is pretty normal for launch week .  As-of today, all Reseller's hhave run away screaming:
> 
> 
> 
> https://www.techpowerup.com/forums/threads/gpu-stock-status-microcenter-newegg.283761/page-13#post-4690581



Er, what?  Sorry; I don't follow what you mean at all.


----------



## kapone32 (Jan 25, 2022)

In my area most of these cards are sold out. I do understand it, though: at $339 vs $599 for a 6600 there are tremendous cost savings, and it is faster than a vanilla 1650. You still get FSR, ReBAR and whatever AMD is cooking up to improve performance. Those features can no longer be ignored.


----------



## Lew Zealand (Jan 25, 2022)

kapone32 said:


> In my area most of these cards are sold out. I do understand it though for $339 vs $599 for a 6600 there is tremendous cost savings and it is faster than a vanilla 1650. You still get FSR, REbar and whatever AMD is cooking up to improve performance. Those features can no longer be ignored.



Yes, but only (maybe) if you're using a PCIe 4.0 Mobo with a PCIe 4.0 CPU.  I'd rather have the 1650 on PCIe 3.0, but IMO the 6600 for $470 (MSI Mech @Newegg) is a better PCIe 3.0 "value" (LOL) than both.


----------



## BlackSwan (Jan 25, 2022)

What these reviews show, though, is that the 6500 XT is a passable entry-level GPU if you have a Gen 4 PCIe interface, but for many people with older mobos you'd be better off with a previous-gen 5500 XT.
In addition, Overclockers has the old 8 GB 5500 XT (out of stock, obviously) showing its previous price, which is about £30 less than the 6500 XT.


----------



## kapone32 (Jan 25, 2022)

Lew Zealand said:


> Yes, but only (maybe) if you're using a PCIe 4.0 Mobo with a PCIe 4.0 CPU.  I'd rather have the 1650 on PCIe 3.0, but IMO the 6600 for $470 (MSI Mech @Newegg) is a better PCIe 3.0 "value" (LOL) than both.


I wish I could get a 6600 for $470 I would already have bought one. Unfortunately they are a minimum of $549 here in Canada. The thing is you can get cheap B550 boards like the MSI Pro https://www.newegg.ca/msi-b550m-pro...8XEe1tS2v6o8o60agqQdlzD2OKZjpzJhoCAfYQAvD_BwE


----------



## seth1911 (Jan 25, 2022)

Today I saw an interesting price for an Asus RX 6500: it was about 509€, and the second listing was an RTX 2060 12GB for 512€. 
(3rd was a 6600 for 549€)

Idk who is buying an RX 6500 at that price when they can get a 2060 12GB.


----------



## RJARRRPCGP (Jan 25, 2022)

seth1911 said:


> Today i saw a interessting price for a Asus RX 6500 it was about 509€ and the second listing was a RTX 2060 12GB for 512€.
> (3rd was a 6600 for 549€)
> 
> Idk who is buying a RX6500 for this price if he can get a 2060 12GB


I would take an RTX 2060 over an RX 6500 XT, anyday!


----------



## kapone32 (Jan 25, 2022)

seth1911 said:


> Today i saw a interessting price for a Asus RX 6500 it was about 509€ and the second listing was a RTX 2060 12GB for 512€.
> (3rd was a 6600 for 549€)
> 
> Idk who is buying a RX6500 for this price if he can get a 2060 12GB


Asus is crazy with their Strix pricing: a 6600 for $699, while the MSI Mech 6600 XT was $649.


----------



## Mussels (Jan 25, 2022)

80-watt Hamster said:


> That's in AUD, presumably?


Yeah, Dollary-Doos



RJARRRPCGP said:


> I would take an RTX 2060 over an RX 6500 XT, anyday!



Heh, here a 2060 is $730 for the 6GB and $950 for the 12GB, vs $320 for an entry-level 6500 XT. I can see which one would sell.


----------



## BlackSwan (Jan 27, 2022)

The worst thing is the 6500 XT is a failure which affects everyone. The 3050 will have no competition, hence demand will outstrip supply and all GPU prices will remain high.


----------



## Kissamies (Jan 28, 2022)

Could be an interesting card for my X58 rig, but the performance hit from PCIe 2.0 is just too much, dammit.


----------



## RJARRRPCGP (Jan 28, 2022)

MaenadFIN said:


> Could be an interesting card for my X58 rig, but the performance hit from PCIe 2.0 is just too much, dammit.


Likely incompatible with the BIOS; don't be surprised if the BIOS tells you that every setting is lost and makes you reconfigure it! It's a known issue with the AMI BIOS on pre-UEFI Asus X58 motherboards.
It makes it look like the battery isn't connected or is low!


----------



## Kissamies (Jan 29, 2022)

RJARRRPCGP said:


> Likely incompatible with the BIOS, don't be surprised if the BIOS tells you that every setting is lost and makes you reconfigure the BIOS! Known issue with AMI BIOS on Asus X58 motherboards that are pre-UEFI.
> Makes it look like the battery isn't connected or low!


Good point, I always forget that newer cards don't always like pre-UEFI hardware.


----------



## RJARRRPCGP (Jan 30, 2022)

MaenadFIN said:


> Good point, I always forget that newer cards don't always like pre-UEFI hardware.


Vice-versa: it's the pre-UEFI boards, especially AMI BIOS on Asus, which don't like it at all. The symptoms look like the late-model cards overwrite part of the RAM regions used by the pre-UEFI BIOS. Something in the UEFI-generation cards is causing a CMOS access issue.

It causes these strange symptoms: the BIOS menu is laggy, and when you unplug the PSU, for example to move it, you'll be forced to re-enter everything. And prepare for the BIOS to search for a network drive and make you wait.

I only noticed this BIOS disease symptom when I unplugged the PC because of a thunderstorm, IIRC (in 2019, when I wanted to use my Radeon RX 580).
This issue made me get another socket 1366 motherboard, only to get the same thing!


----------



## Z-GT1000 (May 6, 2022)

AMD really needs to re-enable PCIe 4.0 support in AGESA for the old A320, B350, X370, B450 and X470. It would be interesting to test these cards on a board with one of these chipsets and the last BIOS that supported PCI-e 4.0, before AMD blocked its use in subsequent AGESA updates. Now that they have removed the lock on the processors, it makes no sense that someone can use a 5800X3D on an X370 board while AMD artificially locks the processor's PCI-e 4.0 lanes.


----------



## trsttte (May 6, 2022)

Z-GT1000 said:


> AMD really needs to reenable PCIe 4.0 support in agesa for the old A320, B350, X370, B450 and X470, It would be interesting to test these cards with some board with these chipsets and the latest bios that support PCI-e 4.0 before AMD blocked its use in the following Agesa updates, now that they have removed the lock on the processors it makes no sense that someone could use a 5800X3D on an X370 board and AMD artificially lock the PCI-e 4.0 lines of the processor.



There are extra signal-integrity requirements with PCIe 4.0, and older boards weren't designed for it. Some might be able to use it, but it would be a mess to certify and debug, so having it locked is a sensible decision, unlike CPU support, which was promised from the start and arbitrarily locked out for a long while.


----------



## Kissamies (May 6, 2022)

trsttte said:


> There are extra signal integrity requirements with pcie4.0, older boards weren't designed for it, some might be able to use it but it would be a mess to certify and debug so having it locked is a sensible decision unlike cpu support which was promised from the start and arbitarily locked out for a long while


Yeah. Wasn't B550A actually a B450 but with PCIe4.0 support?


----------



## kapone32 (May 6, 2022)

Is anyone who is complaining about the 6500 XT actually an owner of one? Let me tell you what I like about the 6500 XT.

1. The lowest-priced 6500 XT is $229 Canadian, or $177 US
2. Combined idle power (5600 + 6500 XT) is 13 watts
3. The memory has no problem OCing to 2400 MHz
4. The GPU OCs to 2930 MHz, and the sustained boost clock is 2858 in most games
5. The max power draw is about 90 watts
6. The largest cards are only dual-slot
7. The PSU does not need to be more than 400 W
8. It blows the 1650 away
9. It is cheaper than the 3050
10. It supports HDMI 2.1 and DP 1.4

In particular I want to talk about HDMI 2.1, as that is what a 4K TV needs to support VRR (FreeSync) at 120 Hz. The maximum refresh isn't the key point, though; it's the minimum I care about. Let's say you are playing Cyberpunk at 1440P on your 4K TV: RSR is not required, as applying FSR Quality will get you around 40 FPS, which might seem low but is within the VRR range. Anyone who plays TWWH2 with a FreeSync monitor understands what I am talking about. Even 4v4 20-stack battles are smooth as butter, even though the FPS can jump from 40 to 130 as the battle ensues.



kapone32 said:


> Is anyone who is complaining about the 6500 XT actually an owner of one? Let me tell you what I like about the 6500 XT.
> 
> 1. The lowest-priced 6500 XT is $229 Canadian, or $177 US
> 2. Combined idle power (5600 + 6500 XT) is 13 watts
> ...


BTW, I am using it with a 5600G and don't see what all the noise about PCIe 3.0 is about.
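For anyone curious about the raw numbers behind the PCIe 3.0-vs-4.0 debate in this thread, here's a quick back-of-the-envelope sketch (the function name is my own; the link rates and 128b/130b encoding figures come from the PCIe spec, not from any post here):

```python
# Rough per-direction PCIe link bandwidth, ignoring packet/protocol overhead.
# PCIe 3.0 and 4.0 both use 128b/130b line encoding.

def pcie_bandwidth_gbs(rate_gt_per_lane: float, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for a PCIe link."""
    encoding = 128 / 130          # 128b/130b line-encoding efficiency
    bits_per_s = rate_gt_per_lane * lanes * encoding
    return bits_per_s / 8         # bits -> bytes

gen3_x4 = pcie_bandwidth_gbs(8.0, 4)    # PCIe 3.0: 8 GT/s per lane
gen4_x4 = pcie_bandwidth_gbs(16.0, 4)   # PCIe 4.0: 16 GT/s per lane

print(f"PCIe 3.0 x4 ~ {gen3_x4:.2f} GB/s")  # ~3.94 GB/s
print(f"PCIe 4.0 x4 ~ {gen4_x4:.2f} GB/s")  # ~7.88 GB/s
```

So the x4 link gives roughly 3.9 GB/s on a 3.0 board versus roughly 7.9 GB/s on 4.0, which is why the review only sees the drop in games that spill past the card's 4 GB of VRAM and have to stream assets over the bus.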


----------



## trsttte (May 6, 2022)

Lenne said:


> Yeah. Wasn't B550A actually a B450 but with PCIe4.0 support?



Yes, it was an OEM-only rebrand with PCIe 4.0 enabled for the SSD and GPU slots (the chipset itself still running at 3.0).


----------



## ratirt (May 9, 2022)

kapone32 said:


> Is anyone who is complaining about the 6500XT actually an owner of one? Let me tell you what I like about the 6500XT.
> 
> 1. The lowest price 6500Xt is $229 Canadian or $177 US
> 2. Combined idle power (5600, 6500XT) 13 Watts.
> ...


Well, with PCIe 3.0 it is slower than with PCIe 4.0, and in some games very noticeably so.

I actually checked some benchmarks and prices. It appears the 6500 XT in general has the same performance as the 1650 Super. I also checked pricing, and it seems the 6500 XT is cheaper than a 1650 Super in Norway. Actually, it's cheaper by a lot: $80-$120 depending on the card. Problem is, 1650 Supers are not in stock. It has probably been replaced by the 3050, but that one also costs more than a 6500 XT, by around $130 and up.
That makes me think that if the 6500 XT is an entry-level card, considering its price and what it offers in the current market, it is not that bad. The problem I see is that it's slower than a 5500 XT.


----------



## RJARRRPCGP (May 24, 2022)

Well, the RX 6500 XT is more of a pathetic joke now than ever, because I was able to get an RX 6600 XT for around $500 (including tax and any shipping fees) in very late March. Got it on March 28, so it only took a few days. No shuffle required. That was from Newegg.

The RX 6600 XT is spanking a lot of other cards!


----------

