
AMD Readies Radeon RX 6500 XT and RX 6400 Graphics Cards

RX6000 series cards are really easy to come by here, so not sure how they can justify such huge price increases just because the competition is all sold out.
Where is that? Regardless of location, I assume distributors and retailers are pushing prices just because they can. Both AMD and Nvidia's MSRPs this generation are pretty dumb, but for the most part they aren't insane like street prices, meaning that someone further down the distribution chain is padding their margins.
I agree with many when they say that the 1080 Ti is one of the best graphics cards ever made.
Yeah, that's likely true, especially when you factor in context like successive generations. I mean, it's a high-end, $700 MSRP GPU that is (barely) beaten by upper mid-range GPUs with inflated $379 MSRPs four years later, but ... that's four years. It used to be a year or two before a GPU went mostly obsolete, while the 1080 Ti is still good in most games today. That is pretty much unprecedented.
I can live with med-high settings, no need for ultra/very high etc
Sounds like a good approach, but keep in mind that upscaling can be pretty good too. Especially on a high density 2160p panel a lower resolution can be a good solution as well. That panel resolution gives a lot of flexibility!
That is a good question, but I don't think they do. As long as you have PCIe 3.0 or newer, you are good to go.
Most current GPUs require UEFI boot to work - AFAIK both the RTX 3000 and RX 6000 series do, at least.
 
Most current GPUs require UEFI boot to work - AFAIK both the RTX 3000 and RX 6000 series do, at least.
I guess you are right about the UEFI requirement.
Actually, there is a workaround :)
 
I agree with many when they say that the 1080 Ti is one of the best graphics cards ever made.
Let's be honest, Pascal was a huge jump after Kepler/Maxwell, but nothing has really improved since. The 1080, 2070 and 3060 offer similar performance with similar power consumption. Everybody is over the Moon and back with the performance of the 3080-3090, but it's no big feat with 300+ Watts. Even AMD likes to position their GPUs at the higher end of their efficiency curves with only incremental efficiency improvements across architectures.

So yes, the 1080 and 1080 Ti are awesome - everything that came after is essentially the same stuff with added ray tracing, DLSS and/or power consumption.
 
Where is that? Regardless of location, I assume distributors and retailers are pushing prices just because they can. Both AMD and Nvidia's MSRPs this generation are pretty dumb, but for the most part they aren't insane like street prices, meaning that someone further down the distribution chain is padding their margins.

Overclockers.co.uk

6600, 6600 XT, 6700 XT, 6800 XT, 6900 XT: loads of stock. Other stores are stocked up well too.


But sure, I appreciate the market is being milked for all it's worth, but the demand for one is screwing over the other.
 
Overclockers.co.uk

6600, 6600 XT, 6700 XT, 6800 XT, 6900 XT: loads of stock. Other stores are stocked up well too.


But sure, I appreciate the market is being milked for all it's worth, but the demand for one is screwing over the other.

The prices are not much different, if different at all, compared to the ones in Germany.
RX 6600 XT for 560.
RX 6700 XT for 730.
RX 6800: out of stock.
RX 6800 XT for 1160.
RX 6900 XT for 1350.

Avoid.
 
But why....
All prices are being inflated because anyone selling anything knows that people had nowhere to spend the money they accumulated during the pandemic, and everyone selling everything wants that money.
Also factor in the mining, the scalping and...how should I put this...truly special creatures on this planet paying any amount of money for anything, and there you go.
We're screwed.

What follows below is the AMD GPU situation at a top-5 e-tailer here in the butthole of the Universe.

Enjoy

RX 6600 for 700 euro
RX 6600 XT for 765 euro
RX 6700 XT for 1070 euro
RX 6800: out of stock.
RX 6800 XT for 1535 euro
RX 6900 XT for 1715 euro
 
AMD wants to bankrupt its VGA division...

Market share is a miserable 17%, and:

Overall GPU unit shipments decreased by -18.2% from last quarter, AMD shipments decreased by -11.4%, Intel's shipments decreased by -25.6%
 
Kinda interested to see if any board partners can find enough GDDR6 lying around to do an 8 GB factory-OC variant of the 6500 XT(X). Maybe to compete w/ the 2060 12GB?* lol.

*I have no idea if they're even remotely performance comparable.
 
I seriously wouldn't be surprised at all if even the slowest one would cost like 150-200USD/EUR..

GT 730 cards are still priced around $140 at Micro Center.
GT 1030 cards are priced around $120.
RX 550 is priced at $230.

If the RX 6400 comes in priced at under $200, I'd be surprised.
 
GT 730 cards are still priced around $140 at Micro Center.
GT 1030 cards are priced around $120.
RX 550 is priced at $230.

If the RX 6400 comes in priced at under $200, I'd be surprised.
What the hell, GT 730s are more expensive than GT 1030s?! Great pricing there..
 
AMD wants to bankrupt its VGA division...

Market share is a miserable 17%, and:
Source? Posting unattributed quotes without linking to the source is a pretty bad habit ...

Attempting to search for that (supposed) quote gave me no identical hits, but several highly informative ones published quite recently. For example, reports of the GPU market increasing ~20% YoY. Also, that quote mainly tells us that iGPU shipments are down (the inclusion of Intel tells us that iGPUs and dGPUs are counted together), which likely means that overall laptop shipments are down - which isn't exactly unlikely with the WFH boom abating after over a year. WFH buying has also likely shifted purchases away from the traditional back-to-school season. So, AMD might be losing market share, but they are still selling far more GPUs than last year - they just aren't able to increase their sales as much as Intel or Nvidia (which makes sense, as both of those are much larger companies and thus have more resources to shift around if needed).

AMD, just like Nvidia, are selling every single functional GPU die they are getting off the fab line, so it's quite doubtful that they're going bankrupt any time soon. Also, these prices have nothing to do with AMD - their MSRPs indicate what they are charging, while anything on top of that is most likely attributable to distributors and retailers. It might be that some of AMD's pricing has increased over initial MSRPs for SKUs that have been out for a while due to supply shortages or other factors (materials pricing etc.), but it's extremely unlikely that this is significantly affecting retail prices. That's just not how pricing for globally distributed consumer goods works.

Some simple math:
If AMD's market share for dGPUs in Q3 20 was 20% and in Q3 21 was 17%, while overall GPU shipments are up 20% YoY, then AMD's Q3 21 sales amount to 120/100 × 17 = 20.4% of the Q3 20 market size. That's an increase, if a small one. Remember, market share numbers are relative to the total size of the market, which is obviously not fixed. Nvidia are of course increasing far more (selling an equivalent of 99.6% of the Q3 20 market in Q3 21), but I doubt AMD cares much. They are supply limited by TSMC and packaging, and are selling every single product they can make.
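
To spell that arithmetic out (using the assumed figures from above - 20%/17% shares and ~20% YoY market growth - none of which are confirmed numbers):

```python
# Back-of-envelope check of the market share math above.
# Assumed inputs (taken from the post, not verified figures):
share_q3_20 = 0.20    # AMD dGPU market share, Q3 2020
share_q3_21 = 0.17    # AMD dGPU market share, Q3 2021
market_growth = 1.20  # total GPU shipments up ~20% YoY

# Express each vendor's Q3 21 unit sales as a fraction of the Q3 20 market size.
amd_relative = market_growth * share_q3_21           # 1.20 * 0.17 = 0.204
nvidia_relative = market_growth * (1 - share_q3_21)  # 1.20 * 0.83 = 0.996 (treating the rest as Nvidia)

print(f"AMD Q3 21 units    ≈ {amd_relative:.1%} of the Q3 20 market")     # ~20.4%
print(f"Nvidia Q3 21 units ≈ {nvidia_relative:.1%} of the Q3 20 market")  # ~99.6%
```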
 
As someone who has been waiting for these low-end GPUs to replace my aging RX 460, I'm kind of disappointed. The base chip is too small, which means that they are probably clocked as high as possible to have good performance. This also means that the efficiency will be worse than expected for such a small chip (the 6500 XT is rumored to require a power connector). It's now also clear that these are probably low-end, mobile-first GPUs, because otherwise they would have something like 20 CUs, a 96-bit bus and 6 GB RAM, and would fit right between the 5500 XT and 5600 XT - IMO, a much better fit for desktop.

The good thing is that they might reduce the pressure on the GPU market (if the price is right) and push prices down, as AMD will be able to make a ton of these and they won't be interesting to miners ...
 
As someone who has been waiting for these low-end GPUs to replace my aging RX 460, I'm kind of disappointed. The base chip is too small, which means that they are probably clocked as high as possible to have good performance. This also means that the efficiency will be worse than expected for such a small chip (the 6500 XT is rumored to require a power connector). It's now also clear that these are probably low-end, mobile-first GPUs, because otherwise they would have something like 20 CUs, a 96-bit bus and 6 GB RAM, and would fit right between the 5500 XT and 5600 XT - IMO, a much better fit for desktop.

The good thing is that they might reduce the pressure on the GPU market (if the price is right) and push prices down, as AMD will be able to make a ton of these and they won't be interesting to miners ...
Given that the 28 CU 6600 does 2.44 GHz at a 132 W power limit while being the most efficient GPU TPU has tested, and that RDNA2 doesn't clock much above 2.7 GHz without LN2 cooling regardless of the specific chip, I struggle to see this being a problem. Even if they push clock speeds to the very limit, it sounds difficult to make the 6500 XT (assuming a fully enabled die) with 16 CUs and a 64-bit memory interface consume that much power. My guess would be stock SKUs at 75 W, slot-powered, and OC SKUs with a (mostly unnecessary) 6-pin. They might of course set a marginally higher power requirement than 75 W for easier chip binning or something, but IMO that would be an exceedingly dumb choice, given that it would hinder the design from being used in a lot of OEM systems.
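
Purely as a back-of-envelope sketch, here's what naive linear scaling of the 6600's 132 W budget looks like. The core/memory/overhead split is an assumption I'm making for illustration, not measured data:

```python
# Naively scale the RX 6600's board power down to a hypothetical 16 CU / 64-bit part.
# The split between GPU core, memory and board overhead is assumed, not measured.
total_board_power = 132.0  # W, RX 6600 power limit per TPU's review
assumed_core_share = 0.70  # assumed fraction spent in the 28 CU GPU core
assumed_mem_share = 0.20   # assumed fraction for the 128-bit GDDR6 interface
assumed_overhead = 0.10    # fan, VRM losses, misc.

core_w = total_board_power * assumed_core_share * (16 / 28)  # scale by CU count
mem_w = total_board_power * assumed_mem_share * (64 / 128)   # scale by bus width
other_w = total_board_power * assumed_overhead

print(f"Naive estimate for a 16 CU / 64-bit card: ~{core_w + mem_w + other_w:.0f} W")  # ~79 W
```

That lands right around the slot-power mark, which fits the guess of 75 W stock SKUs plus OC models carrying a mostly decorative 6-pin.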
 
But why....
That is because RX 5000 series cards are exceptional miners. The 6500 XT would not cost as much, I'm guessing.
 
It's nearly 2022. Shouldn't the lower end ones be at least 6GB? When I used to game at 1080p on my RX 580 4GB, lots of modern, detailed games started to hit the 4GB limit and I had to keep textures on medium or high instead of ultra to compensate.
 
It's nearly 2022. Shouldn't the lower end ones be at least 6GB? When I used to game at 1080p on my RX 580 4GB, lots of modern, detailed games started to hit the 4GB limit and I had to keep textures on medium or high instead of ultra to compensate.
Were you actually seeing stuttering and performance loss from exceeding VRAM, or were you just reaching the limit? People have a lot of misconceptions about how much VRAM games actually need, and miss the part about most games aggressively and opportunistically preloading assets in case they might be needed in the future, thus filling up VRAM with data that is often never used before it is ejected in favor of other data (that is mostly never used either). Once games start utilizing DirectStorage and being designed for SSDs rather than HDDs we can essentially leave this issue behind for good.

These GPUs, just like 4GB RX 570s and 580s, are far more likely to be limited by their compute capabilities than their VRAM in real life. There are always exceptions, but most of those exceptions are due to poor coding rather than the RAM actually being needed in a strict sense.
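
To illustrate the allocated-versus-needed distinction, here's a toy sketch of opportunistic preloading with LRU eviction. It's a made-up model, not how any particular engine works:

```python
from collections import OrderedDict

class TextureCache:
    """Toy model of a game opportunistically filling VRAM with assets."""

    def __init__(self, vram_budget_mb):
        self.budget = vram_budget_mb
        self.used = 0
        self.cache = OrderedDict()  # asset name -> size in MB

    def load(self, name, size_mb):
        if name in self.cache:
            self.cache.move_to_end(name)  # mark as recently used
            return
        # Evict least recently used assets until the new one fits.
        while self.used + size_mb > self.budget and self.cache:
            _, evicted_size = self.cache.popitem(last=False)
            self.used -= evicted_size
        self.cache[name] = size_mb
        self.used += size_mb

cache = TextureCache(vram_budget_mb=4096)
# Speculatively preload far more than is ever drawn; the "VRAM usage" readout
# sits at the budget either way, without that memory being strictly needed.
for i in range(100):
    cache.load(f"texture_{i}", 64)
print(f"Reported VRAM usage: {cache.used} MB of {cache.budget} MB")  # 4096 of 4096
```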
 
It's X58, so it's a PCIe 2.0 platform. Well, I've been a budget gamer more or less the whole time I've been into PCs, so it's not that big a deal.

But on the topic itself, a 6500 XT sounds kinda interesting for my second rig, if it's supported on such an old platform..
IIRC AMD tends to only wire up a physical x8 connection on their lower-end GPUs. I doubt something like a 6400 would be limited by it, but it's something to look out for when running it in an x8 PCIe 2.0 slot.
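For reference, a rough look at what an x8 link gives you per PCIe generation (standard per-lane rates; the x8 width is the assumption being discussed here):

```python
# Approximate usable bandwidth per lane, in GB/s, for each PCIe generation.
per_lane_gb_s = {"PCIe 2.0": 0.5, "PCIe 3.0": 0.985, "PCIe 4.0": 1.969}
lanes = 8  # low-end Radeons often only wire up x8 physically

for gen, per_lane in per_lane_gb_s.items():
    print(f"{gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")
# PCIe 2.0 x8 gives ~4 GB/s, a quarter of what the card would get in a 4.0 x8 slot.
```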
 
So the card is right here? If so, perfect card for my wife, thanks.
 
Were you actually seeing stuttering and performance loss from exceeding VRAM, or were you just reaching the limit? People have a lot of misconceptions about how much VRAM games actually need, and miss the part about most games aggressively and opportunistically preloading assets in case they might be needed in the future, thus filling up VRAM with data that is often never used before it is ejected in favor of other data (that is mostly never used either). Once games start utilizing DirectStorage and being designed for SSDs rather than HDDs we can essentially leave this issue behind for good.

These GPUs, just like 4GB RX 570s and 580s, are far more likely to be limited by their compute capabilities than their VRAM in real life. There are always exceptions, but most of those exceptions are due to poor coding rather than the RAM actually being needed in a strict sense.
Stuttering in Crackdown 3 till I turned down textures. Various games I've played sit at about 3.5-3.8 GB, but when played on my 3080 they use about 7-8 GB.
 
I was really expecting the 6500 XT to have a 96-bit bus with 6 GB VRAM and around 1500 cores, which would be close to a 1660 Ti. Instead, we are getting a 6600 XT cut in half with a pathetic 4 GB of VRAM, and a card no better than an RX 480 4GB / 1650 Super.
 
Would be pretty cool in M.2 form factor and CF capable.
At that point, the power limit would completely castrate performance.

The card likely uses ~50 W (based on the dual-slot cooler plus active fan), while the M.2 connector can only supply 7 W. Then you would have to add the overhead of transferring the video output over PCIe.

You really need all that free PCB space plus an active heatsink to dissipate the power (which is why the standard is physically capped at 7 W!)
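Just to put numbers on that (the ~50 W card estimate and the 7 W slot cap are the figures from this post, so treat them as assumptions):

```python
# Rough power-budget comparison for an M.2 form factor GPU.
card_power_w = 50.0  # assumed board power for a card like this
m2_budget_w = 7.0    # what the M.2 slot is said to supply

# GPUs scale sub-linearly with power, but even generously assuming performance
# tracks power roughly linearly near the bottom of the curve:
naive_perf_fraction = m2_budget_w / card_power_w
print(f"A 7 W cap leaves at best ~{naive_perf_fraction:.0%} of the card's performance.")  # ~14%
```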
 
Stuttering in Crackdown 3 till I turned down textures. Various games I've played sit at about 3.5-3.8 GB, but when played on my 3080 they use about 7-8 GB.
Sounds like Crackdown was actually hitting a limit then (not surprising given the kind of game it is), but those other examples are exactly what I was talking about. The fact that memory usage doubled on the higher VRAM card just shows that they were making use of whatever was available for preloading assets, up until some limit. If it's hovering near full VRAM utilization but performance is steady, then that's a surefire sign that all that VRAM isn't actually in active use.
Would be pretty cool in M.2 form factor and CF capable.
I've had some ideas about M.2 GPUs before (I think I discussed them on this forum?), and it would be really cool, but there are many technical challenges involved - from fitting the die, VRAM and VRM hardware on the board, to adding a power connector, to making an M.2 AIC with a PCB thick enough for all those traces. I would love this if it came true, but this definitely isn't the GPU for it. HBM would be pretty much a necessity for something like that.

I was really expecting the 6500 XT to have a 96-bit bus with 6 GB VRAM and around 1500 cores, which would be close to a 1660 Ti. Instead, we are getting a 6600 XT cut in half with a pathetic 4 GB of VRAM, and a card no better than an RX 480 4GB / 1650 Super.
It's highly unlikely that this will perform on the level of a 1650 Super - remember, these things clock very high and are crazy efficient. Effects from a smaller IC and VRAM bus are hard to judge, but as I said above I would expect 1660 to 1660 Super levels of performance. The 1650 Super isn't that much slower than a 1660, but if anything that is just an argument for why a decently binned 75 W RDNA2 chip should be closer to the 1660S - RDNA2 is just that far ahead of Turing in terms of performance/W.
 