
AMD Radeon RX 6400

There's no need to be prejudiced against this video card; it doesn't have to go into a regular PC. I have an SFF system with an external 200 W power supply and a single slot for a video card. Right now there are three solutions in stores that could replace my APU's integrated graphics: the GT 730 (slower than the APU), the GT 1030 (about the same as the APU), and the 6400 (much faster than the APU). Yes, this card is for medium-to-low settings at 1080p, but it's the best I can get. And I will take it.
 
In the lineup, 6600 > 5700, but 6500 < 5500; this is simply wrong.
This should be called a 6300, not a 6400. It performs like an RX 470 from 2016 and costs practically the same: that card was $179, this is only $10 off.
 
all the benchmarks say 6400 xt rather than 6400
Whoops, fixed

The RX 6400 spec table on the first page is wrong. The core count should be 768 (3/4 of the RX 6500 XT).
Fixed

I assume the 1650 being compared here is the obsolete 1650 GDDR5, not the faster 1650 GDDR6, which is one of the best-selling GPUs, at least in my market?
Yeah it's the GDDR5 version. Not obsolete, both are actively shipping at this time. GDDR6 adds a few percent: https://www.techpowerup.com/review/gigabyte-geforce-gtx-1650-oc-gddr6/27.html

Also, aren't locked-down OC controls the norm for slot-powered cards? I seem to remember that being pretty normal as a safeguard to avoid burning out the 12V traces in your motherboard.
Maybe for AMD, which is still a lame excuse given all the safeguards we have
https://www.techpowerup.com/review/nvidia-rtx-a2000/37.html
https://www.techpowerup.com/review/palit-geforce-gtx-1050-ti-kalmx/33.html

Any chance of a modded 6500XT BIOS being able to be flashed onto the 6400s?
You make that sound so easy :) AFAIK the BIOS can't be modded anymore due to the digital signature. Soft PP tables might be an option though
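For anyone wondering what "soft PP tables" means in practice: instead of flashing a modified vBIOS, you override the driver's in-memory copy of the PowerPlay table (on Windows typically via a registry override, on Linux via the amdgpu sysfs files). Below is a minimal sketch of the inspection side on Linux only; it assumes the GPU is card0 and that your kernel actually exposes these files, which depends on the kernel version and the amdgpu.ppfeaturemask setting.

```python
# Sketch only: dump the driver's active PowerPlay table for inspection and
# print the OverDrive ranges. Assumes Linux with the amdgpu driver and that
# the GPU is card0; writing a modified table back is kernel-dependent and
# not shown here.
from pathlib import Path

DEV = Path("/sys/class/drm/card0/device")  # adjust the card index if needed

def dump_pp_table(out: str = "pp_table.bin") -> None:
    """Save the active (driver-side) PowerPlay table to a local file."""
    data = (DEV / "pp_table").read_bytes()
    Path(out).write_bytes(data)
    print(f"Saved {len(data)} bytes of PowerPlay table to {out}")

def show_overdrive() -> None:
    """Print the clock/voltage OverDrive ranges the driver currently allows."""
    print((DEV / "pp_od_clk_voltage").read_text())

if __name__ == "__main__":
    dump_pp_table()
    show_overdrive()
```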
 
It's the price the RX570 launched at five years ago this week (let's ignore the fact you could get new RX570 cards for $99 after the first ETH crash).

It occasionally matches the RX570 but most of the time barely manages half the performance :\
 
At the very least, you should show at which PCIe bandwidth setting the performance tanks considerably. Please do it.

I mean, there should be an educational, tutorial-style review that shows potential buyers why not to buy this low-performing card.

(Attachments 245007 and 245008)
Counterargument: those games were tested at highest quality settings. Those are irrelevant for lower end cards. 6500 XT indeed runs Cyberbug reasonably okay at sane settings.
 
those games were tested at highest quality settings. Those are irrelevant for lower end cards
The idea is to have a valid comparison with other cards. No doubt, you can get 60 FPS at lowest settings with upscaling from 480p and it'll look worse than XB1
 
Also, aren't locked-down OC controls the norm for slot-powered cards? I seem to remember that being pretty normal as a safeguard to avoid burning out the 12V traces in your motherboard.
Totally not normal. There's nothing to guard the card from, since it can self-adjust frequency/voltage if needed; that's handled by the TDP value in the vBIOS. It wasn't normal in the past either.

The idea is to have a valid comparison with other cards. No doubt, you can get 60 FPS at lowest settings with upscaling from 480p and it'll look worse than XB1
The 6500 XT runs Cyberbug well at 1080p low or medium without that stupid FSR. The 6400 runs it at low at ~40 fps. I understand that you collect data for a fair comparison, but it's really useless for anyone looking to actually buy a card like this. That's like expecting to run games at ultra on a GT 730; that's just not what the target audience does with these cards. Since ultra settings are notorious for hammering performance for no good reason, why not collect data with high settings instead?
 
why not collect data with high settings instead?
Because people demand highest settings in reviews for pretty much all cards. Also faster cards like 3080+ will end up CPU limited otherwise

I agree that if I had an army of benchmark slaves I would have retested all cards on lower settings, which takes about two weeks, 10 hours a day. Just not practical for a review like this
 
Because people demand highest settings in reviews for pretty much all cards. Also faster cards like 3080+ will end up CPU limited otherwise
I don't think that's true. That could also be a great opportunity to avoid corporate sabotage like Gameworks that made Radeons perform a lot worse than they should.
 
That could also be a great opportunity to avoid corporate sabotage like Gameworks that made Radeons perform a lot worse than they should.
It's called ray tracing now
 
W1zz needs to add a GT1030 to the list to compare this card too :)
the 1030 is a bit slower than the 550 2GB, which is in the list.

AFAIK, the 1030 sells much better, so it would be a slightly more useful comparison, but we can make a good guess: 'just a bit lower than the 550'.
 
It's called ray tracing now
Didn't work out well for anyone involved in it though. I still remember 2080 Ti struggling at 1080p.
 
Because people demand highest settings in reviews for pretty much all cards. Also faster cards like 3080+ will end up CPU limited otherwise

I agree that if I had an army of benchmark slaves I would have retested all cards on lower settings, which takes about two weeks, 10 hours a day. Just not practical for a review like this

How do you come up with all those numbers? I mean new patches, new drivers, etc., presumably you don't actually retest the old cards for each review? But OTOH I guess 3 or 4 years down the line the position has often changed by 10% for a given card due to updates, so?
 
W1zz needs to add a GT1030 to the list to compare this card too :)
I have a GT1030, and tried to find it for this review, but no luck

edit: omg found it

finishing pcie 3.0 run first, then running gt1030

How do you come up with all those numbers? I mean new patches, new drivers, etc., presumably you don't actually retest the old cards for each review? But OTOH I guess 3 or 4 years down the line the position has often changed by 10% for a given card due to updates, so?
I retest everything every few months and keep drivers/games/patches constant until the next retest. Last retest was done mid-March
 
I do like the reviews here, but in this case I think the title is a bit misleading. Yea, you are reviewing the AMD RX 6400, but you are specifically reviewing the MSI AERO model, and this should be in the title.

It's nothing major; the card is still crap for me anyway, simply because of the lack of de/encode for some codecs, but I would like to see the card name in the review title (link).
 
aren't locked-down OC controls the norm for slot-powered cards?
Last time I tried on my RX 460, all the options were available in the drivers (though granted that was a couple of years ago). I used them to reduce power further. If that's not available now, it's a pity, though the 6400 is more efficient than the 460 out of the box.
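On Linux there's a rough equivalent of that driver-side power reduction: the amdgpu hwmon interface exposes a writable power cap. A minimal sketch follows, assuming the card is card0, root privileges, and a kernel that exposes power1_cap (values are in microwatts); this is not the Windows driver panel referred to above, just the analogous knob.

```python
# Sketch only: lower the GPU power cap via the amdgpu hwmon interface.
# Assumes the card is card0 and that the driver exposes power1_cap; run as root.
from pathlib import Path

def find_power_cap() -> Path:
    """Locate the power1_cap file for card0's hwmon instance."""
    hwmon_dir = Path("/sys/class/drm/card0/device/hwmon")
    for hwmon in hwmon_dir.iterdir():
        cap = hwmon / "power1_cap"
        if cap.exists():
            return cap
    raise FileNotFoundError("no power1_cap found; the driver may not expose it")

def set_power_cap(watts: float) -> None:
    """Set the power cap in watts, clamped to the driver's maximum."""
    cap = find_power_cap()
    max_uw = int((cap.parent / "power1_cap_max").read_text())
    new_uw = min(int(watts * 1_000_000), max_uw)  # power1_cap is in microwatts
    cap.write_text(str(new_uw))
    print(f"Power cap set to {new_uw / 1_000_000:.0f} W")

if __name__ == "__main__":
    set_power_cap(40)  # example value only: cap the card at 40 W
```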
 
I do like the reviews here, but in this case I think the title is a bit misleading. Yea, you are reviewing the AMD RX 6400, but you are specifically reviewing the MSI AERO model, and this should be in the title.

All the cards are the same though... You can overclock the 6500 XT, and OC potential is a feature that differentiates the brands.

This one you cannot.

The default clocks are also identical across brands.

Since you can overclock the 6500 XT for about 5% more performance, that makes this card relatively weaker.
 
All the cards are the same though... You can overclock the 6500 XT, and OC potential is a feature that differentiates the brands.

This one you cannot.

The default clocks are also identical across brands.

Since you can overclock the 6500 XT for about 5% more performance, that makes this card relatively weaker.
Well yes, but not all fans behave the same, and a fast reader might get the impression that the fan overshoots on all 6400 models ;)
 
This is pretty much a 750 Ti/1050 Ti-class GPU, except that the Maxwell GPU was released over 8 years ago and the Pascal one over 5 years ago! According to the TPU charts it's still not 2x as fast as the 750 Ti and barely faster than the 1050 Ti. I think AMD should really do better after so many years, especially in this segment. This belongs at GT 1030 level right now and shouldn't cost a penny above 100 USD; the segment it's been released into is horrendously overpriced. Granted, Nvidia also hasn't released anything of that kind, like a 3050 Ti without a power connector, but the point remains!
What point? This is a 1650-equivalent card with the same amount of VRAM and the same VRAM bandwidth. The 1650 does it with 128-bit GDDR5, the 6400 with 64-bit GDDR6. It's just that low-profile 1650s go for £250-300 on eBay while the 6400 costs £160 new. What's not to like?
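For what it's worth, the "same bandwidth from half the bus width" point checks out with the usual memory data rates for these cards (8 Gbps GDDR5 on the GDDR5 1650, 16 Gbps GDDR6 on the 6400); a quick back-of-the-envelope check:

```python
# Back-of-the-envelope memory bandwidth comparison, using the typical
# effective data rates (8 Gbps GDDR5 on the GTX 1650, 16 Gbps GDDR6 on
# the RX 6400).
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print("GTX 1650 (128-bit GDDR5 @ 8 Gbps):", bandwidth_gb_s(128, 8.0), "GB/s")
print("RX 6400  ( 64-bit GDDR6 @ 16 Gbps):", bandwidth_gb_s(64, 16.0), "GB/s")
# Both come out to 128 GB/s: the narrower bus is offset by the doubled data rate.
```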

I have the same use case in mind. I already wanted HDMI 2.1 a year ago and bought an RTX 3060.
Why did you buy the Sapphire one?
I'm thinking about the PowerColor RX 6400. That one has a 0 dB fan stop.
The Sapphire one seems to have a longer cooler and the spec sheet mentions 55 W TBP instead of 53. It might not matter in normal usage, but it costs the same as any other model, so I thought why not. :)
 
the 1030 is a bit slower than the 550 2GB, which is in the list.

AFAIK, the 1030 sells much better, so it would be a slightly more useful comparison, but we can make a good guess: 'just a bit lower than the 550'.
No worries, I have a 1030, and my 6400 arrives on Saturday, I'll make sure to do a comparison. ;) ... on pci-e gen 3!

The 1650 has many more and better features (NVENC/NVDEC, x16 bus, 3 display outputs, OC support) than this though; it's leagues above the 6400.
The only relevant thing this card is missing compared to the 1650 is the VP1 decoder. The x4 bus is what it is, and I don't believe the target audience gives a damn about the encoder (at least I don't). Nobody wants to see gameplay streams of Cyberpunk 2077 at low quality settings and/or 10 fps.
 
You noticed wrongly; the 6400 has only 12 CUs, compared to the 6500 XT's 16.

Still, they're the same chip, so unlocking one to the other could be technically possible when the lower end SKU uses good silicon. Big 'if', though.

AMD has been lasering off cut-down cards for years now (the last one was the RX 560).
 
The only relevant thing this card is missing compared to the 1650 is the VP1 decoder. The x4 bus is what it is, and I don't believe the target audience gives a damn about the encoder (at least I don't). Nobody wants to see gameplay streams of Cyberpunk 2077 at low quality settings and/or 10 fps.
But they absolutely should, because it means the card doesn't decode YouTube, and YouTube on the CPU is rough. IMO it fails as a display adapter, but it can run Cyberbug at 1080p low at ~40 fps. If you wanted to record some older game, you can't with the 6400; for that the RX 550 works better, because it can record. It's also useless to add to older computers struggling with YouTube or other services. Overall, it managed to alienate the audience it was intended for and only somewhat pleased the gamers, who won't buy it. It would have been more acceptable if it had the decoding capabilities but only the power of a GT 1030 DDR4. The irony is that the GTX 1050 Ti, which matches the RX 6400 in performance, sells for a bit less money and does more. And considering that the Chinese manage to put laptop GPUs without such downsides onto PCBs or MXM cards, it just shows what a blatant cash grab the RX 6400 is.
 
As I've said elsewhere the MSRP of this is about $40-50 too high, but other than that I don't see what people are complaining about.
Lack of generational upgrade? This thing performs worse than an RX 570 released in 2017 for a similar MSRP ($170).
I know, although I said early design stages, not the post-2020 era, unless the AMD team was somehow forecasting the inflation, which contrasts with what the RX 6800 XT/6800 pricing strategy indicates.
You say reputation was the only casualty, but that isn't a negligible casualty; they made a wrong judgment call IMO.
They should have forecast production to cover the mobile contracts only, or launched with competitive desktop pricing. The RX 6400 is launching now and it doesn't seem acceptable, in the same way the RX 6500 XT wasn't acceptable, which forced AMD partners to sell at SRP or below in some cases in Europe while, at the same time, they were selling the 6800 XT/6800 at +50% over SRP (and Nvidia solutions also at +50% in the ASUS/MSI/GB case). So no, I don't think partners gained anything from this pricing strategy; only AMD had some financial gains, while partners lost potential margins and reputation (the GB three-fan 6500 XT design, lol).
AMD doesn't seem to care that much about reputation, just look at the Ryzen 5 4500. But yeah, to be fair you might be correct on the partners gaining nothing (or perhaps relatively little) from this pricing strategy.
The only relevant thing this card is missing compared to the 1650 is the VP1 decoder. The x4 bus is what it is, and I don't believe the target audience gives a damn about the encoder (at least I don't). Nobody wants to see gameplay streams of Cyberpunk 2077 at low quality settings and/or 10 fps.
A lot of people stream eSports games, those have light graphical requirements and will work quite decently even on this thing when coupled with an adequate CPU.
 
Lack of generational upgrade? This thing performs worse than an RX 570 released in 2017 for a similar MSRP ($170).
... but a 40-tier card isn't supposed to be a generational upgrade over a 70-tier card (even if AMD's naming back then was dumb and the 580 was more like a 60-tier in reality, with the 570 a tad below that but still too powerful to fit its contemporary 50-tier). Polaris also delivered ridiculous value even for its time. As I've said time and time again, the pricing is silly, but the performance for what this is trying to be is fine. This is an entry-level card with good entry-level performance. What makes it problematic is that it costs $160 when it should be more like $120, which would make it fit with cards like the $109 (~$130 after inflation) 2016 GTX 1050. There's also the crazy increase in materials and shipping costs over the past few years. In a saner world, this would be $120 with the 6500 XT at $160-ish, but that's sadly not the world we're living in.
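As a rough sanity check on that ~$130 figure, here is the arithmetic using an assumed ~19% cumulative US CPI inflation between late 2016 and early 2022 (an approximation, not an exact CPI lookup):

```python
# Rough inflation adjustment for the GTX 1050's $109 launch price (Oct 2016),
# assuming ~19% cumulative US CPI inflation from late 2016 to early 2022.
# The exact factor depends on which CPI months you compare.
launch_price_2016 = 109
cumulative_inflation = 0.19  # assumed approximate factor

adjusted_2022 = launch_price_2016 * (1 + cumulative_inflation)
print(f"${launch_price_2016} in 2016 is roughly ${adjusted_2022:.0f} in 2022 dollars")
# -> about $130, matching the figure quoted above.
```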
A lot of people stream eSports games, those have light graphical requirements and will work quite decently even on this thing when coupled with an adequate CPU.
If they have any non-F Intel CPU or any AMD APU, they already have hardware accelerated encoding support though. And if not, then, well, this GPU isn't for them. And quite frankly that's fine.
 