
AMD Radeon RX 6400

Joined
Feb 25, 2018
Messages
4 (0.00/day)
There is no need to be prejudiced against this video card. You don't need to put it in a regular PC. I have an SFF system with an external power supply (200 W) and one slot for a video card. Right now there are three options in stores that would work for me as a replacement for the APU's video core: the GT 730 (slower than the APU), the GT 1030 (about the same as the APU) and the 6400 (much faster than the APU). Yes, this card is for medium-to-low settings at 1080p, but it's the best I can get, and I will take it.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
In the lineup the 6600 is above the 5700, but the 6500 is below the 5500; this is simply wrong.
This should not be a 6400 but a 6300 instead. It's an RX 470 from 2016, and it costs about the same ($179), only $10 off.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,857 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
all the benchmarks say 6400 xt rather than 6400
Whoops, fixed

The RX 6400 spec table on the first page is wrong. The core count should be 768 (3/4 of the RX 6500 XT).
Fixed

I assume the 1650 being compared to is the obsolete 1650 GDDR5, not the faster 1650 GDDR6, which is one of the best-selling GPUs, at least in my market?
Yeah it's the GDDR5 version. Not obsolete, both are actively shipping at this time. GDDR6 adds a few percent: https://www.techpowerup.com/review/gigabyte-geforce-gtx-1650-oc-gddr6/27.html

Also, aren't locked-down OC controls the norm for slot-powered cards? I seem to remember that being pretty normal as a safeguard to avoid burning out the 12V traces in your motherboard.
Maybe for AMD, which is still a lame excuse given all the safeguards we have
https://www.techpowerup.com/review/nvidia-rtx-a2000/37.html
https://www.techpowerup.com/review/palit-geforce-gtx-1050-ti-kalmx/33.html

Any chance of a modded 6500XT BIOS being able to be flashed onto the 6400s?
You make that sound so easy :) AFAIK the BIOS can't be modded anymore due to the digital signature. Soft PP tables might be an option though
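For anyone wondering what "soft PP tables" means in practice: on Windows the Radeon driver can pick up a PowerPlay table from the registry instead of the one baked into the VBIOS, which is how tools like MorePowerTool apply their tweaks without touching the signed BIOS. Below is a minimal Python sketch of that mechanism, assuming the commonly reported registry location; the adapter subkey ("0000") and the table blob are placeholders you would have to supply yourself, and editing this key is entirely at your own risk.

```python
# Hedged sketch: applying a "soft PowerPlay table" on Windows.
# The AMD driver reads the PP_PhmSoftPowerPlayTable (REG_BINARY) value from the
# display adapter's driver key, if present, instead of the VBIOS PowerPlay table.
# The class GUID below is the standard display-adapter class; the "0000" subkey
# index depends on your system and must be matched to the Radeon card.
import winreg

CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
             r"\{4d36e968-e325-11ce-bfc1-08002be10318}")
ADAPTER_SUBKEY = "0000"  # adjust to whichever subkey belongs to the RX 6400

def write_soft_pp_table(blob: bytes, subkey: str = ADAPTER_SUBKEY) -> None:
    """Write a modified PowerPlay table blob for the given adapter (admin required)."""
    path = f"{CLASS_KEY}\\{subkey}"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "PP_PhmSoftPowerPlayTable", 0, winreg.REG_BINARY, blob)

if __name__ == "__main__":
    # Example: load a table exported and edited with an external tool,
    # then reboot (or restart the display driver) so the driver re-reads it.
    with open("modified_pp_table.bin", "rb") as f:
        write_soft_pp_table(f.read())
```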
 
Last edited:
Joined
Feb 20, 2019
Messages
8,299 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
It's the price the RX570 launched at five years ago this week (let's ignore the fact you could get new RX570 cards for $99 after the first ETH crash).

It occasionally matches the RX570 but most of the time barely manages half the performance :\
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
At the very least, please show at which PCIe bandwidth setting the performance tanks considerably. Please do it.

I mean, there should be an educational, tutorial-style review that shows potential buyers why not to buy this low-performing card.

[two benchmark screenshots attached]
Counterargument: those games were tested at highest quality settings. Those are irrelevant for lower end cards. 6500 XT indeed runs Cyberbug reasonably okay at sane settings.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,857 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
those games were tested at highest quality settings. Those are irrelevant for lower end cards
The idea is to have a valid comparison with other cards. No doubt, you can get 60 FPS at lowest settings with upscaling from 480p and it'll look worse than XB1
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Also, aren't locked-down OC controls the norm for slot-powered cards? I seem to remember that being pretty normal as a safeguard to avoid burning out the 12V traces in your motherboard.
Totally not normal. There's nothing to guard the card from, as it can adjust its own frequency/voltage if needed; that's handled by the TDP value in the vBIOS. It wasn't normal in the past either.

The idea is to have a valid comparison with other cards. No doubt, you can get 60 FPS at lowest settings with upscaling from 480p and it'll look worse than XB1
The 6500 XT runs Cyberbug well at 1080p low or medium without that stupid FSR. The 6400 runs it at low at ~40 fps. I understand that you collect data for a fair comparison, but it's really useless for anyone looking to actually buy a card like this. That's like expecting to run games at ultra on a GT 730; that's just not what the target audience does with these cards. Since ultra settings are notorious for hammering performance for no good reason, why not collect data with high settings instead?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,857 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
why not collect data with high settings instead?
Because people demand highest settings in reviews for pretty much all cards. Also faster cards like 3080+ will end up CPU limited otherwise

I agree that if I had an army of benchmark slaves I would have retested all cards on lower settings, which takes about two weeks, 10 hours a day. Just not practical for a review like this
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Because people demand highest settings in reviews for pretty much all cards. Also faster cards like 3080+ will end up CPU limited otherwise
I don't think that's true. That could also be a great opportunity to avoid corporate sabotage like GameWorks, which made Radeons perform a lot worse than they should have.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,857 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
That could also be a great opportunity to avoid corporate sabotage like GameWorks, which made Radeons perform a lot worse than they should have.
It's called ray tracing now
 
Joined
Nov 17, 2016
Messages
152 (0.05/day)
W1zz needs to add a GT1030 to the list to compare this card to :)
The 1030 is a bit slower than the 550 2GB, which is in the list.

AFAIK the 1030 sells much better, so it would be a slightly more useful comparison, but we can make a good guess: "just a bit lower than the 550".
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
It's called ray tracing now
Didn't work out well for anyone involved in it, though. I still remember the 2080 Ti struggling at 1080p.
 
Joined
Nov 17, 2016
Messages
152 (0.05/day)
Because people demand highest settings in reviews for pretty much all cards. Also faster cards like 3080+ will end up CPU limited otherwise

I agree that if I had an army of benchmark slaves I would have retested all cards on lower settings, which takes about two weeks, 10 hours a day. Just not practical for a review like this

How do you come up with all those numbers? I mean new patches, new drivers, etc., presumably you don't actually retest the old cards for each review? But OTOH I guess 3 or 4 years down the line the position has often changed by 10% for a given card due to updates, so?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,857 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
W1zz needs to add a GT1030 to the list to compare this card to :)
I have a GT1030, and tried to find it for this review, but no luck

edit: omg found it
(photo of the card attached)

finishing pcie 3.0 run first, then running gt1030

How do you come up with all those numbers? I mean new patches, new drivers, etc., presumably you don't actually retest the old cards for each review? But OTOH I guess 3 or 4 years down the line the position has often changed by 10% for a given card due to updates, so?
I retest everything every few months and keep drivers/games/patches constant until the next retest. Last retest was done mid-March
 
Last edited:
Joined
Mar 1, 2021
Messages
490 (0.36/day)
Location
Germany
System Name Homebase
Processor Ryzen 5 5600
Motherboard Gigabyte Aorus X570S UD
Cooling Scythe Mugen 5 RGB
Memory 2*16 Kingston Fury DDR4-3600 double ranked
Video Card(s) AMD Radeon RX 6800 16 GB
Storage 1*512 WD Red SN700, 1*2TB Crucial P5, 1*2TB Sandisk Plus (TLC), 1*14TB Toshiba MG
Display(s) Philips E-line 275E1S
Case Fractal Design Torrent Compact
Power Supply Corsair RM850 2019
Mouse Sharkoon Sharkforce Pro
Keyboard Fujitsu KB955
I do like the reviews here, but in this case I think the title is a bit misleading. Yeah, you are reviewing the AMD RX 6400, but you are specifically reviewing the MSI AERO model, and this should be in the title.

It is nothing major; the card is still crap for me, simply because of the lack of decode/encode support for some codecs, but I would like to see the card name in the review title (link).
 
Joined
Apr 6, 2010
Messages
19 (0.00/day)
aren't locked-down OC controls the norm for slot-powered cards?
Last time I tried on my RX 460, all the options were available in the drivers (though granted that was a couple of years ago). I used them to reduce power further. If that's not available now, it's a pity, though the 6400 is more efficient than the 460 out of the box.
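For the record, that kind of power reduction is still doable outside the Windows driver UI, at least on Linux, through the amdgpu sysfs interface. A minimal sketch, assuming a single GPU exposed as card0 and a kernel that provides the standard hwmon power1_cap file (value in microwatts, root required); the 40 W figure is just an illustrative value, not a recommendation for any particular card.

```python
# Minimal sketch: lowering a Radeon card's power limit via the amdgpu sysfs
# interface on Linux. Assumes the GPU is card0 and that a hwmon "power1_cap"
# file exists (value in microwatts). Run as root and adjust paths as needed.
import glob

def set_power_cap_watts(watts: float, card: str = "card0") -> None:
    hwmon_dirs = glob.glob(f"/sys/class/drm/{card}/device/hwmon/hwmon*")
    if not hwmon_dirs:
        raise FileNotFoundError(f"no hwmon directory found for {card}")
    with open(hwmon_dirs[0] + "/power1_cap", "w") as f:
        f.write(str(int(watts * 1_000_000)))  # watts -> microwatts

if __name__ == "__main__":
    set_power_cap_watts(40)  # e.g. cap a slot-powered card at 40 W
```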
 
Joined
Nov 17, 2016
Messages
152 (0.05/day)
I do like the reviews here, but in this case I think the title is a bit misleading. Yeah, you are reviewing the AMD RX 6400, but you are specifically reviewing the MSI AERO model, and this should be in the title.

All the cards are the same though... You can overclock the 6500 XT, and that is a differentiating feature between brands in terms of OC potential.

This one you cannot.

The default clocks are also identical across brands.

Since you can overclock the 6500 XT to add about 5% more performance, that makes this card relatively weaker.
 
Joined
Mar 1, 2021
Messages
490 (0.36/day)
Location
Germany
System Name Homebase
Processor Ryzen 5 5600
Motherboard Gigabyte Aorus X570S UD
Cooling Scythe Mugen 5 RGB
Memory 2*16 Kingston Fury DDR4-3600 double ranked
Video Card(s) AMD Radeon RX 6800 16 GB
Storage 1*512 WD Red SN700, 1*2TB Crucial P5, 1*2TB Sandisk Plus (TLC), 1*14TB Toshiba MG
Display(s) Philips E-line 275E1S
Case Fractal Design Torrent Compact
Power Supply Corsair RM850 2019
Mouse Sharkoon Sharkforce Pro
Keyboard Fujitsu KB955
All the cards are the same though... You can overclock the 6500 XT, and that is a differentiating feature between brands in terms of OC potential.

This one you cannot.

The default clocks are also identical across brands.

Since you can overclock the 6500 XT to add about 5% more performance, that makes this card relatively weaker.
Well, yes, but not all fans behave the same, and a quick reader might take away that the fan overshoots on all 6400 models ;)
 
Joined
Jan 14, 2019
Messages
12,353 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
This is pretty much a 750 Ti/1050 Ti class GPU, except that the Maxwell GPU was released over 8 years back and the Pascal one over 5 years back! According to TPU charts it's still not 2x as fast as the 750 Ti and barely faster than the 1050 Ti; I think AMD should really do better after so many years, especially in this segment. This belongs at GT 1030 level right now and shouldn't cost a penny above 100 USD; the segment it's been released into right now is horrendously overpriced. Granted, Nvidia also hasn't released anything of that kind, like a 3050 Ti without a power connector, but the point remains!
What point? This is a 1650-equivalent card with the same amount of VRAM and the same VRAM bandwidth. The 1650 does it with 128-bit GDDR5, the 6400 with 64-bit GDDR6. It's just that low-profile 1650s go for £250-300 on eBay while the 6400 costs £160 new. What's not to like?
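For what it's worth, the bandwidth claim checks out on paper: 128-bit GDDR5 at 8 Gbps per pin and 64-bit GDDR6 at 16 Gbps per pin (the spec-sheet memory speeds for the 1650 GDDR5 and the RX 6400) come out to the same peak figure. A quick back-of-the-envelope check:

```python
# Peak memory bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 (bits per byte)
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gb_s(128, 8))   # GTX 1650 GDDR5 -> 128.0 GB/s
print(bandwidth_gb_s(64, 16))   # RX 6400        -> 128.0 GB/s
```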

I have the same use case in mind. I already wanted HDMI 2.1 a year ago and bought an RTX 3060.
Why did you buy the Sapphire one?
I'm thinking about the PowerColor RX 6400. This one has 0 dB fan stop.
The Sapphire one seems to have a longer cooler and the spec sheet mentions 55 W TBP instead of 53. It might not matter in normal usage, but it costs the same as any other model, so I thought why not. :)
 
Joined
Jan 14, 2019
Messages
12,353 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
The 1030 is a bit slower than the 550 2GB, which is in the list.

AFAIK the 1030 sells much better, so it would be a slightly more useful comparison, but we can make a good guess: "just a bit lower than the 550".
No worries, I have a 1030 and my 6400 arrives on Saturday. I'll make sure to do a comparison ;) ... on PCIe gen 3!

the 1650 has many more and better features (NVENC/NVDEC, x16 bus, 3 display outputs, OC support) than this though, it's leagues above the 6400
The only relevant thing this card is missing compared to the 1650 is the AV1 decoder. The x4 bus is what it is, and I don't believe the target audience gives a damn about the encoder (at least I don't). Nobody wants to see gameplay streams of Cyberpunk 2077 at low quality settings and/or 10 fps.
 
Last edited:
Joined
Aug 6, 2020
Messages
729 (0.46/day)
You noticed wrong: the 6400 also has only 12 CUs, compared to the 6500 XT's 16.

Still, they're the same chip, so unlocking one to the other could be technically possible when the lower end SKU uses good silicon. Big 'if', though.

AMD has been lasering off cut-down cards for years now (the last one was the RX 560)
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
The only relevant thing this card is missing compared to the 1650 is the AV1 decoder. The x4 bus is what it is, and I don't believe the target audience gives a damn about the encoder (at least I don't). Nobody wants to see gameplay streams of Cyberpunk 2077 at low quality settings and/or 10 fps.
But they absolutely should, because it means the card can't hardware-decode those YouTube streams, and YouTube on the CPU is rough. IMO it fails as a display adapter, but it can run Cyberbug at 1080p low at ~40 fps. If you wanted to record some older game, you can't with the 6400; for that the RX 550 works better, because it records. It's also useless to add this card to older computers struggling with YouTube or other services. Overall, it managed to alienate the audience it was intended for and only somewhat pleased the gamers, who won't buy it. It would have been more acceptable if it had full decoding capabilities but only GT 1030 DDR4 levels of power. The irony is that the GTX 1050 Ti, which matches the RX 6400 in performance, is selling for a bit less money and does more. And considering that the Chinese manage to put laptop GPUs without such downsides on PCBs or MXM cards, it just shows how blatant a cash grab the RX 6400 is.
 
Joined
Apr 11, 2021
Messages
214 (0.16/day)
As I've said elsewhere the MSRP of this is about $40-50 too high, but other than that I don't see what people are complaining about.
Lack of generational upgrade? This thing performs worse than an RX 570 released in 2017 for a similar MSRP ($170).
I know, although I said early design stages, not the post-2020 era, unless the AMD team was somehow forecasting the inflation, which is in contrast with what the RX 6800 XT/6800 pricing strategy indicates.
You say reputation was the only casualty; that isn't a negligible casualty, and they made a wrong judgment call IMO.
They should have forecast production to cover mobile contracts only, or launched with competitive desktop pricing. The RX 6400 has launched now and it doesn't seem to be acceptable, in the same way the RX 6500 XT wasn't acceptable, forcing AMD partners to sell at SRP or below in some cases in Europe, while at the same time they were selling the 6800 XT/6800 at +50% over SRP (and Nvidia solutions also at +50% in the ASUS/MSI/GB case). So no, I don't think partners gained anything from the pricing strategy; only AMD had some financial gains, while partners lost potential margins and reputation (GB 3-fan 6500 XT design lol).
AMD doesn't seem to care that much about reputation, just look at the Ryzen 5 4500. But yeah, to be fair you might be correct on the partners gaining nothing (or perhaps relatively little) from this pricing strategy.
The only relevant thing this card is missing compared to the 1650 is the AV1 decoder. The x4 bus is what it is, and I don't believe the target audience gives a damn about the encoder (at least I don't). Nobody wants to see gameplay streams of Cyberpunk 2077 at low quality settings and/or 10 fps.
A lot of people stream eSports games, those have light graphical requirements and will work quite decently even on this thing when coupled with an adequate CPU.
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Lack of generational upgrade? This thing performs worse than an RX 570 released in 2017 for a similar MSRP ($170).
... but a 40-tier card isn't supposed to be a generational upgrade on a 70-tier card (even if AMD's naming back then was dumb and the 580 was more like a 60-tier in reality, with the 570 a tad below that but still too powerful to fit its contemporary 50-tier). Polaris also delivered ridiculous value even for its time. As I've said time and time again, the pricing is silly, but performance for what this is trying to be is fine. This is an entry-level card with good entry-level performance. What makes it problematic is that it costs $160 when it should be more like $120, which would put it in line with cards like the $109 (~$130 after inflation) 2016 GTX 1050. There's also the crazy increase in materials and shipping costs over the past few years. In a saner world, this would be $120 with the 6500 XT at $160-ish, but that's sadly not the world we're living in.
A lot of people stream eSports games, those have light graphical requirements and will work quite decently even on this thing when coupled with an adequate CPU.
If they have any non-F Intel CPU or any AMD APU, they already have hardware-accelerated encoding support though. And if not, then, well, this GPU isn't for them. And quite frankly, that's fine.
 