
AMD Radeon RX 6400

Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
OK, market analysis time. I'll keep it about the 6400 so as not to wander off topic:

The most popular model I could find is the ASRock Challenger ITX. Prices for it are in the $200-260 range (standard 20% VAT included), most often around $210. Every shop I checked out claimed to have stock.

Saw several offers for the ASUS Dual going from $265 to $285.

Just for laughs I also got some offers from one of our bigger IT shops and they have:
ASUS Dual for $285
MSI Aero ITX for $390
They also claim they have no stock...

Then, for even bigger laughs, we have our biggest e-tailer with these wonderful offers:
ASRock Challenger ITX for $350
MSI Aero ITX for $435
ASUS Dual for $460
None of these are in stock, with a delivery time of 8 days.

Now, the ~$210 price doesn't sound half bad once you find out that your average 1050 Ti model retails for around $230 or more. The 6400 is probably the most sensible card you can buy new over here right now.

The average net wage here is 1/3 to 1/4 of what you have over there. Unless you have disposable income - yes, it's a f****** expensive hobby.
With those prices, I get where you're coming from. My Sapphire 6400 was 160 GBP (not USD, but whatevs), but I wouldn't have paid more than this.

1. It's 3% SLOWER than the 1650 at 1080p. The 1650 launched 3 years ago at the same $159 price point. That is technically negative price/perf movement over 3 years, an absolute embarrassment. RDNA 2 is much more capable than this.
2. It's still limited to 4GB of VRAM; even at 1080p this is an issue for my much slower RX 560 in certain games. Paying 1650 prices for a GPU with the same limitation is just silly.
3. It's got the same PCIe limitation as the 6500 XT. Many of us are running PCIe Gen 3 systems where the 6400 will lose additional performance, widening the gap with the now 3-year-old 1650.
1. It may be 3% slower, but it is also around 50-80% cheaper than low profile 1650 models at the moment. Launch prices don't matter, as we all know that every single graphics card's price has shot up to the moon in the last 2 years (the 1030 costs 90-100 GBP right now, which is ridiculous). Why should the 6400 be an exception to this?
2. You're paying original 1650 release prices for a current 1650-level card. Don't forget about the previous point. If the 6400 had been released in 2020, it probably would have been an 80 USD card. But it's 2022 now.
3. That, I agree with. The review shows how much of a limitation it is, and I also made a post about it earlier when I paired the card with a Ryzen 3 CPU and single-channel RAM in a PCIe 3.0 system. Based on this, everybody can decide for themselves whether a 6400 is worth it for them or not.
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
The total lockdown of the 6400's clocks is terrible. There's no reason for that; the 6500 XT is unlocked, as are the older RX 550 and 560. AMD is really trying to cover up just how horribly gimped the 6500 is.

Now, if nvidia did this with the 3050 or GT 1030 there'd be screeching from the rooftops. When AMD does it, crickets....
Crickets? There's been quite a bit of complaining about it that I've seen around here. IMO, not allowing OCing on <75W cards is perfectly fine - saves idiots from burning out the power traces on their motherboards. As for how this is an attempt at covering up how the 6500 XT is "gimped", you'll have to explain that one. Gimped = artificially held back somehow. The 6400 is the same GPU, with 4 fewer CUs and lower clocks, plus a locked-down power limit. The best-case scenario, if it weren't locked down, would be performance close to the 6500 XT, but a bit worse due to the fewer CUs. Which ... would be exactly like every other cut-down GPU? The 6600 isn't a demonstration of the 6600 XT being gimped, after all - it's just a lower-tier SKU. A demonstration of how the 6500 XT is gimped would be somehow showing how it would perform with a wider PCIe bus or memory bus, which ... well, the 6400 can't do that, and unless you have access to some very exotic AMD engineering samples, that's not something that can be done reasonably. Beyond that, the benchmarks speak for themselves.

Of course, the sane scenario would be this selling at ~$120, with the 6500 XT at ~$150 for those wanting an unlocked, higher power card. (Though my first move if I had a 6500 XT would be to underclock it, not overclock it!)
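To put a rough number on that best-case argument, here's a naive CUs × clock scaling sketch (the boost clocks are AMD's listed spec figures, an assumption on my part, and memory bandwidth/PCIe limits are ignored entirely):

[CODE=python]
# Naive throughput scaling: perf ~ CU count x boost clock.
# Clock figures are AMD's listed boost clocks (an assumption;
# real sustained game clocks differ), and memory/PCIe limits
# are ignored.
cards = {
    "RX 6500 XT": {"cus": 16, "boost_mhz": 2815},
    "RX 6400":    {"cus": 12, "boost_mhz": 2321},
}

score = {name: c["cus"] * c["boost_mhz"] for name, c in cards.items()}
ratio = score["RX 6400"] / score["RX 6500 XT"]
print(f"RX 6400 ~= {ratio:.0%} of the 6500 XT by this naive metric")
# -> ~62%; reviews show a smaller real-world gap, since games
#    rarely scale linearly with shader count.
[/CODE]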
Slight correction: the first mining boom was in 2013-2014. That's what drove prices of the R9 290X through the roof; at the time, GCN was far and away superior to Kepler at mining. That, coupled with the Titan's success, is what helped push prices as high as they are now.
Sure, first major mining boom then. That early one might have pushed R9 290X prices up, but overall it didn't affect the GPU market in a major way.
1. It may be 3% slower, but it is also around 50-80% cheaper than low profile 1650 models at the moment. Launch prices don't matter, as we all know that every single graphics card's price has shot up to the moon in the last 2 years (the 1030 costs 90-100 GBP right now, which is ridiculous). Why should the 6400 be an exception to this?
2. You're paying original 1650 release prices for a current 1650-level card. Don't forget about the previous point. If the 6400 had been released in 2020, it probably would have been an 80 USD card. But it's 2022 now.
3. That, I agree with. The review shows how much of a limitation it is, and I also made a post about it earlier when I paired the card with a Ryzen 3 CPU and single-channel RAM in a PCIe 3.0 system. Based on this, everybody can decide for themselves whether a 6400 is worth it for them or not.
This is the kind of perspective we need. While it's also important to stay grounded in what would be reasonable GPU prices in a sensible world - which is definitely not the current situation - for real-world comparisons of things for sale today, that can't be the benchmark (unless the overall advice is "don't buy anything unless you have to, pricing is insane", which I mostly agree with, but that "unless you have to" is a rather wide-open door for complications). So we need to not only account for performance and MSRPs of products currently on the market, but also the realities of current pricing - however terrible it may be.
 
Joined
Dec 28, 2012
Messages
3,899 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Crickets? There's been quite a bit of complaining about it that I've seen around here. IMO, not allowing OCing on <75W cards is perfectly fine - saves idiots from burning out the power traces on their motherboards.
When nvidia used that exact same reasoning for locking down mobile Maxwell overclocking, tech websites were lighting torches. There may be some rumbling from some users here, but it's far from the fiery reactions nvidia prompts for the exact same behavior. Hell, you make my point by finding reasons why it's OK when AMD does it. You could use the same justification to prevent idiots from frying their 300-watt 3090s.
As for how this is an attempt at covering up how the 6500 XT is "gimped", you'll have to explain that one. Gimped = artificially held back somehow. The 6400 is the same GPU, with 4 fewer CUs and lower clocks, plus a locked-down power limit. The best-case scenario, if it weren't locked down, would be performance close to the 6500 XT, but a bit worse due to the fewer CUs. Which ... would be exactly like every other cut-down GPU? The 6600 isn't a demonstration of the 6600 XT being gimped, after all - it's just a lower-tier SKU. A demonstration of how the 6500 XT is gimped would be somehow showing how it would perform with a wider PCIe bus or memory bus, which ... well, the 6400 can't do that, and unless you have access to some very exotic AMD engineering samples, that's not something that can be done reasonably. Beyond that, the benchmarks speak for themselves.
Limiting the 6500 to 4 PCIe lanes artificially holds back the GPU's performance on PCIe 3.0 systems. 4GB of VRAM on a 64-bit bus starves the GPU of both bandwidth and capacity. This can be demonstrated in games like DOOM Eternal.

That's gimping, pure and simple. The 6500 is held back artificially by decisions AMD made to cut costs. Given how close the 6400 and 6500 can be, being able to OC the 6400 would expose how held back the 6500 XT is, especially on older or budget systems.
Sure, first major mining boom then. That early one might have pushed R9 290X prices up, but overall it didn't affect the GPU market in a major way.
I'm sure increasing MSRP by over 50% and rendering the AMD GCN lineup of cards unavailable to gamers for almost an entire year, resulting in the same "will the market ever go back to normal" posts you see right now, counts as affecting the market "in a major way".
This is the kind of perspective we need. While it's also important to stay grounded in what would be reasonable GPU prices in a sensible world - which is definitely not the current situation - for real-world comparisons of things for sale today, that can't be the benchmark (unless the overall advice is "don't buy anything unless you have to, pricing is insane", which I mostly agree with, but that "unless you have to" is a rather wide-open door for complications). So we need to not only account for performance and MSRPs of products currently on the market, but also the realities of current pricing - however terrible it may be.
1. It may be 3% slower, but it is also around 50-80% cheaper than low profile 1650 models at the moment. Launch prices don't matter, as we all know that every single graphics card's price has shot up to the moon in the last 2 years (the 1030 costs 90-100 GBP right now, which is ridiculous). Why should the 6400 be an exception to this?
2. You're paying original 1650 release prices for a current 1650-level card. Don't forget about the previous point. If the 6400 had been released in 2020, it probably would have been an 80 USD card. But it's 2022 now.
3. That, I agree with. The review shows how much of a limitation it is, and I also made a post about it earlier when I paired the card with a Ryzen 3 CPU and single-channel RAM in a PCIe 3.0 system. Based on this, everybody can decide for themselves whether a 6400 is worth it for them or not.
1. Unless you are willing to give in to the scalpers and pay their bloated prices for used products, the 6400 moving the price/perf needle backwards absolutely matters. Saying "well, other cards have gone up in price too!!!!" is pure whataboutism, and doesn't change the fact that the 6400 offers worse performance/$ than the 1650 did at LAUNCH price, 3 YEARS ago.
2. Again, see whataboutism. My point was that AMD has moved the price/perf needle backwards from the 1650. I don't care what an out-of-production GPU from 3 years ago goes for on eBay today; I care about how much a new GPU costs compared to the previous generation's launch prices. The 2080 Ti was briefly available on eBay for $500 after the 3080 launch; that did not change the fact that the 3080 offered a SUBSTANTIAL improvement in perf/$ over the 2080.
3. Whether people think it's worth it or not does not change the objective fact that the 6400 in a PCIe 3.0 system will do worse than this review shows, further widening the gap between it and 3-year-old options and reinforcing the point that AMD has horrendously overpriced the 6400. Just like no matter how many people can justify buying a 3090 Ti for gaming, it doesn't change the fact that the 3090 Ti is a horrendously priced product. No amount of meatshielding AMD will change this.
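To put numbers on point 1, a minimal perf/$ sketch (the ~3% gap and the matching ~$159 price points are the figures claimed in this thread, not independently verified):

[CODE=python]
# Perf-per-dollar check: RX 6400 today vs GTX 1650 at launch.
# The ~3% deficit and the ~$159 price points are taken from
# this thread, as assumptions.
perf_1650, msrp_1650 = 1.00, 159.0   # baseline: 1650 at launch
perf_6400, msrp_6400 = 0.97, 159.0   # ~3% slower, same MSRP

ratio = (perf_6400 / msrp_6400) / (perf_1650 / msrp_1650)
print(f"RX 6400 offers {ratio:.2f}x the perf/$ of the 1650 at launch")
# -> 0.97x: marginally worse value than a 2019 card at 2019
#    prices, which is the "needle moving backwards" argument.
[/CODE]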
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
It's not a UK vs Lithuania thing. It's more like a "companies don't get rich by spending money" kind of thing. But let's stop the off-topic here. ;)
Sort of. Having good conditions attracts more talent, sometimes even the best talent. That's generally desirable; on the other hand, there are penny-pinchers who do things as cheaply as possible for maximum output and lower quality.

That's exactly why you need a new PC if you want to play YouTube in HD. My point stands that just because you have an Athlon 64 or Athlon X4 or whatever at hand, it doesn't mean it's fit for your purpose. It was fine, but it's not anymore. Nothing stops you from walking into a computer recycling centre and paying 5-10 EUR for a Sandy i5. It's even (a lot) cheaper than buying a new graphics card, and it won't only give you HD videos, the whole system will be faster as well. But if you'd rather pay a hundred EUR just for a fecking video codec, be my guest.
Unless you want something as old as first-gen Core i stuff, sure, you can find them for dirt. Those have cores as slow as that Athlon X4. If you want Haswell, you will pay. And anything newer used is as bad as buying new, even worse if you need a board that's a generation or two old. Scalping on those is insane. I looked at the market, and one of the cheaper i5s is an i5-3470S; a bloke wants 29 EUR for it. Another bloke sells an i5-2320 for 15 EUR. And a third bloke sells an i5-4590 for 40 EUR. Never mind the boards. At this point, a new Pentium or Celeron is a way better deal. And there aren't dirt cheap i5s locally. The cheapest i5 on eBay is 29 EUR + 12 EUR shipping with unknown import fees. It's from Italy. A cheaper i5 computer with a GT 1030 is 170 EUR, but goodness gracious, it has a bomb-like PSU, a case without ventilation that looks like it's from the early 2000s, and what looks like a single stick of RAM. It's with an i5-2500 tho. That's ancient, barely better than the Athlon X4, but the 1030 saves the day, and there's an SSD too.


Is that your point? Closing ears, closing eyes, "lalala, I'm not listening because you're stupid"? Very mature indeed.
That's literally you here. Ignoring scenarios where GPU functions are important and going lalala CheEP i5 StOoPiD. Very mature, indeed. I would understand such a moronic statement if you hadn't ever been out of your town, but that's not the case. There aren't cheap i5s everywhere, and replacing one piece of e-waste with another is monkey business.


I did not. I said that the 6400's decoder supports all the formats that the 710's does, plus H.265. This is not a complaint. This is a fact.
In another thread ffs.


The 6400 has VP9 and HEVC decode as well. I agree that the 1030 is enough for 99% of HTPC uses, as long as you don't need HDMI 2.1. It's only that the 1030 costs around £100, which is a terrible deal, imo.
Either GT 1030 or Quadro T400. Or buying a whole new platform altogether, which is 200 EUR minimum.


As for an Alder Lake upgrade, I played with the thought before buying the 6400, but a motherboard and a CPU would have cost me around £150-200, and then I would have ended up with a slower CPU than what I have now. An i3 with the same performance as my Ryzen 3 would have cost me £250 for the whole system. Not worth it. As for people coming from older systems, they might also need to buy some DDR4 RAM, which makes it even more expensive. If they have some DDR3 lying around, picking up an old i5 from a second-hand store for a couple of quid is still a much better deal.
You really have a bad upgrading habit. A bit hypocritical of you to complain about prices when you buy a new CPU or GPU more or less every generation. I hope you sell some, but you certainly don't save by going through parts so often. Your GT 1030 isn't even 1 year old, and the 3100 is at best 2 years old, if that. BTW, what happened to the i7-10700? Wasn't that for HTPC too? It should have decoding capabilities.


Are you seriously comparing the 6400 to a top tier card from 20 years ago? Jesus...
It's that poo, so yeah. And the X800 Pro was an upper-end card. The high-end card back then was the X800 XT PE AGP. I have an X800 XT PE too, but it's basically the same as the X800 Pro. It's only marginally faster, but was going for way more dosh back then. Even as a cheap upgrade it was very disappointing indeed. Definitely not as big a leap as from the FX 5200 to the X800 Pro, but even then the same games were playable, just at more fps and with better graphics. Due to the X800 series lacking DirectX 9.0c support and a newer pixel shader version (can't recall which), it was a stupidly crippled card. nVidia 6800 cards didn't have such limitations, but still aged badly due to far too high power consumption, and then the 8000 series launched soon after, which was insanely good and had a very long lifespan. The RX 6400 and 6500 XT will meet a similarly terrible fate.

The average net wage here is 1/3 to 1/4 of what you have over there. Unless you have disposable income - yes, it's a f****** expensive hobby.
Aye, during the last 3 years budget gaming pretty much died. Not sure about Bulgaria, but here not even APUs were available. Even 1050 Tis, 750 Tis and RX 550s were gone or going for 300+ EUR. The only budget options were the GT 1030 (even that was nearly sold out all the time), the GT 730 GDDR5 and the Quadro T600. The Quadro T600 was 200 EUR and was as fast as a 1050 Ti. With an i3-10100F it was the only sort-of-passable configuration. At least now there are a few more options like the 1650 or the RX 6600 (which is 400 EUR, but relatively awesome value). I'm not stoked that all we got that's new is that decoderless Radeon e-waste, and nVidia isn't exactly planning to release a GTX 3030 either. Even such low-end configurations would now take nearly two months of average wages in a medium-sized city. If you are in a smaller city (with fewer than 50k people), then it sucks to be you. If you want anything more high-end, you'd better work in IT in the capital or save up for half a year. Lithuania also has 15+% inflation and imports have quite high fees, so your money loses value quite fast and using eBay isn't exactly an option.

The only strategy for budget gamers is to either play older games on modest hardware or play new games at 720p low with 30-40 fps. I'm blessed to have an RX 580, but the 6500 XT is still slower than it, and the RX 580 won't last forever; at some point it will become obsolete due to an old arch, lack of DX support, old shader support or some similar reason. The only saving grace is that quite a lot of modern games are quite boring, rehashed versions of older ones, or buggy messes. Not sure about you, but I haven't really seen much to play. I only have Horizon 5 from the newer games, but that's it. I gotta admit that I had a blast this year playing Battlefield 1942, which is old as fuck. Even a UHD 710 would have been enough for it. It's quite nice offline, but the AI sometimes is quite dumb and gets stuck in places. I played some Stalker too. Again, old as fuck, and it would run on a UHD 710. I tried to run it with an FX 5200 128MB and it almost could at 640x480, while with the X800 Pro it's not a problem. Obviously with the RX 580 it runs perfectly fine at 1440p, ultra settings and some control panel settings cranked.
 
Joined
May 21, 2009
Messages
270 (0.05/day)
Processor AMD Ryzen 5 4600G @4300mhz
Motherboard MSI B550-Pro VC
Cooling Scythe Mugen 5 Black Edition
Memory 16GB DDR4 4133Mhz Dual Channel
Video Card(s) IGP AMD Vega 7 Renoir @2300mhz (8GB Shared memory)
Storage 256GB NVMe PCI-E 3.0 - 6TB HDD - 4TB HDD
Display(s) Samsung SyncMaster T22B350
Software Xubuntu 24.04 LTS x64 + Windows 10 x64
Aye, during the last 3 years budget gaming pretty much died. Not sure about Bulgaria, but here not even APUs were available. Even 1050 Tis, 750 Tis and RX 550s were gone or going for 300+ EUR. The only budget options were the GT 1030 (even that was nearly sold out all the time), the GT 730 GDDR5 and the Quadro T600. The Quadro T600 was 200 EUR and was as fast as a 1050 Ti. With an i3-10100F it was the only sort-of-passable configuration. At least now there are a few more options like the 1650 or the RX 6600 (which is 400 EUR, but relatively awesome value). I'm not stoked that all we got that's new is that decoderless Radeon e-waste, and nVidia isn't exactly planning to release a GTX 3030 either. Even such low-end configurations would now take nearly two months of average wages in a medium-sized city. If you are in a smaller city (with fewer than 50k people), then it sucks to be you. If you want anything more high-end, you'd better work in IT in the capital or save up for half a year. Lithuania also has 15+% inflation and imports have quite high fees, so your money loses value quite fast and using eBay isn't exactly an option.

The only strategy for budget gamers is to either play older games on modest hardware or play new games at 720p low with 30-40 fps. I'm blessed to have an RX 580, but the 6500 XT is still slower than it, and the RX 580 won't last forever; at some point it will become obsolete due to an old arch, lack of DX support, old shader support or some similar reason. The only saving grace is that quite a lot of modern games are quite boring, rehashed versions of older ones, or buggy messes. Not sure about you, but I haven't really seen much to play. I only have Horizon 5 from the newer games, but that's it. I gotta admit that I had a blast this year playing Battlefield 1942, which is old as fuck. Even a UHD 710 would have been enough for it. It's quite nice offline, but the AI sometimes is quite dumb and gets stuck in places. I played some Stalker too. Again, old as fuck, and it would run on a UHD 710. I tried to run it with an FX 5200 128MB and it almost could at 640x480, while with the X800 Pro it's not a problem. Obviously with the RX 580 it runs perfectly fine at 1440p, ultra settings and some control panel settings cranked.
EU GPU prices are terrible compared to the USA.

In my case, I don't want to give any money to the fucking GPU scumbag companies (so far that includes AMD and Nvidia), so I'm personally waiting for Arc and will look at prices after that.

For now I mainly play old games anyway; maybe I'll wait for Meteor Lake next year, because I'm still interested in an Arc-based iGPU*.

*Hopefully Intel (i.e. Pat) doesn't keep shafting desktop iGPUs, because right now Intel's laptop iGPUs are much better than its desktop ones.

For example, the Pentium G7400 has UHD 710 with 16 EUs (128 shaders), while the mobile Pentium 8500's iGPU has 48 EUs (384 shaders). Another example: the Core i3-12100 has UHD 730 with 24 EUs (192 shaders), the i5-12500 has UHD 770 with 32 EUs (256 shaders), and the i3-1210U's iGPU has 64 EUs (512 shaders).
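The EU-to-shader numbers above are just a fixed multiplier; each Intel Xe EU contains 8 ALUs, so a quick sketch (model list copied from this post) reproduces them:

[CODE=python]
# Intel Xe iGPUs: each EU contains 8 shader ALUs, so "shaders"
# is simply EUs x 8. Models and EU counts as listed above.
igpus = {
    "Pentium G7400 (UHD 710)":  16,
    "Pentium 8500 (mobile)":    48,
    "Core i3-12100 (UHD 730)":  24,
    "Core i5-12500 (UHD 770)":  32,
    "Core i3-1210U (mobile)":   64,
}
for name, eus in igpus.items():
    print(f"{name}: {eus} EUs = {eus * 8} shaders")
[/CODE]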

:)
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
1. Unless you are willing to give in to the scalpers and pay their bloated prices for used products, the 6400 moving the price/perf needle backwards absolutely matters. Saying "well, other cards have gone up in price too!!!!" is pure whataboutism, and doesn't change the fact that the 6400 offers worse performance/$ than the 1650 did at LAUNCH price, 3 YEARS ago.
2. Again, see whataboutism. My point was that AMD has moved the price/perf needle backwards from the 1650. I don't care what an out-of-production GPU from 3 years ago goes for on eBay today; I care about how much a new GPU costs compared to the previous generation's launch prices. The 2080 Ti was briefly available on eBay for $500 after the 3080 launch; that did not change the fact that the 3080 offered a SUBSTANTIAL improvement in perf/$ over the 2080.
3. Whether people think it's worth it or not does not change the objective fact that the 6400 in a PCIe 3.0 system will do worse than this review shows, further widening the gap between it and 3-year-old options and reinforcing the point that AMD has horrendously overpriced the 6400. Just like no matter how many people can justify buying a 3090 Ti for gaming, it doesn't change the fact that the 3090 Ti is a horrendously priced product. No amount of meatshielding AMD will change this.
1. I wasn't talking about scalpers. If you strictly consider retail availability only, then the low profile 1050 Ti and 1650 aren't even there, so your only options are the GT 710 for £60, the GT 1030 for £100 or the 6400 for £170. Don't tell me that the 1030 is so great that it's worth 100 quid, or that the 710 is worth spending any amount of money on in 2022.
2. It's not whataboutism. Retail prices and MSRPs from three years ago don't concern me, nor do they concern anyone else who walks into a computer store, or looks one up online right now. Show me a store where you can buy a 1650 for 150 USD, a 3060 for 329 USD, or a 3080 for 699 USD.
3. I've tested it. Gameplay experience is subjective, of course, but for me, it wasn't that bad in most cases. Metro Exodus sucked on it for some reason, I acknowledge that.

Sort of. Having good conditions attracts more talent, sometimes even the best talent. That's generally desirable; on the other hand, there are penny-pinchers who do things as cheaply as possible for maximum output and lower quality.
That's very true. I wish the logistics sector worked on this principle, too. Unfortunately, this industry isn't like that. Our company leaders are only concerned about numbers most of the time.

Unless you want something as old as first-gen Core i stuff, sure, you can find them for dirt. Those have cores as slow as that Athlon X4. If you want Haswell, you will pay. And anything newer used is as bad as buying new, even worse if you need a board that's a generation or two old. Scalping on those is insane. I looked at the market, and one of the cheaper i5s is an i5-3470S; a bloke wants 29 EUR for it. Another bloke sells an i5-2320 for 15 EUR. And a third bloke sells an i5-4590 for 40 EUR. Never mind the boards. At this point, a new Pentium or Celeron is a way better deal. And there aren't dirt cheap i5s locally. The cheapest i5 on eBay is 29 EUR + 12 EUR shipping with unknown import fees. It's from Italy. A cheaper i5 computer with a GT 1030 is 170 EUR, but goodness gracious, it has a bomb-like PSU, a case without ventilation that looks like it's from the early 2000s, and what looks like a single stick of RAM. It's with an i5-2500 tho. That's ancient, barely better than the Athlon X4, but the 1030 saves the day, and there's an SSD too.
I guess that's a country vs country difference, then. Here, the i5 4460 sells for 15 quid with warranty.

That's literally you here. Ignoring scenarios where GPU functions are important and going lalala CheEP i5 StOoPiD. Very mature, indeed. I would understand such a moronic statement if you hadn't ever been out of your town, but that's not the case. There aren't cheap i5s everywhere, and replacing one piece of e-waste with another is monkey business.
You presented a case. I presented a solution that is cheaper than buying a graphics card. You ignored it. Let's leave it at that.

In another thread ffs.
OK, show me. It may have been ages ago, as I've had the 1030 for a while now.

Either GT 1030 or Quadro T400. Or buying a whole new platform altogether, which is 200 EUR minimum.
GT 1030: sure. T400: too expensive and rare. New platform: too expensive.

You really have a bad upgrading habit. A bit hypocritical of you to complain about prices when you buy a new CPU or GPU more or less every generation. I hope you sell some, but you certainly don't save by going through parts so often. Your GT 1030 isn't even 1 year old, and the 3100 is at best 2 years old, if that. BTW, what happened to the i7-10700? Wasn't that for HTPC too? It should have decoding capabilities.
If you have such a good memory regarding what I said in other threads, then you might recall me saying that I don't only buy computer parts to upgrade. ;) PC building is a hobby of mine. I buy most of my stuff out of curiosity, or through a dirt cheap deal, not because I actually need it.

It's that poo, so yeah. And the X800 Pro was an upper-end card. The high-end card back then was the X800 XT PE AGP. I have an X800 XT PE too, but it's basically the same as the X800 Pro. It's only marginally faster, but was going for way more dosh back then. Even as a cheap upgrade it was very disappointing indeed. Definitely not as big a leap as from the FX 5200 to the X800 Pro, but even then the same games were playable, just at more fps and with better graphics. Due to the X800 series lacking DirectX 9.0c support and a newer pixel shader version (can't recall which), it was a stupidly crippled card. nVidia 6800 cards didn't have such limitations, but still aged badly due to far too high power consumption, and then the 8000 series launched soon after, which was insanely good and had a very long lifespan. The RX 6400 and 6500 XT will meet a similarly terrible fate.
Guess what... high end cards tend to come with all the features and gimmicks while low end ones don't. What do you find so surprising about this?
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I guess that's a country vs country difference, then. Here, the i5 4460 sells for 15 quid with warranty.
Wow, I wish I knew of a store like that here in Sweden or in the EU generally. I just ordered an i7-2600 from eBay for my secondary PC, and that cost me €35 + €15 in shipping. Then again, an i7 is always more expensive than an i5, and I specifically wanted the HT support. Now I'm just looking for a way to spend less than SEK 1000 on an ITX motherboard for a Haswell i5 that I got my hands on for free recently. There are those "new" Chinese brand motherboards that look quite interesting (M.2 slots even!), but they're so expensive :(
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
That's very true. I wish the logistics sector worked on this principle, too. Unfortunately, this industry isn't like that. Our company leaders are only concerned about numbers most of the time.
Just asking, but wouldn't it be better to learn some programming and get a job in that? From what I see, at least in Lithuania, there is quite a lot of demand for programmers; you can often work from home, and making 2-4 or even 10 times as much as other degree holders seems entirely possible. That looks like a crazy shortcut in life, and you only need to study for about a year initially. Job offers say that after the first 2-5 years you get the full wage, but even an entry-level programmer earns the average national wage. I don't know any other profession that seems so easy to get into and pays so much. You can either work from home and have a nice office there, or you can save a lot of money, keep putting it into investments, and after a decade just retire and live off your pot until you die. And by saving, I mean you are making several times more dosh than basically everyone else, so you can live basically like them and not spend more. I wonder if it really works out like that; if it does, that sounds amazing. But even in non-programming fields you can work from home, and if you hate your office, there are possibilities like that. Not only that, but you can also just buy cheaper property in the middle of nowhere, so even if you don't make more money than others, you have unique expense-lowering opportunities with improved quality of life. As long as you don't mind WFH, it seems like a bit of a no-brainer.

I guess that's a country vs country difference, then. Here, the i5 4460 sells for 15 quid with warranty.
That's not a bad deal, but the real problem with old CPUs is that their motherboards are getting less common and people scalp them badly.


OK, show me. It may have been ages ago, as I've had the 1030 for a while now.
That was your HTPC thread, I certainly won't find it.


GT 1030: sure. T400: too expensive and rare. New platform: too expensive.
The GT 1030 and T400 basically cost the same, and the T400 has GDDR6.


If you have such a good memory regarding what I said in other threads, then you might recall me saying that I don't only buy computer parts to upgrade. ;) PC building is a hobby of mine. I buy most of my stuff out of curiosity, or through a dirt cheap deal, not because I actually need it.
Maybe, but that's still really often.


Guess what... high end cards tend to come with all the features and gimmicks while low end ones don't. What do you find so surprising about this?
Except it wasn't exactly that way. The R420 core in the X800 Pro and X800 XT PE had 12 and 16 pipelines. That exact same core was cut down to 4 pipes for lower-end GPUs like the ATi Radeon X550 XT. You got the same capabilities, but for a lot less. And even the X800 GT or SE were affordable cards. They had 8 pipes. With enough modding and luck, you may have been able to unlock all 16 pipes and overclock it. You could raise the voltage manually with a potentiometer and slap on an ATi Silencer from Arctic. But anyway, those cards were cheap, moddable and feature-wise identical to the flagship cards. You even got exactly the same cooler and often a very similar PCB too. And I remember Apple bragging about the Intel GMA from a similar era and how it could play Blu-rays just fine, and that was the lowest of the low, not even integrated into the CPU, but onboard graphics. The RX 6400 really has some inexcusable regressions that don't really save much but leave the user experience crippled. If I remember well, nVidia cards handle decoding with CUDA, so there's no easy way to cut out such functionality either. Even if you manage to cut it out, you save barely anything anyway. A CU or two likely takes up the same space, so it's pretty silly to mess with the VCN hardware.

Now I'm just looking for a way to spend less than SEK 1000 on an ITX motherboard for a Haswell i5 that I got my hands on for free recently. There are those "new" Chinese brand motherboards that look quite interesting (M.2 slots even!), but they're so expensive :(
I would rather avoid those Chinese boards. You don't have any support, some of them are gimped at the HW level (like botched RAM channels), some lack BIOS options, some have questionable VRMs, some features might not work (like turbo), or they will only support some very specific models of CPUs.
 
Joined
Dec 28, 2012
Messages
3,899 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
I love being proven right:


TL;DR: on PCIe 3.0 the 6400 loses roughly 15% on average vs 4.0. Once again, AMD has gimped their latest card....
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I love being proven right:


TL;DR: on PCIe 3.0 the 6400 loses roughly 15% on average vs 4.0. Once again, AMD has gimped their latest card....
You know TechSpot's reviews are the same as HWUB's, right? So in essence, that review has already been posted. Nothing new there.


I would rather avoid those Chinese boards. You don't have any support, some of them are gimped at the HW level (like botched RAM channels), some lack BIOS options, some have questionable VRMs, some features might not work (like turbo), or they will only support some very specific models of CPUs.
Yeah, I'll definitely have to do some research before committing - at least reading some reviews. At least there are some Haswell ITX boards to be found - pickings are far slimmer for my poor old Sandy Bridge CPUs. There are a few that exist, but man are they expensive.
 
Joined
May 21, 2009
Messages
270 (0.05/day)
Processor AMD Ryzen 5 4600G @4300mhz
Motherboard MSI B550-Pro VC
Cooling Scythe Mugen 5 Black Edition
Memory 16GB DDR4 4133Mhz Dual Channel
Video Card(s) IGP AMD Vega 7 Renoir @2300mhz (8GB Shared memory)
Storage 256GB NVMe PCI-E 3.0 - 6TB HDD - 4TB HDD
Display(s) Samsung SyncMaster T22B350
Software Xubuntu 24.04 LTS x64 + Windows 10 x64
I love being proven right:


TL;DR: on PCIe 3.0 the 6400 loses roughly 15% on average vs 4.0. Once again, AMD has gimped their latest card....

Very good, another review confirming that this card suffers badly, between the poor decision to cut PCIe lanes and the stripped-down feature set.

Another detail I don't like is this, probably due to the tiny heatsink and fan:

The GPU temperature was reasonable though, peaking at 78 C

And I agree with some of the conclusions, like these:

Ultimately, the Radeon RX 6400 sucks just as much as we knew it would.

Those of you seeking a budget graphics card for gaming should certainly look elsewhere, especially if you have a PCIe 3.0 system.
The best alternative for those wanting to get their hands on a graphics card for under $200 is to shop second hand if possible.

Right now on eBay, 4GB RX 570 cards are regularly selling for around $150 with some models going for just over $100.
Yes, they're old, and they use more power, but as we've seen here for 1080p gaming in a PCIe 3.0 system you're looking at ~30% better performance, while retaining features such as hardware encoding.

Alternatively, second hand GTX 1650's are selling for under $150 and offer 20% greater performance for PCIe 3.0 users.
That means GTX 1650 Super cards are going for anywhere from $130 to $200 and in a PCIe 3.0 system our 1080p average data points to over 60% greater performance.

So if you care about getting the most bang for your buck, then paying even $200 for a GTX 1650 Super is a far wiser investment given that makes it at most 25% more expensive for 60% better gaming performance on average.
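A quick sanity check of that last claim (the $200 is the top of the review's quoted 1650 Super range; ~$160 for the 6400 is an assumption based on its $159 MSRP):

[CODE=python]
# Value check for the review's GTX 1650 Super vs RX 6400 claim.
# Prices are the review's figures; ~$160 for the 6400 is my
# assumption based on its $159 MSRP.
price_6400, price_1650s = 160.0, 200.0
perf_gain = 1.60   # "over 60% greater performance" per the review

extra_cost = price_1650s / price_6400 - 1
value = perf_gain / (1 + extra_cost)
print(f"{extra_cost:.0%} more expensive, {value:.2f}x the perf/$")
# -> 25% more expensive for ~1.28x the performance per dollar.
[/CODE]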

The RX 6400 may be an option for SFF use, or if you live in the EU, since GPU prices there are crazy; but SFF users are a minimal part of the market, because most users have a typical PC, i.e. not SFF.

And on another topic, it seems VCN 4.0 in RDNA 3 doesn't come with AV1 encode support for now:


:)
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Just asking, but wouldn't it be better to learn some programming and get a job in that? From what I see, at least in Lithuania, there is quite a lot of demand for programmers; you can often work from home, and making 2-4 or even 10 times as much as other degree holders seems entirely possible. That looks like a crazy shortcut in life, and you only need to study for about a year initially. Job offers say that after the first 2-5 years you get the full wage, but even an entry-level programmer earns the average national wage. I don't know any other profession that seems so easy to get into and pays so much. You can either work from home and have a nice office there, or you can save a lot of money, keep putting it into investments, and after a decade just retire and live off your pot until you die. And by saving, I mean you are making several times more dosh than basically everyone else, so you can live basically like them and not spend more. I wonder if it really works out like that; if it does, that sounds amazing. But even in non-programming fields you can work from home, and if you hate your office, there are possibilities like that. Not only that, but you can also just buy cheaper property in the middle of nowhere, so even if you don't make more money than others, you have unique expense-lowering opportunities with improved quality of life. As long as you don't mind WFH, it seems like a bit of a no-brainer.
I've got personal reasons for not going down that way. I'm happy to talk about it in private, but let's not spam the thread even more. :ohwell:

That's not a bad deal, but the real problem with old CPUs is that their motherboards are getting less common and people scalp them badly.
That again sounds like a country vs country thing. I've seen H81 boards selling for £20-30 on eBay. So 30 quid for the board, 15 for the CPU, another 8 for two 4 GB sticks of RAM; that's £53 altogether.

That was your HTPC thread, I certainly won't find it.
Oh, you mean my small form factor build thread (there's a link in my signature). I'll update it soon with my recent experiences with the 6400 and with my other HTPC that I just built a week ago.

I remember it now. Even though the GT 710 supports H.264 decoding, it can't do it in 4K (which isn't mentioned anywhere on its datasheet). It's trying at 100% usage while the video is basically unwatchable and the CPU is sitting idle. That's why I bought the 1030. This wasn't a complaint, either. Pure fact.

Edit: If you want a complaint, I think this is more of a trap situation than the 6400, where AMD outright states that it doesn't support AV1.

Maybe, but that's still really often.
And?

Except that wasn't exactly that way. R420 core in X800 Pro and X800 XT PE had 12 and 16 pipelines. That exact same core was cut down to 4 pipes for lower end GPUs like ATi Radeon HD X550 XT. You got the same capabilities, but for a lot less. And even X800 GT or SE were affordable cards. They had 8 pipes. With enough moding and luck, you may have been able to unlock all 16 pipes and overclock it. You could raise voltage manually with potentiometer and slap ATi Silencer from Arctic. But anyway, those cards were cheap, modable and feature wise identical to flagship cards. You even got the exactly the same cooler on it and often very similar PCB too. And I remember Apple bragging about Intel GMA from similar era and how it could play BluRays just fine and that was the lowest of the low, not even integrated into CPU, but on board graphics. RX 6400 really has some inexcusable regressions that don't really save much, but makes user experiences crippled. If I remember well, nVidia cards handle decoding with CUDA, so there's no easy way to cut out such functionality too. Even if you manage to cut it out, you same barely anything anyway. A CU or two likely take up same space, so it's pretty silly to mess with VCN hardware.
We're talking about a different age here. Graphics cards in general were a lot more affordable back then. Heck, I bought an X800 XT and then a 7800 GS from pocket money as a high school kid with no job. I work full time now, but anything above a 3070 Ti or 6700 XT is out of my range (even those are iffy at £500-600). But this is beside the point...

All in all, the 6400 doesn't have "regressions". It's more advanced than previous generations. It only lacks certain features that more expensive models of the same generation have. If having no AV1 decoder is inexcusable for you because you want to use it with a CPU that can't handle it, that's a unique problem. That CPU is a terrible pair with any modern GPU, including the 6400.

Edit: If someone doesn't have the money to replace a 10+ year-old CPU, they most definitely won't have the money for a 6400, either.

I love being proven right:


TL;DR: on PCIe 3.0 the 6400 loses roughly 15% on average vs 4.0. Once again, AMD has gimped their latest card....
IMO, talking about the average is pointless. It can offer more or less the same performance on any PCIe version in some games, but perform like ass in others.

Another detail I don't like is this, probably due to the tiny heatsink and fan:

"The GPU temperature was reasonable though, peaking at 78 C"
That's strange. My Sapphire Pulse peaks at 67 °C. I guess the heatpipe that runs across its cooler helps more than I thought.
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
That again sounds like a country vs country thing. I've seen H81 boards selling for £20-30 on eBay. So 30 quid for the board, 15 for the CPU, another 8 for two 4 GB sticks of RAM; that's £53 altogether.
Not really. If you watch Brian from TechYesCity, there's a basically universal i5 or i7 "tax". And motherboards on eBay have always been terribly overpriced.


Oh, you mean my small form factor build thread (there's a link in my signature). I'll update it soon with my recent experiences with the 6400 and with my other HTPC that I just built a week ago.

I remember it now. Even though the GT 710 supports H.264 decoding, it can't do it in 4K (which isn't mentioned anywhere on its datasheet). It's trying at 100% usage while the video is basically unwatchable and the CPU is sitting idle. That's why I bought the 1030. This wasn't a complaint, either. Pure fact.
lol, that's quite sad. Even cards as ancient as the Radeon HD 7750 can decode H.264, and in 4K too.

We're talking about a different age here. Graphics cards in general were a lot more affordable back then. Heck, I bought an X800 XT and then a 7800 GS from pocket money as a high school kid with no job. I work full time now, but anything above a 3070 Ti or 6700 XT is out of my range (even those are iffy at £500-600). But this is beside the point...
But what about the Intel UHD 710? It decodes everything on a budget. Even 4K AV1. It can be found even in a Celeron. That's stupidly affordable. It's a kinda shitty excuse for the RX 6400 not to be able to do as much at nearly 4 times the cost of a Celeron. The GT 1030 is also somewhat superior, as it supports VP9 decoding and has ShadowPlay.

All in all, the 6400 doesn't have "regressions". It's more advanced than previous generations. It only lacks certain features that more expensive models of the same generation have.
Um, literally the RX 5500 XT is superior to the RX 6500 XT. It performs the same, used to be available at the same or a lower price, had ReLive, had PCIe x8, has VP9 decoding/encoding, and has more outputs. Sure, it doesn't have an AV1 decoder, but when it launched, AV1 was still more in the experimental stage. Even the RX 5300 is superior to the RX 6400, not only smashing it on feature set but offering superior performance in an even lower tier. If AMD just hadn't stopped production of the low-end 5000 series cards, they would have cheap and competitive products. I don't see any reason why the low-end RX 6000 series has to suck so much. This snafu reminds me of the FX launch, when Phenoms beat FXs while being more efficient and cheaper.

If having no AV1 decoder is inexcusable for you because you want to use it with a CPU that can't handle it, that's a unique problem. That CPU is a terrible pair with any modern GPU, including the 6400.
I have an i5-10400F and it skips some frames at 4K60 on YouTube with VP9. Seeking through a video is quite sluggish with it too. 8K is completely out of the question; it just doesn't work well at all. You can argue that my screen is only 1440p, but, mate, I love the extra bitrate and supersampling. YouTube is just not the same once you try that. I used to do it on my phone too, before the YT app had options for above-native resolution, but there isn't as much benefit there as on a bigger screen. But yeah, I will keep my "crappy" i5 away from the RX 6400 snafu edition; it's not worthy of a proper CPU. BTW, FX and Athlon chips become sluggish and skippy with 1440p60, though plain 1440p is usually fine. They can sort of handle 4K30, but seeking is too laggy then.
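To put the 4K60 VP9 struggle in perspective, here's a rough sketch of the raw pixel throughput involved (decode cost isn't strictly linear in pixels, but it scales with them):

[CODE=python]
# Raw pixels per second a video decoder must produce. Decode
# cost isn't strictly proportional to pixels, but it grows with
# them, which is why software VP9 copes at 1440p and struggles
# at 4K and above.
modes = {"1440p60": (2560, 1440, 60),
         "4K60":    (3840, 2160, 60),
         "8K60":    (7680, 4320, 60)}

for name, (w, h, fps) in modes.items():
    print(f"{name}: {w * h * fps / 1e6:,.0f} Mpixels/s")
# -> ~221, ~498 and ~1,991 Mpixels/s: each step is more than
#    double the work of the last.
[/CODE]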
 
Joined
May 21, 2009
Messages
270 (0.05/day)
Processor AMD Ryzen 5 4600G @4300mhz
Motherboard MSI B550-Pro VC
Cooling Scythe Mugen 5 Black Edition
Memory 16GB DDR4 4133Mhz Dual Channel
Video Card(s) IGP AMD Vega 7 Renoir @2300mhz (8GB Shared memory)
Storage 256GB NVMe PCI-E 3.0 - 6TB HDD - 4TB HDD
Display(s) Samsung SyncMaster T22B350
Software Xubuntu 24.04 LTS x64 + Windows 10 x64
lol, that's quite sad. Even cards as ancient as the Radeon HD 7750 can decode H.264, and in 4K too.

But what about the Intel UHD 710? It decodes everything on a budget. Even 4K AV1. It can be found even in a Celeron. That's stupidly affordable. It's a kinda shitty excuse for the RX 6400 not to be able to do as much at nearly 4 times the cost of a Celeron. The GT 1030 is also somewhat superior, as it supports VP9 decoding and has ShadowPlay.

Um, literally the RX 5500 XT is superior to the RX 6500 XT. It performs the same, used to be available at the same or a lower price, had ReLive, had PCIe x8, has VP9 decoding/encoding, and has more outputs. Sure, it doesn't have an AV1 decoder, but when it launched, AV1 was still more in the experimental stage. Even the RX 5300 is superior to the RX 6400, not only smashing it on feature set but offering superior performance in an even lower tier.

I have an i5-10400F and it skips some frames at 4K60 on YouTube with VP9. Seeking through a video is quite sluggish with it too. 8K is completely out of the question; it just doesn't work well at all. You can argue that my screen is only 1440p, but, mate, I love the extra bitrate and supersampling. YouTube is just not the same once you try that. I used to do it on my phone too, before the YT app had options for above-native resolution, but there isn't as much benefit there as on a bigger screen. But yeah, I will keep my "crappy" i5 away from the RX 6400 snafu edition; it's not worthy of a proper CPU.
Sadly, AMD has become more mediocre and greedy with its latest products (most notably the Ryzen 5 5600X); for this reason I have no interest in giving any money to this scumbag company.

And as you said, the RX 5300 XT is better than the RX 6400, and curiously has the same performance as the RX 6500 XT, without forgetting it's an x3xx-tier card.


To sum up, the RX 5300 has PCIe Gen 4 at x8, 1408 shaders, a 128-bit memory bus with 112 GB/s on the 4 GB variant and a 64-bit memory bus with 168 GB/s on the 3 GB variant, plus H.264/H.265 encode capability.



:)
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Not really. If you watch Brian from TechYesCity, there's a basically universal i5 or i7 "tax". And motherboards on eBay have always been terribly overpriced.
Did you check the links I sent? Here's the 4460 for 15 GBP with warranty. There is an i7 "tax", but old i5 CPUs are dirt cheap. H81 motherboards from eBay (link) go for 20-30 quid. The RAM is in the post you replied to (with warranty again). £50-60 for the whole system. How much cheaper do you want it?

lol, that's quite sad. Even cards as ancient as the Radeon HD 7750 can decode H.264, and in 4K too.
Did you try that? I didn't even know that 4K existed when I had a 7000-series Radeon in my PC. Here it says that they only decoded up to 2K, by the way.

But what about the Intel UHD 710? It decodes everything on a budget. Even 4K AV1. It can be found even in a Celeron. That's stupidly affordable. It's a kinda shitty excuse for the RX 6400 not to be able to do as much at nearly 4 times the cost of a Celeron. The GT 1030 is also somewhat superior, as it supports VP9 decoding and has ShadowPlay.
A new Intel CPU + motherboard + RAM combo is too expensive just to watch movies. I thought we talked about this before.

Um, literally the RX 5500 XT is superior to the RX 6500 XT. It performs the same, used to be available at the same or a lower price, had ReLive, had PCIe x8, has VP9 decoding/encoding, and has more outputs. Sure, it doesn't have an AV1 decoder, but when it launched, AV1 was still more in the experimental stage. Even the RX 5300 is superior to the RX 6400, not only smashing it on feature set but offering superior performance in an even lower tier. If AMD just hadn't stopped production of the low-end 5000 series cards, they would have cheap and competitive products. I don't see any reason why the low-end RX 6000 series has to suck so much. This snafu reminds me of the FX launch, when Phenoms beat FXs while being more efficient and cheaper.
Did you visit any 6500 XT review thread by any chance? I was one of the few who said in one of them what a terrible card it was because of its identity crisis: it tries to be low-power, but has a power connector. It tries to be a gaming card, but it's too slow and restricted on PCIe 3.0. It doesn't even have low profile variants, making it useless in SFF cases. It's also too expensive at 200 USD. But that's not the topic here. I like the 6400 because it's the complete opposite: it really is low-power, has low profile options for SFF maniacs (like myself), it isn't trying to be a gaming card, and while it isn't 2018-levels cheap, its price is a bit closer to what it offers than that of the 6500 XT. It's everything the 6500 XT should have been.

I have an i5 10400F and it skips some frames at 4K60 on YT with VP9.
How? Even my Ryzen 3 3100 + RX 6400 HTPC doesn't do that. :eek: (Or maybe the 6400 isn't so useless after all?) ;)
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Did you check the links I sent? Here's the 4460 for 15 GBP with warranty. There is an i7 "tax", but old i5 CPUs are dirt cheap. H81 motherboards from eBay (link) go for 20-30 quid. The RAM is in the post you replied to (with warranty again). £50-60 for the whole system. How much cheaper do you want it?
I mean, outside of the UK.


Did you try that? I didn't even know that 4K existed when I had a 7000-series Radeon in my PC. Here it says that they only decoded up to 2K, by the way.
4K sort of existed; AMD was pushing its Eyefinity technology. It wasn't exactly for video, but for gaming. Here's a blast from the past:

In theory, the card supported 6 displays, but good luck trying to connect them all and then trying to look at all those displays. Anyway, in the NCIX demo it ran 3 displays at 5760x1080, which is 6.2 million pixels; 4K is 8.3 million pixels. The Radeon 5870 could run 6 displays, with its DVIs maxing out at roughly 1440p. So, it seems that in theory the 5870 could run 6 1440p displays. One 1440p display is nearly 3.7 million pixels; multiplied by 6, that ends up being 22.1 million pixels, which is well above 4K. 4K itself became known in the monitor scene 8-9 years ago, i.e., 2013-2014, and the 7750 launched in 2012, so that's close enough. The 7750 had DP 1.2, which supports 4K75 or 5K30, but the card was seemingly limited to outputting only 4K60.
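
If anyone wants to sanity-check that pixel math, here's a quick throwaway calculation (Python, using only the resolutions mentioned above):

```python
# Pixel counts for the resolutions discussed above, in megapixels.
resolutions = {
    "3x 1080p Eyefinity (5760x1080)": (5760, 1080),
    "4K UHD (3840x2160)": (3840, 2160),
    "1440p QHD (2560x1440)": (2560, 1440),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")

# Six 1440p displays, as the 5870 could in theory drive:
w, h = resolutions["1440p QHD (2560x1440)"]
print(f"6x 1440p: {6 * w * h / 1e6:.1f} MP")
```

That prints 6.2, 8.3, 3.7 and 22.1 MP respectively - the same figures as above.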

As far as decoding goes, I guess I will take your word for it. But there's the ancient R9 285 that could do that, and after that, even the RX 460 could do it. On the nV side, it seems the GT 610 could already decode 4K with the H.264 codec. So my point remains the same: 4K decoding is old and should be nothing special. Lately, it has been the norm for cards to be able to decode it.


A new Intel CPU + motherboard + RAM combo is too expensive just to watch movies. I thought we talked about this before.
But you say that buying a 4K-decode-capable card is too expensive for an ancient machine too. A Celeron upgrade is roughly 200 EUR, and there would be a ton of other benefits to doing that, unlike with a card alone.


Did you visit any 6500 XT review thread by any chance? I was one of the few who said in one of them what a terrible card it was for its identity crisis: it tries to be low-power, but has a power connector. It tries to be a gaming card, but it's too slow and restricted on PCI-e 3.0. It doesn't even have low profile variants, making it useless in SFF cases. It's also too expensive for 200 USD. But that's not the topic here. I like the 6400 because it's the complete opposite: it really is low-power, has low profile options for SFF maniacs (like myself), it isn't trying to be a gaming card, and while it isn't 2018 levels cheap, its price is a bit closer to what it offers than that of the 6500 XT. It's everything the 6500 XT should have been.
Yeah, I watched the HWUB review. But crucially, cards like the RX 6400 and 6500 XT fail as HTPC cards due to their nerfed decoding support. You are better off with a Quadro T400, a GT 1030, or just an Alder Lake Celeron with UHD 710. And as for the RX 6400 being fast, wouldn't a GTX 1650 have served you equally well in gaming, a few years earlier and at the same price, without the gimped decoding? It's not like there wasn't an LP GTX 1650 either.


How? Even my Ryzen 3 3100 + RX 6400 HTPC doesn't do that. :eek: (Or maybe the 6400 isn't so useless after all?) ;)
Like this:

[Screenshot: Capture.PNG - playback stats showing dropped frames during 4K60 VP9 video]

That's just a 4K60 VP9 video, not even in full screen mode. I skipped some parts of it myself, but yeah, it dropped some frames. The CPU was loaded to ~70% the whole time too, with some spikes, which led to dropped frames. I tried an 8K demo too, and that just left the CPU pegged at 100% and dropping basically half the frames. 1440p60 usually uses 35-45% CPU, with spikes as high as 75% when manually skipping through a video. While skipping through a video or going fullscreen, frames are dropped. So, while 1440p videos are perfectly watchable, the experience isn't exactly immaculate. For such reasons, I hope you can see why I say that GPU decoding is important: even a pretty modern i5 can't truly ensure a proper 4K video decoding experience in software. Sure, it works fine if you never skip parts of a video or go fullscreen while one is playing, but then there could be some particularly intensive parts of a 4K60 video where even a fast CPU may drop frames. Just pausing and resuming a video leads to big CPU usage spikes and dropped frames. If there was a card with VP9 and AV1 decoders available, I might actually think of getting it, just for the HW decoding.
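
If you want to put an actual number on how far a CPU is from real-time software decode, one rough way is to time ffmpeg decoding a clip to a null sink. A minimal sketch; the clip name is a placeholder for any local 4K60 VP9 .webm, and ffmpeg is assumed to be on PATH:

```python
import subprocess
import time

CLIP = "sample_4k60_vp9.webm"  # placeholder: any local 4K60 VP9 clip

start = time.perf_counter()
# Decode only and discard the output; no -hwaccel flag is passed,
# so ffmpeg uses its software VP9 decoder.
subprocess.run(
    ["ffmpeg", "-v", "error", "-i", CLIP, "-f", "null", "-"],
    check=True,
)
elapsed = time.perf_counter() - start

print(f"Decoded in {elapsed:.1f} s")
# If that's longer than the clip's duration, the CPU can't keep up
# in real time, and a player would be dropping frames.
```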

BTW, if you wonder why the RX 580 doesn't do anything here, it's because AMD said in 2016 that VP9 HW decoding would come with driver updates. It turned out there wasn't any full VP9 decoding hardware in the GPU, and all AMD did was briefly offer hybrid VP9 decoding, which was buggy and didn't work right. So VP9 decoding was soon completely scrapped from the drivers, and Polaris ended up without any proper VP9 decoding. BTW, hybrid VP9 decoding means partial acceleration: the card couldn't do the complete job by itself and would partially offload decoding to the CPU. That sounds exactly as shit as it was. I'm quite surprised that nobody sued AMD for advertising non-existent features.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,345 (6.65/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
I would use the 6400 or 6500 for HTPC, business, or as a diagnostic board.

I had an R7 250X Ghost by XFX from 2016, but it's driving a 5800 rig now.
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I mean, outside of the UK.
Fair enough - I can only talk for the UK market.

4K sort of existed; AMD was pushing its Eyefinity technology. It wasn't exactly for video, but for gaming. Here's a blast from the past:

In theory, the card supported 6 displays, but good luck trying to connect them all and then trying to look at all those displays. Anyway, in the NCIX demo it ran 3 displays at 5760x1080, which is 6.2 million pixels; 4K is 8.3 million pixels. The Radeon 5870 could run 6 displays, with its DVIs maxing out at roughly 1440p. So, it seems that in theory the 5870 could run 6 1440p displays. One 1440p display is nearly 3.7 million pixels; multiplied by 6, that ends up being 22.1 million pixels, which is well above 4K. 4K itself became known in the monitor scene 8-9 years ago, i.e., 2013-2014, and the 7750 launched in 2012, so that's close enough. The 7750 had DP 1.2, which supports 4K75 or 5K30, but the card was seemingly limited to outputting only 4K60.

As far as decoding goes, I guess I will take your word for it. But there's the ancient R9 285 that could do that, and after that, even the RX 460 could do it. On the nV side, it seems the GT 610 could already decode 4K with the H.264 codec. So my point remains the same: 4K decoding is old and should be nothing special. Lately, it has been the norm for cards to be able to decode it.
Yes, 4K "sort of" existed through Eyefinity and other multi-display technologies, but 4K decoding did not (nor did anyone need it).

But you say that buying a 4K-decode-capable card is too expensive for an ancient machine too. A Celeron upgrade is roughly 200 EUR, and there would be a ton of other benefits to doing that, unlike with a card alone.
Well, if you absolutely cannot live without 4K 60 fps YouTube, then you need hardware decoding either in a new CPU or a new graphics card. The 6400 can apparently do it (it has VP9), so there you go. 4K 30 fps works on anything I mentioned above.
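
For anyone unsure what their current stack can already decode in hardware, it's worth checking before spending money. On Linux, the VA-API tool vainfo lists the supported profiles; here's a minimal sketch around it (assumes vainfo is installed - on Windows you'd reach for something like DXVA Checker instead):

```python
import subprocess

# vainfo prints one line per supported VA-API profile/entrypoint.
result = subprocess.run(["vainfo"], capture_output=True, text=True)

for line in result.stdout.splitlines():
    # VAEntrypointVLD lines are the hardware decode profiles.
    if any(codec in line for codec in ("VP9", "AV1", "HEVC", "H264")):
        print(line.strip())
# No VP9/AV1 lines = 4K60 YouTube falls back to CPU decoding.
```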

BTW, if you wonder why the RX 580 doesn't do anything here, it's because AMD said in 2016 that VP9 HW decoding would come with driver updates. It turned out there wasn't any full VP9 decoding hardware in the GPU, and all AMD did was briefly offer hybrid VP9 decoding, which was buggy and didn't work right. So VP9 decoding was soon completely scrapped from the drivers, and Polaris ended up without any proper VP9 decoding. BTW, hybrid VP9 decoding means partial acceleration: the card couldn't do the complete job by itself and would partially offload decoding to the CPU. That sounds exactly as shit as it was. I'm quite surprised that nobody sued AMD for advertising non-existent features.
Wow! Now that's what I'd call a trap! The 6400 is nothing of the sort - it does exactly what AMD say it does. Nothing more, nothing less.
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Yes, 4K "sort of" existed through Eyefinity and other multi-display technologies, but 4K decoding did not (nor did anyone need it).
To be fair, even the Matrox Parhelia had surround capabilities. Imagine 3 CRTs running games in 2002 (*at slideshow framerates).


Well, if you absolutely cannot live without 4K 60 fps YouTube, then you need hardware decoding either in a new CPU or a new graphics card. The 6400 can apparently do it (it has VP9), so there you go. 4K 30 fps works on anything I mentioned above.
Nah, I can live without that, and the CPU does the job, but it just isn't as smooth as one might expect. This wouldn't be a problem at all if YT allowed forcing h264.
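
You can't force it in the player itself, but outside of it, it's doable: browser extensions like h264ify block VP9/AV1 so YT serves AVC, and yt-dlp can grab the h264 stream directly. A rough sketch with yt-dlp's Python API (the URL is a placeholder; the format string prefers avc1 video up to 1080p):

```python
from yt_dlp import YoutubeDL

# Prefer an h264 (avc1) video stream up to 1080p; fall back to best available.
opts = {"format": "bestvideo[vcodec^=avc1][height<=1080]+bestaudio/best"}

with YoutubeDL(opts) as ydl:
    ydl.download(["https://www.youtube.com/watch?v=PLACEHOLDER"])
```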


Wow! Now that's what I'd call a trap! The 6400 is nothing of the sort - it does exactly what AMD say it does. Nothing more, nothing less.
Aye, the RX 580 was a big trap. Not only that, but the hype around Polaris in general was insane. I have to admit that in terms of PR, it was one of AMD's best campaigns, while the actual product ended up sort of okay (original Polaris) or quite poor (the Polaris refresh). It was also quite ironic to see AMD advertising power efficiency when RX 480s burned PCIe slots and the RX 580 consumed as much power as a GTX 1080.

Still, the best thing about Polaris cards is how long-lasting they are. They launched like 5-6 years ago, but they're still fast, fully supported, get updates, got FineWine, and have enough VRAM (it didn't end well for GTX 970 owners with their 3.5 GB, or GTX 1060 3 GB owners). The cards' actual functionality improved over time, too: they got professional drivers, a special compute mode, RIS, Enhanced Sync, Chill, Anti-Lag, integer scaling, a 10-bit pixel format, and AMD Wattman. That's a lot of stuff. Besides that, Polaris and Vega cards were the last without an encrypted vBIOS, so vBIOS mods are super simple and easy. AMD was also very generous with voltage, so these cards are easy to overclock and undervolt. We've also heard a lot about Polaris/Vega capabilities in mining.

Really, besides some lies and stupid hype, Polaris (or better said, GCN) was one of the best releases by AMD; it's definitely up there with the Radeon HD 4870. It also happened to avoid awful QC problems, like the original GCN being poor at any tessellation at all, or the entire RX 5000 series suffering from common voltage/driver malfunctions (does anyone remember the black screen issues? It turned out to be a hardware defect: the voltage was set too low on many cards, and the only fix was sacrificing boost speed or doing some software magic to avoid the super low power states). Despite the TDP reduction and speed cap, my RX 580 still runs games at 1440p at medium-high.

BTW, I actually managed to almost make my computer run 8K30 YouTube. I only needed to use Xubuntu and a script blocker in the web browser, and it was very close to playable. Gotta say, that supersampling looked awesome, and I miss that kind of bitrate; at that point, it really starts to look like lossless video. I did the same with another Athlon X4 machine, but with 4K on a 1080p screen. That didn't end up as great: there was so much resolution that it started to show aliasing :D. Wanna try running 8K60 video on that poor Ryzen? It's literally harder than running Crysis. The funny thing is that the RX 580 most likely could run GTA 5 at normal settings at 8K. But supersampling in games is something entirely different than in videos: the RX 580 struggles to run Colin McRae Rally 2005 with 8x SSAA. The original resolution was 1920x1440, so the actual resolution was 7680x5760. To be fair, it mostly ran well, but smoke kills performance. It looked very nice and sharp.
 
Last edited:
Joined
May 21, 2009
Messages
270 (0.05/day)
Processor AMD Ryzen 5 4600G @4300mhz
Motherboard MSI B550-Pro VC
Cooling Scythe Mugen 5 Black Edition
Memory 16GB DDR4 4133Mhz Dual Channel
Video Card(s) IGP AMD Vega 7 Renoir @2300mhz (8GB Shared memory)
Storage 256GB NVMe PCI-E 3.0 - 6TB HDD - 4TB HDD
Display(s) Samsung SyncMaster T22B350
Software Xubuntu 24.04 LTS x64 + Windows 10 x64
Courtesy of VideoCardz, some possible prices of Arc GPUs have appeared (tagged as rumor):




:)
 

Trov

New Member
Joined
Apr 26, 2022
Messages
5 (0.01/day)
Update on my single-slot RX 6400 for Lenovo ThinkCentre Tiny analysis, now that my XFX card arrived:

The card didn't immediately fit into my ThinkStation P330 Tiny; I had to Dremel a little bit off of the front frame section that the front Wi-Fi antenna sits on. Alternatively, this metal frame piece can simply be removed entirely with just one screw. After that, the XFX RX 6400 fits inside. I don't think the Sapphire model, being about 1-2 cm longer, has any chance of fitting inside the Lenovo Tiny. The PowerColor model appears to be the same length as the XFX version, so it should also fit.

The fan holes on the P330 cover don't line up with the RX 6400's fan, so I will drill more holes later.

My first impression is that the XFX single-slot RX 6400's fan is way louder (probably a good 2-3x louder) at the same RPM and has way more of a 'tone' to it than my Quadro T600/T1000. Unfortunately, it's loud enough that I'm annoyed enough to consider staying with the Quadro as the final choice. The fan kicks in even in games such as Risk of Rain 2 at 1080p. However, it does run about 10 °C cooler than the T1000 in the same game (tested with the P330 top cover off in both cases). I wonder if my particular fan is faulty or if it's just a crappy fan that XFX used. Maybe there is an alternative fan I can jerry-rig in place instead, since the fan can be removed without having to remove the heatsink. It does not appear that I can alter the fan curve of the RX 6400, at least with Afterburner.
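
(For what it's worth, if the card ever lands in a Linux box, the amdgpu driver exposes manual fan control through hwmon even when Windows tools refuse. A minimal sketch, assuming a single GPU at card0 and root privileges - the hwmon folder name varies per system:

```python
from pathlib import Path

# Locate the amdgpu hwmon directory (the name varies, hence the glob).
hwmon = next(Path("/sys/class/drm/card0/device/hwmon").glob("hwmon*"))

(hwmon / "pwm1_enable").write_text("1")   # 1 = manual fan control, 2 = automatic
(hwmon / "pwm1").write_text("100")        # duty cycle, 0-255 (~40% here)

temp_c = int((hwmon / "temp1_input").read_text()) / 1000
print(f"GPU temp: {temp_c:.0f} °C")
```

Writing "2" back to pwm1_enable returns control to the driver's automatic curve.)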

Time Spy will not run for some reason, as the benchmark closes as soon as loading is complete. I wonder if either a recent Time Spy update or the AMD drivers broke it. Will try again in a week.

Since Time Spy isn't working, I haven't done a whole lot of performance testing yet. I can say, though, that at PCIe 3.0, FurMark 1080p is about 5 fps faster on the RX 6400 vs the Quadro T1000, but a few frames slower at 4K.

Unless I can come up with a satisfactory fan solution, I don't think I will want to keep the card, unfortunately. I was hoping for something a little quieter than the T1000, but it's much louder despite running much cooler. Since the RPM and fan size are more or less equal between the two, I don't think this is a case of "the card is much cooler because the fan is working much harder", so I think the noise could be totally solved with a better fan part.
 
Last edited:
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Update on my single-slot RX 6400 for Lenovo ThinkCentre Tiny analysis, now that my XFX card arrived:

The card didn't immediately fit into my ThinkStation P330 Tiny; I had to Dremel a little bit off of the front frame section that the front Wi-Fi antenna sits on. Alternatively, this metal frame piece can simply be removed entirely with just one screw. After that, the XFX RX 6400 fits inside. I don't think the Sapphire model, being about 1-2 cm longer, has any chance of fitting inside the Lenovo Tiny. The PowerColor model appears to be the same length as the XFX version, so it should also fit.

The fan holes on the P330 cover don't line up with the RX 6400's fan, so I will drill more holes later.

My first impression is that the XFX single-slot RX 6400's fan is way louder (probably a good 2-3x louder) at the same RPM and has way more of a 'tone' to it than my Quadro T600/T1000. Unfortunately, it's loud enough that I'm annoyed enough to consider staying with the Quadro as the final choice. The fan kicks in even in games such as Risk of Rain 2 at 1080p. However, it does run about 10 °C cooler than the T1000 in the same game (tested with the P330 top cover off in both cases). I wonder if my particular fan is faulty or if it's just a crappy fan that XFX used. Maybe there is an alternative fan I can jerry-rig in place instead, since the fan can be removed without having to remove the heatsink. It does not appear that I can alter the fan curve of the RX 6400, at least with Afterburner.
That's sad. :( My Sapphire is very quiet. I can barely hear it even when the fan reaches 3500 rpm at 66 °C GPU temp. It's weird because it looks like the same fan that any other low profile 6400 uses.

Time Spy will not run for some reason, as the benchmark closes as soon as loading is complete. I wonder if either a recent Time Spy update or the AMD drivers broke it. Will try again in a week.
Try updating your motherboard BIOS. My 6400 displayed a weird, distorted green boot image on my 4K TV (but only on that - every other TV or monitor was fine) until I updated the BIOS on my TUF A520M.

Since Time Spy isn't working, I haven't done a whole lot of performance testing yet. I can say, though, that at PCIe 3.0, FurMark 1080p is about 5 fps faster on the RX 6400 vs the Quadro T1000, but a few frames slower at 4K.

Unless I can come up with a satisfactory fan solution, I don't think I will want to keep the card, unfortunately. I was hoping for something a little quieter than the T1000, but it's much louder despite running much cooler. Since the RPM and fan size are more or less equal between the two, I don't think this is a case of "the card is much cooler because the fan is working much harder", so I think the noise could be totally solved with a better fan part.
I'm not sure if that's your card's fault, or XFX in general, or if the Sapphire's cooler is really that much better. It does have a flat heatpipe running through the length of it, which I haven't seen on other models.

Still, the best thing about Polaris cards is how long-lasting they are. They launched like 5-6 years ago, but they're still fast, fully supported, get updates, got FineWine, and have enough VRAM (it didn't end well for GTX 970 owners with their 3.5 GB, or GTX 1060 3 GB owners). The cards' actual functionality improved over time, too.
You can say the same about the 1060 6 GB, 1070 (Ti) and 1080 as well. Other than that, I agree. I personally never thought much of the RX 500 series, but looking at their popularity, I have to give them some credit.
 

Trov

New Member
Joined
Apr 26, 2022
Messages
5 (0.01/day)
I'm not sure if that's your card's fault, or XFX in general, or if the Sapphire's cooler is really that much better. It does have a flat heatpipe running through the length of it, which I haven't seen on other models.
The XFX heatsink also has a heatpipe and is made out of skived fins. The Sapphire heatsink is probably a couple cm longer, but I doubt that makes a massive difference.
 
Joined
May 8, 2021
Messages
1,978 (1.52/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
You can say the same about the 1060 6 GB, 1070 (Ti) and 1080 as well. Other than that, I agree. I personally never thought much of the RX 500 series, but looking at their popularity, I have to give them some credit.
I wouldn't say the 1060 aged as well. It was initially a bit faster than the RX 580 8GB, but years later the RX 580 beats it, not to mention that the RX 580 loves Vulkan, where it beats the 1060. And if you need to run any computational software, Polaris cards are a lot faster than Pascal cards. It's so ridiculous that in floating point (double precision) operations, the RX 560 beats the GTX 1060, and the RX 580 beats the GTX 1080 Ti. I managed to take advantage of that in BOINC, but yeah, I know this isn't a particularly interesting thing for the average consumer. Even in single precision FP tasks, Polaris cards beat Pascal cards significantly. In double precision floating point compute, the RX 580 is still faster than the RTX 3070 Ti. That's nuts. The old Vega 64 is still faster than the RTX 3090 Ti. So if you needed an FP64 compute card, Polaris was insanely good. Today, you would need to buy an RTX A6000 or Radeon Pro W6800 to beat a Vega 64 in FP64 compute; just to roughly match it, the minimum spec would be an RTX A4000 or Radeon Pro W6600. If you wanted to help C19 vaccine research via BOINC or Folding@Home, you basically had to have a GCN-based card.
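
Those FP64 claims sound wild, but the arithmetic checks out, because consumer GCN kept a 1:16 FP64 rate while Pascal and Ampere cut it to 1:32 and 1:64. A rough check (the FP32 TFLOPS figures are approximate boost-clock numbers, so treat them as ballpark assumptions):

```python
# (approximate FP32 TFLOPS, FP64 rate relative to FP32)
cards = {
    "RX 580 (GCN, 1:16)":         (6.2,  1 / 16),
    "Vega 64 (GCN, 1:16)":        (12.7, 1 / 16),
    "GTX 1080 Ti (Pascal, 1:32)": (11.3, 1 / 32),
    "RTX 3090 Ti (Ampere, 1:64)": (40.0, 1 / 64),
}

for name, (fp32, rate) in cards.items():
    print(f"{name}: ~{fp32 * rate:.2f} FP64 TFLOPS")

# RX 580 (~0.39) edges out the 1080 Ti (~0.35), and Vega 64 (~0.79)
# beats the 3090 Ti (~0.63), exactly as claimed above.
```
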
You know what? Polaris and Vega cards strongly remind me of Tesla arch cards. They guzzle power like there's no tomorrow, but in terms of processing power, the architecture was very well balanced and just lasted a hell of a long time. A Tesla card like the 8800 GTS lasted at least a good 6 years and was bearable for 8. There just wasn't any architectural flaw (like in Kepler) to make them useless way before their time. GCN is AMD's equivalent of the Tesla arch, but more modern and still relevant today. I honestly couldn't say the same about TeraScale or RDNA; GCN was special. Even on the nVidia side, Fermi and Kepler felt like quite disposable architectures, and Turing and Maxwell just weren't great either. Tesla is still the best arch, with Pascal a close second.
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I wouldn't say the 1060 aged as well. It was initially a bit faster than the RX 580 8GB, but years later the RX 580 beats it, not to mention that the RX 580 loves Vulkan, where it beats the 1060. And if you need to run any computational software, Polaris cards are a lot faster than Pascal cards. It's so ridiculous that in floating point (double precision) operations, the RX 560 beats the GTX 1060, and the RX 580 beats the GTX 1080 Ti. I managed to take advantage of that in BOINC, but yeah, I know this isn't a particularly interesting thing for the average consumer. Even in single precision FP tasks, Polaris cards beat Pascal cards significantly. In double precision floating point compute, the RX 580 is still faster than the RTX 3070 Ti. That's nuts. The old Vega 64 is still faster than the RTX 3090 Ti. So if you needed an FP64 compute card, Polaris was insanely good. Today, you would need to buy an RTX A6000 or Radeon Pro W6800 to beat a Vega 64 in FP64 compute; just to roughly match it, the minimum spec would be an RTX A4000 or Radeon Pro W6600. If you wanted to help C19 vaccine research via BOINC or Folding@Home, you basically had to have a GCN-based card.
You know what? Polaris and Vega cards strongly remind me of Tesla arch cards. They guzzle power like there's no tomorrow, but in terms of processing power, the architecture was very well balanced and just lasted a hell of a long time. A Tesla card like the 8800 GTS lasted at least a good 6 years and was bearable for 8. There just wasn't any architectural flaw (like in Kepler) to make them useless way before their time. GCN is AMD's equivalent of the Tesla arch, but more modern and still relevant today. I honestly couldn't say the same about TeraScale or RDNA; GCN was special. Even on the nVidia side, Fermi and Kepler felt like quite disposable architectures, and Turing and Maxwell just weren't great either. Tesla is still the best arch, with Pascal a close second.
I don't know much about compute (and I don't care, either), so I'll take your word for it.

As for which architecture is better, I somewhat disagree. TeraScale was awesome at the time of release, but newer games killed it. GCN was also great.

On Nvidia's side, I agree with what you said about Kepler and Fermi - they were hot and hungry, but otherwise kind of meh. Maxwell was a huge improvement on them, just like Pascal was on Maxwell. They're both great architectures up to this day, imo. Turing never aimed for more performance over Pascal; it only introduced RT and DLSS. One could call it "Pascal RT" as well. For this, I cannot say that it's bad, because it's not. Just a bit different. Ampere, on the other hand, is nothing more than a dirty trick: Nvidia added FP32 capability to the INT32 cores to say they "doubled" the CUDA cores without actually doubling them. Performance per watt stayed the same, though, so technically it's "Pascal RT v2.0".

The XFX heatsink also has a heatpipe and is made out of skived fins. The Sapphire heatsink is probably a couple cm longer, but I doubt that makes a massive difference.
Probably not. It's strange, though, that the Sapphire runs as cool as the full-height MSi card in the review. I have no issues with noise, either. :confused:
 