Tuesday, August 6th 2024

AMD Readies Radeon RX 7400 and RX 7300 Based on "Navi 33" Silicon

AMD is rumored to be readying two new entry-level desktop GPU models in the Radeon RX 7000 series. These are the RX 7400 and the RX 7300, which would probably succeed the RX 6500 XT and RX 6400, respectively. Perhaps the most interesting aspect of the two is the silicon they're based on. Apparently, AMD is carving both out of its 6 nm "Navi 33," the same chip it uses for its Radeon RX 7600 and RX 7600 XT SKUs.

The "Navi 33" monolithic silicon is based on the RDNA 3 graphics architecture, and has 16 workgroup processors (WGPs), or 32 compute units (CUs), worth 2,048 stream processors, 64 AI accelerators, 32 Ray accelerators, 128 TMUs, and 64 ROPs. The silicon is maxed out in the RX 7600 and RX 7600 XT, and we haven't seen anything to suggest the existence of a desktop RX 7500, which means the RX 7400 and RX 7300 could be heavily cut-down versions of the chip, with AMD reducing not just the CU count, but possibly even the 128-bit GDDR6 memory bus width.
Sources: komachi_ensaka (Twitter), VideoCardz

34 Comments on AMD Readies Radeon RX 7400 and RX 7300 Based on "Navi 33" Silicon

#26
john_
DaemonForceNope, too expensive at any price point.
That's the current reality. 15 years ago, anything 10-20% faster than an iGPU was priced at $40-$60. Today, anything modern (meaning not 2, 3, 5, or 10 generations old) that does more than simply send an image to the monitor starts at over $150. It's a pity that just 5 years ago someone could buy an 8 GB RX 580 at that price, a card closer to mid-range than low-end, while today that money buys a card only somewhat better than the best iGPU. But that's the current reality.
Posted on Reply
#27
DaemonForce
kapone32In your opinion. You see this card would not be used for 4 monitors. Most people only use 2 anyway.
The thing about that. It's "possible" just like my RX 580 but that doesn't quite mean you should do it.
I figured it was enough to drive the pixels on a 1080p144 display and VR LOW. It did great and still does.
The people buying the RX 6400 are price/availability/time locked out of some better feature product.
kapone32You have no idea how RDNA 3.5 will work with older titles.
I know it will have similar growing pains that RDNA3 has right now.
kapone32Then in the specs do you see HDMI 2.1 support for 4K 120Hz. Show me another budget card at that price point that has that.
No. Where the hell do you see 4K120? I don't even see that anywhere and I'm max 1080p144 territory.
Budget cards? Here you go.

See the problem yet? No?

Have fun with more of the Chinese chopping block mystery silicon.
We are effectively going backwards (again).

john_Today anything modern, meaning not 2-3-5-10 generations old, that is better than simply sending image to the monitor, starts at over $150.
Cannibalizing current product from either the consumer or professional space isn't the way to go for these cards, but at some point very soon someone is going to break rank and start doing it. I don't see it happening with the 7000 series, but it's possible and far more likely to come with the 8000 series. I don't like that we got all the way here from such high-grade silicon. They should use these to experiment with MCM layouts and put some serious pressure on the market.
Posted on Reply
#28
Vincero
DaemonForceThere it is. So what can we expect from the RX 7400 if these are the issues for the previous gen+class?
It's not an encoder card which means OBS is out.
That's a lot of assumption. The RX 6400/6500 were based on a dedicated lower-tier IC (Navi 24) which was feature-hobbled from the outset: it had a VCN block, but it was not the same as, or as capable as, the VCN blocks in other Navi 2x products (probably with the expectation of being paired with iGPU solutions that have encoder acceleration; let's not kid ourselves, Navi 24 was a laptop-first part, and the PCIe x4 limitation is for power saving as well as cost reduction).
IF the replacements are using Navi 33 chips with shader/compute and TMU/ROP reductions, there is no reason to automatically assume the VCN block will be crippled or have similar restrictions - disabling encoder/decoder features while leaving others intact is not as straightforward as fusing off additional pipelines (although yes it might be possible to nerf most encoding and leave an intact decoding block).
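For what it's worth, on Linux this is something you can verify rather than assume: `vainfo` (from the libva-utils package) lists the VA-API profiles and entrypoints a GPU driver actually exposes, and encode support shows up as `VAEntrypointEncSlice` entrypoints while decode-only profiles list just `VAEntrypointVLD`. A minimal sketch, with an embedded sample listing for illustration rather than output captured from real Navi hardware:

```shell
# On real hardware you would pipe `vainfo` itself; this sample merely
# mimics the profile/entrypoint listing format for illustration.
sample='VAProfileH264Main            : VAEntrypointVLD
VAProfileH264Main            : VAEntrypointEncSlice
VAProfileHEVCMain            : VAEntrypointVLD'

# Count encode entrypoints; a chip with the encoder fused off would print 0.
printf '%s\n' "$sample" | grep -c 'VAEntrypointEncSlice'   # prints 1 here
```

A Navi 24 card run through the real command is exactly where its decode-only VCN configuration shows up.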

Also, what would this achieve, seeing as Intel is bringing competition? If you're repurposing dies, you generally want to make the fewest core-logic changes possible apart from nuking the bits that don't work.
It's not going to play well with most games at 1080p unless LOW settings.
There may be driver related issues just getting certain antique apps to behave.
It's not something you pick up for mission critical work. It's a last resort option for getting a picture without the stank of a GT1030.
I wouldn't want this in a laptop either.
To be fair, this would probably work as a reasonable option in a laptop that isn't gaming-specific and isn't using a large APU, but instead has a 'big' CPU like a Core i9-13900HX or some such, maybe for the mobile workstation crowd, branded as a Radeon Pro part on the stable driver branch. It's not gonna beat the best of Nvidia's mobile Quadro RTX options, but it doesn't really need to.

The other points depend a lot on use cases - it would probably be fine for most esport crap at 1080p.
Posted on Reply
#29
kapone32
DaemonForceThe thing about that. It's "possible" just like my RX 580 but that doesn't quite mean you should do it.
I figured it was enough to drive the pixels on a 1080p144 display and VR LOW. It did great and still does.
The people buying the RX 6400 are price/availability/time locked out of some better feature product.

I know it will have similar growing pains that RDNA3 has right now.

No. Where the hell do you see 4K120? I don't even see that anywhere and I'm max 1080p144 territory.
Budget cards? Here you go.

See the problem yet? No?

Have fun with more of the Chinese chopping block mystery silicon.
We are effectively going backwards (again).

Cannibalizing current product from either consumer or professional space isn't the way to go for these cards but at some point very soon someone is going to break rank and start doing it. I don't see it happening with 7000 series but it's possible and far more likely to come of 8000 series. I don't like how we get all the way here from such high grade silicon. They should use these to experiment with MCM layouts and put some serious pressure on the market.
I know that because I have a 6500 XT and a 4K 120 Hz VRR TV, but here is some food for thought: if you want to use an APU, get an ASRock board, as the HDMI port on those AMD boards supports 120 Hz. Where it matters is FreeSync for tear-free 1080p gaming.

comprehensiveco.com/introducing-hdmi-2-1/
Posted on Reply
#30
Mr. Perfect
_roman_A waste of sand. Even older games struggle with less "gaming power" than a Radeon 6600 XT in WQHD.

For quite a while I have had the Powercolor Radeon 7800 XT Hellhound - basically an entry gaming card for WQHD.
Oh yeah, these are definitely not for WQHD. 1080p at most, possibly at minimum settings too. Should work for the esports titles, I suppose.
Posted on Reply
#31
GhostRyder
If they make some decent ones in the single slot/low profile form factor I can see these being interesting.
Posted on Reply
#32
Lew Zealand
TheinsanegamerNBy "attacked" you mean "rightfully criticized" your lord and savior AMD for releasing a card that underperformed vs 4-year-old Nvidia tech. Immediately after AMD played up how 8GB was obsolete, they then released a 4GB garbage card that was missing functionality and was effectively useless.

Nobody has done so, stop lying. Not everything has to be about how AMD good, nvidia Bad.

TheinsanegamerNHmmm...nope. 1630 was released and mocked, of course not as badly as the RX 6400, since it didn't suck as much.
Your "lord and savior" Nvidia's 1630 is a much worse card than the 6400, which has 60% higher performance.
TheinsanegamerNDoes the GT 1030 just not exist in your reality or something?
A 7-year-old card with nearly 4-generation-old tech that's no faster than an AMD card 5 years older still? In what reality is that useful or even relevant?
Posted on Reply
#33
sLowEnd
TheinsanegamerNHmmm...nope. 1630 was released and mocked, of course not as badly as the rx 6400, since it didnt suck as much.
How'd the 1630 not suck as much? It performs a lot worse, doesn't have a power consumption advantage, and was barely cheaper.
Posted on Reply
#34
Minus Infinity
I would prefer to use my long-since-sold GTX 1070 or the slightly more modern 5700.
Posted on Reply