Tuesday, September 22nd 2020
AMD Radeon "Navy Flounder" Features 40CU, 192-bit GDDR6 Memory
AMD uses offbeat codenames such as "Great Horned Owl," "Sienna Cichlid," and "Navy Flounder" to internally identify the sources of leaks. One such upcoming product, codenamed "Navy Flounder," is shaping up to be a possible successor to the RX 5500 XT, the company's segment-leading 1080p product. According to ROCm compute code fished out by stblr on Reddit, this GPU is configured with 40 compute units, up from 22 on the RX 5500 XT, and a 192-bit wide GDDR6 memory interface, up from the 128-bit interface of the RX 5500 XT.
Assuming the RDNA2 compute unit on next-gen Radeon RX graphics processors has the same 64 stream processors per CU as RDNA, we're looking at 2,560 stream processors for "Navy Flounder," compared to 80 CUs (5,120 stream processors) on "Sienna Cichlid." The 192-bit wide memory interface gives AMD's product managers a high degree of segmentation for graphics cards under the $250 mark.
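The figures above come out of a simple multiplication; here is a minimal Python sketch of that arithmetic, assuming the same 64 stream processors per CU as RDNA 1 (the same assumption the paragraph makes):

```python
# Back-of-the-envelope math for the CU -> stream processor figures above,
# assuming RDNA2 keeps RDNA 1's 64 stream processors per compute unit.
SP_PER_CU = 64  # RDNA 1 figure, carried over here as an assumption

cu_counts = {
    "Navy Flounder":  40,  # per the ROCm code leak
    "Sienna Cichlid": 80,  # per earlier leaks
    "RX 5500 XT":     22,  # shipping RDNA 1 part, for reference
}

for name, cus in cu_counts.items():
    print(f"{name}: {cus} CU x {SP_PER_CU} = {cus * SP_PER_CU:,} stream processors")
```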
Sources: VideoCardz, stblr (Reddit)
135 Comments on AMD Radeon "Navy Flounder" Features 40CU, 192-bit GDDR6 Memory
The issue comes with the leaks also saying that this will happen at 150W (170W in other leaks), down from 225W for a stock 5700 XT and more like 280W for one operating at ~2GHz. Given that power draw on the same node increases more than linearly as clock speeds rise, that would require a massive architectural and node efficiency improvement on top of significant tweaks to the node just to reach those clock speeds at all. This is where the "this isn't going to happen" perspective comes in: the likelihood of both of these things coming true at the same time is so small as to make it effectively impossible.
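To illustrate the super-linear point, here is a rough Python sketch using the common first-order dynamic power model P ∝ f·V², with voltage assumed to scale roughly linearly with frequency near the top of the V/f curve (so P ∝ f³); the baseline loosely matches the stock 5700 XT figures above, and every number here is an illustrative assumption, not a measurement:

```python
# Illustrative only: first-order dynamic power model P ∝ f * V^2, with V
# assumed to scale roughly linearly with f near the top of the V/f curve,
# which collapses to P ∝ f^3. Baseline ~1.9 GHz / 225 W loosely matches a
# stock 5700 XT; none of these numbers are measurements.
BASE_CLOCK_GHZ = 1.9
BASE_POWER_W = 225.0

for target_ghz in (2.0, 2.1, 2.2):
    scale = target_ghz / BASE_CLOCK_GHZ
    est_power_w = BASE_POWER_W * scale ** 3  # same node, same architecture
    print(f"~{target_ghz:.1f} GHz -> roughly {est_power_w:.0f} W, all else equal")
```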
And remember, these things stack, so we're not talking about the 30-50% numbers you're mentioning here (that's clock speed alone), we're talking an outright >100% increase in perf/W if the rumored numbers are all true. That, as I have said repeatedly, is completely unprecedented in modern silicon manufacturing. I have no problem believing AMD's promised "up to 50%" perf/W increase (especially given that they didn't specify the comparison, so it might be between the least efficient RDNA 1 GPU, the 5700 XT, and an ultra-efficient RDNA 2 SKU similar to the 5600 XT). But even a sustained 50% improvement would be extremely impressive and would far surpass what can typically be expected without a node improvement. Remember, even Maxwell only beat Kepler by ~50% perf/W, so if AMD manages to match that it would be one hell of an achievement. Doubling it is out of the question. I would be very, very happy if AMD managed a 50% overall improvement, but even 30-40% would be very, very good.
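A quick worked example of how those numbers stack, in Python, using only the rumor figures quoted above (225W stock 5700 XT, 150W target, 30-50% performance gain); purely illustrative:

```python
# How the rumored gains stack into a perf/W figure:
# perf/W ratio = (new perf / old perf) * (old power / new power).
# Inputs are the rumor numbers quoted in this thread, not confirmed specs.
OLD_POWER_W, NEW_POWER_W = 225.0, 150.0  # stock 5700 XT vs. rumored RDNA2

for perf_gain in (0.30, 0.50):  # the 30-50% performance range mentioned above
    ratio = (1.0 + perf_gain) * OLD_POWER_W / NEW_POWER_W
    print(f"+{perf_gain:.0%} perf at {NEW_POWER_W:.0f} W -> "
          f"+{ratio - 1.0:.0%} perf/W")
```

With those inputs, even the low end of the range (+30% performance at 150W) works out to roughly +95% perf/W, which is why the combination of claims looks so implausible.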
Anyway, seeing how rushed this launch feels on Nvidia's part, I don't think their marketing just got dumb all of a sudden; I think they have better info than us and they feel some pressure. I do not think that pressure comes from consoles, because $400 consoles do not compete with $800 graphics cards, so I think the pressure must come from RDNA2. But the numbers we've seen in the past 5 days really look too good to be true; it's not even a hype train anymore, it's a hype jet.
But there's very little availability of the cards anywhere, and I can think of only two reasons for that: either they launched in a big hurry without building up any stock (there must've been like 20k total cards worldwide), or their yields are much lower than expected. But I feel they should've known how the yields were since August, at least.
A big part of the draw of these things is having an item no one else has. This plays into NVidia's marketing strategy, and is overall beneficial to NVidia IMO. It's how you market luxury goods. If anything, AMD should learn from NVidia and work towards that kind of marketing. If everyone has a thing, it isn't a luxury anymore. It's just normal.
AMD, for better or worse, seems to be using an inferior-goods strategy. IMO, that diminishes the brand a bit, but it does make AMD's stuff a bit more personable. I don't believe that the average Joe buys a $799 GPU, and seeing AMD consistently release stuff in the $150 to $400 market is laudable (especially because NVidia seems to ignore that market). The argument AMD makes is almost always price/performance, but that just solidifies the idea of "inferior goods" in people's minds. It's subconscious, but that's the effect.