That being said, I hope AMD continues strong, because we can't expect a company with ~$7 billion in revenue per quarter to keep fighting a company that is closing in on $40 billion per quarter.
The problem for the average consumer is that Nvidia primarily, and AMD secondarily, proved the average consumer today is willing to pay 2-4 times more for a GPU than 5-10 years ago. That's simply bad. Maybe if Intel fixes its manufacturing it will be in a position to flood the market with cheap 18A graphics cards in the near future. For now, let's see what the xx60 models will have to offer and at what price points. And they need even higher numbers of GPUs to cover the demand for the xx60 models, so let's see.
They've been fighting Intel from this position (relatively small-scale, esp. wrt R&D/marketing budget) for their literal whole existence. Similar w/ GPG vs nVIDIA for a very, very long time. Almost 20 years.
nVIDIA has gotten larger, but they were always operating from an advantage. By advantage, I mean those budgets; not necessarily engineering prowess, especially wrt customer needs.
In fact, sometimes despite the latter (arguably because of the former), which has been successful for them in stringing customers along via marketing. AMD operates on their tech speaking for itself.
That isn't even hyperbole. AMD is primarily an engineering company (as was ATi before it), nVIDIA is a marketing company first (in both advertising and engineering goals), and a technology company second.
We already know how the '60' GPUs will perform: like a 9070 xt but with half the units. It is less than half the size of N48, as it's meant for 'sweet-spot' clocks w/ 20gbps ram, whereas N48 can scale higher.
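To put rough numbers on the bandwidth gap between a half-width part and N48, here's the standard GDDR bandwidth arithmetic. The 128-bit figure for the '60'-class chip is my assumption based on "half the units"; the 20gbps and 256-bit N48 figures are from the discussion above.

```python
# Peak GDDR bandwidth: (bus width in bits / 8) * effective data rate in Gbps.
# Bus widths here are assumptions from the discussion, not confirmed specs.

def bandwidth_gb_s(bus_width_bits: int, effective_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * effective_gbps

# Hypothetical N44-class ('60') card: 128-bit bus at 20 Gbps
print(bandwidth_gb_s(128, 20))  # 320.0 GB/s
# N48-class (9070 xt): 256-bit bus at 20 Gbps
print(bandwidth_gb_s(256, 20))  # 640.0 GB/s
```

That halved bandwidth is why the smaller chip is pinned to 'sweet-spot' clocks rather than scaling the way N48 can.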
Probably for a currently unannounced product (which I don't know will ever see the light of day, but probably). Obviously their goal is a 1440pRT GPU to compete with a 5080 w/ 24GB (maybe w/ 32GB).
The 'problem' is obviously that using a chip that small (and scaling clocks) will likely use a lot of power. OTOH, it will likely be relatively inexpensive; we'll have to see how that works out in mindshare/market.
Both wrt N48 scaling higher and N44 being 'good-enough' for what people want (which I would argue it is not; I think 9070 xt is the baseline you should buy for RT or just buy something cheaper/better for raster).
Similarly, even if a faster N48 is 'ok' for 1440p native RT, it will be questionable wrt 4k up-scaling. It may do those things 'okay', but IMO (just like wrt 5080) it's worth waiting until 3nm if you didn't buy a 4090.
Prices have adjusted some, but wrt AMD I don't think it's that bad. I'll always be the guy saying the 9070 xt should have been $550, but asking $50 more from early RT/FSR4 adopters at this budget RN isn't drastic.
They've slowly been trying to jump from the $300 market to the $400 market at the bottom, and in some ways have to, as going from 8GB to 16GB adds extra cost.
I could imagine N44 being $300-350 at the top, but the thing that really screws N44 is Intel (B570/580).
Intel undercut that market by taking almost zero margin, plus an adequate bus/ram config; it's pretty tough to directly compete on price/perf in the 1080p raster market with a 128-bit bus as more games require >8GB.
Similarly, they want to catch the areas the 7900xt and 7800xt/7900gre sold in (~$600-650 and ~$470-550), but I think the market will reject it for N48 long-term. Right now people are thirsting for 16GB RT cards that aren't $1000.
AMD just doesn't want, for instance, the 9070 to freefall in price like the 7700xt did, as that overlaps their market with a lower chip. Instead, they need to build products toward each of those markets, and likely are.
And likely have, clearly hoping the 9070 doesn't eventually exhibit that drop from ~7800xt pricing (until it is replaced). Whether that's successful, TBD. It likely depends on what nVIDIA does wrt 5070 pricing.
Even if you extrapolate to the future, where perhaps their low-end may be $350-400 for a full chiplet stack (and perhaps $50-100 less with units disabled), that isn't too bad of a hike compared to nVIDIA at $400/500.
Similarly, I expect their stack to scale, and while it perhaps won't be exactly equal in MSRP (although it could be), it's still conceivable it will be in similar markets (not whatever they can charge; unlike a '90').
With AMD likely undercutting what nVIDIA wants to sell for formerly $1200, now $1000 (because that market rejected that price long-term after early-adopters buy and/or supply settles).
Looking back at AMD, we have clear bellwethers at consistent markets. Not only where the 9070 xt is selling, but the 7900xt/xtx (~$600/800 or so). Furthermore, attempted markets (like 6900xt/6950xt/7900xtx MSRP).
Essentially, AMD has to make products for what 7700xt actually sold for (~$350), and not much lower. This is likely where they will aim UDNA as a base. Cut-down chips and cheaper, sure, but that's the meat.
They also need to hit where the 7800xt, GRE, 7900xt, and 7900xtx all sold. The 9070 series is currently covering the GRE/79XT price points, and eventually the first two. UDNA will likely also target these markets and accepted pricing.
So, say something like a stack that is one chiplet stack at ~$350 or so (whatever ASP for 7700xt), a disabled two-chiplet stack for ~$500, another for ~600 or so, and a full double stack for around $800ish.
nVIDIA price-checks the market constantly and adjusts supply, whereas AMD very much builds TO the market and what it will pay. This is very fact-checkable, and something they mention consistently.
My prices are UDNA speculation, but it truly does all make sense (especially if you figure in adjusting RAM amounts on a 256-bit bus to 16/24/32GB etc; maybe disabling chiplets from 2048 to 1792/1536).
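The 16/24/32GB spread on one 256-bit bus falls straight out of GDDR device densities. A minimal sketch of that arithmetic, assuming one device per 32-bit channel (no clamshell); the 3GB density ships in GDDR7 today, while 4GB modules are still speculative:

```python
# How RAM capacity scales on a fixed bus width with different GDDR
# module densities. One device per 32-bit channel is assumed (no
# clamshell); 4 GB modules are speculative at time of writing.

def capacity_gb(bus_width_bits: int, module_gb: int) -> int:
    channels = bus_width_bits // 32  # one GDDR device per 32-bit channel
    return channels * module_gb

for density in (2, 3, 4):
    print(capacity_gb(256, density))  # prints 16, 24, 32
```

Same die, same board layout, three memory tiers; that's why segmenting a single 256-bit UDNA chip across several price points is cheap for AMD to do.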
IMO, this strategy is going to absolutely slaughter nVIDIA if the latter keeps doing what they do, which is overprice and underspec, especially when AMD builds not only to accepted market pricing, but also a spec.
Like 1080p RT (or 1080p RT scaled to 1440p). Or 1440pRT native. Or 1080pRT up-scaled to 4k. Or 1440pRT to 4k scaled. Or 4k native. Unlike nVIDIA with their unbalanced specs and/or planned obsolescence...
ALL CARDS YOU CAN EXPECT AMD TO MAKE AND DO THOSE THINGS W/O COMPROMISE. ALL AT PRICES THAT THE MARKET ACCEPTS. THIS IS WHAT THEY DO. No shenanigans, unlike their competitor.
I wonder if I should sell my 7900XTX while it still has decent value. I don't need the absolute performance if 9070 XT is almost just as fast (and faster in RT which might be relevant in the card's lifetime), but newer and can be made more power efficient.
What do you think?
IMHO, I would just overclock the 7900xtx to wherever it's stable until you're ready for another real upgrade. The 7900xtx is a better overall card (although not by a ton), excluding FSR4.
It all comes down to whether you're a 4k (raster) gamer that plays at 4k and/or 1440p->4k. If you do, keep the 7900xtx. If you're a 1080p/1440p gamer, the 9070xt will run 1440p raster and up-scale RT better.
The only real downside to the 7900xtx is that RT has to up-scale from 1080p->4k, and FSR3 looks... well... not great. The expected market of the 9070 xt is not 4k; it's 1440p. I would argue you could also run 'quality' up-scaling of RT at 1440p on a 7900xtx, but that's the value in the 9070 xt (and FSR4 will look better; although I don't think FSR3 in that scenario is unusable, unlike 1080p->4k). The 7900xtx gets a bad rap.
Its only true failing is that it has to scale RT from 1080p->4k (its intended market) and FSR3 looks bad scaling that much. If they implemented FSR4 (and improve it), and with an OC, it would be a great card.
AMD truly did put those users in a pickle, and it's clearly because they want to start fresh with everyone on the same page of their new arch goals of one level lower raster (or one level higher RT) per market.
Right now that market is 1080p/1440p. 4k users' only real alternative (imho) is a 4090 until either company makes something similar (and cheaper), which they didn't this gen.
Again, I have speculated it is because it would cost ~$1200, and neither wanted to attempt that market long-term given 4080 was unsuccessful at that pricing (and AMD didn't attempt it after 4080 price cuts).
It's also possible they believed anyone who wanted that spec and was willing to pay over $1000 bought a 4090. Both of those explanations, perhaps in conjunction, seem most feasible.
As I've said before, that leaves users like me waiting on cheaper used 4090's and/or another generation. Similar for 7900xtx users, if not just some kind of FSR4 port as a stop-gap.