If AMD really wants to be a disruptor here against the 5070 and 5070 Ti and garner widespread positive feedback from the market, then the 9070 XT should be no more than $549 at launch, in my opinion. The fact they called it a "70" matters for positioning, I think, and $649 or more is just going to be read as the same price-hike strategy Nvidia is running (just not as steep). As far as mindshare is concerned, anything north of $600 is simply not going to win them anything as long as the market can still say RT is worse and FSR is worse. They need to be ultra aggressive, like they were from Zen through Zen 2, to win back marketshare.
Right, I agree. But they ARE launching into an atmosphere where product is scarce (atm), so it's always possible they might try to take advantage, at least short-term. Hopefully not, bc it always screws them.
Ray-tracing at 960p/1080p render resolution (i.e., 1440p 'quality'/4K 'performance' upscaling) and native 4K lows will tell the story of whether this thing even makes sense versus the many other cards already out there rn for those (sometimes different) uses. If it's just another 1440p raster card, who cares? There are lots of those. Even if it can do native 4K okay sometimes, that also exists at a similar price already. They need actually GOOD RT, upscaling IQ, and a minimal perf hit.
And a decent price. Only if they accomplish ALL of those can they slot in at the price the 7900 XT often sells for. If it falters at ANY of those things, it has to be priced like the GRE, imo.
I'm thinking the real thing they could do, and it kind of is ridiculous but kind of isn't, is offer a 32GB, 24 Gbps card with high clocks and 3x 8-pin and call it a 9080. It *feels* like something like that is being floated.
I mean, it (mostly) doesn't need the RAM, as I think most people running 4K are probably going to be upscaling from 1440p, but there are some situations where >16GB could help for native res or LLMs, ntm the extra bandwidth for scaling core clocks higher, which could help 1080p RT and/or 1080p RT->4K upscaling. DGMW, I hope a 24 Gbps, high-clocked card exists with just 16GB, but that's one way to pad margins and make it look like people are getting something for the extra money. Again, it's superfluous, just like 24GB is for the 5080 (these will all be replaced by 18GB/192-bit cards on 3nm, and those make a lot more sense), but it would work as a stop-gap.
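Napkin math on that bandwidth point, assuming N48 keeps the 256-bit bus the rumors point to and 20 Gbps GDDR6 stock: bandwidth is bus width over 8 times the data rate, so 256/8 x 20 = 640 GB/s, versus 256/8 x 24 = 768 GB/s for a 24 Gbps SKU, roughly a 20% bump. That extra headroom is what I mean by bw for higher core clocks; treat the exact figures as back-of-envelope, not confirmed specs.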
That's kind of what they did with the 4890 back in the day (1GB vs 512MB; higher clock potential). Obviously that card was a re-spun 4870 on a higher-performance process, whereas N48 is perhaps already that (vs N32).
It kinda comes down to how high they let each model clock, whether it's bandwidth-starved, how FSR4 performs, and how high the chip itself is capable of clocking (if given, say, 450W+, which yes... I know sounds absurd).
NGL, it would be interesting to see a 32GB N48 and a 24GB 5080 fight in a similar-ish power envelope somewhere in the 375-525W range. I mean, ofc the 5080 would win, but N48 would probably be 5070 Ti competition and potentially no slouch.
The whole question really comes down to whether the chip is 'good enough', and there's just no way to know right now. It still might not be possible to know until someone circumvents the 9070 XT power limit.
Even then, I still have a really difficult time believing they can really fight anything higher than a 5070 (16GB vs 12GB), outside of the novelty of a 32GB card.
If they go head-to-head, or even proportional on price, I just can't see it working; it truly never does. They have to fight one tier down, even when Nvidia (purposely) makes that tough on their margins.
That's JMO, and I know some people think it sounds farcical, but it has literally proven to be the only way people will buy their cards, regardless of whether they 'deserve' that.