
AMD RDNA 4 and Radeon RX 9070 Series Unveiled: $549 & $599

Lies.

View attachment 387214



The 2080 Ti was the first card with a faux MSRP. The actual MSRP slipped out in NV's own presentation with Huang later on.


Elaborate. Assuming this is a thought and not a random for-lulz sentence.

With AMD's roughly ten-times-smaller market share, and top-end cards making up about 10% (or less; for AMD, possibly much less) of total GPU sales, the effect works out to about 1%.
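The back-of-envelope arithmetic above can be sketched as follows; the share and top-card fractions are round illustrative assumptions, not measured figures:

```python
# Sketch of the claim above, using round placeholder numbers.

amd_share = 0.10          # assumed ~10x smaller market share than NVIDIA's
top_card_fraction = 0.10  # top-tier cards as a share of a vendor's GPU sales

# Fraction of the *total* GPU market affected by AMD's top-tier pricing:
affected = amd_share * top_card_fraction
print(f"{affected:.0%} of the total market")  # -> 1% of the total market
```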

You cannot possibly deduce that, given that the "market share" figures come from the only source available at the moment, with who knows what accuracy, and certainly not with fraction-of-a-percent precision.

I meant my country. Not whole EU. :oops:
 
Quote is spot on, that is exactly what you did.
Still nope. Reading comprehension is hard for you eh? At no point did I say anything about the 480.
This wasn't about FEs.
It was about fake MSRP.

This is the first time it happened:

View attachment 387219

Later on, when the next gen hit, Huang referenced the MSRP of the 2080 Ti.
As... wait for it... $1200.

Mkay? :D
Vega 64 wasn't a Founders Edition; it was a bait-and-switch price that, frankly, should have been illegal. AMD marketed the card at $500, then raised the price to $600 because they wanted to.

Take your Ls and move on, Mkay? :slap:
 
Well, it's right there in their video:
View attachment 387218

Timestamped for you as well.

I am on 6000 cards anyway (and prefer to play native) but here you go:


FP8 is the plausible reason.
 
AMD has said they might add FSR4 to RDNA3 later. I'm not expecting them to; less time spent on the previous gen means more time for FSR4 development on RDNA4.
But man, the Nvidia Defense Force is going strong, digging up crap on cards from 10 years ago, seriously? When Nvidia plays the pricing game, they gimp their cards with less bandwidth and VRAM and price them at least $100 too high (and have since at least Turing), but they always get a free pass and buyers buy them anyway.
Oh yes, because AMD never does this *cough cough 7600, 7700xt cough cough*.

You should really drop the "nvidia defense force" thing. Middle schoolers would think that's really cool and epic, but everyone else thinks it's just plain silly. Especially when you level it at people who own AMD cards.
 
Keep smoking the copium. Do you understand what "possible" means? It means "maybe".
I think I do. Do you understand what "might" means? :D

 
Keep smoking the copium. Do you understand what "possible" means? It means "maybe".
No way man, there was a post on Reddit by u/deleted! That means its CONFIRMED! NVDEAD!
 
My issue with the non-XT costing $549 is not even the price per se (though I'd rather it cost $499); it's the fact that it's the exact MSRP of the 5070. So either the 9070 conclusively beats it (because we all know the 5070 will sell on name alone), or it'll tank on the shelves more than it normally would.
The 12GB on the 5070 is a HUGE drawback. If the 9070 can perform at a comparable level in both raster AND RT, the 9070 can succeed (especially taking into account the huge number of issues with the Blackwell series generally).
 
Oh yes, because AMD never does this *cough cough 7600, 7700xt cough cough*.
Which competed with the even worse 4060 and 4060 Ti, what's your point?
I know you and the rest of the geforce buyers want AMD to hand out 9070XT's for free though.
You should really drop the "nvidia defense force" thing. Middle schoolers would think that's really cool and epic, but everyone else thinks it's just plain silly. Especially when you level it at people who own AMD cards.
You say that yet post things like this. I expect the hyperbolic childish BS from reddit, not here.
AMD are saintly, granted unto us to fight against the tyrants, all problems are the result of nGREEDia mindshare, never of AMD's accord

Radeons 420:69
 
$50 too high. This won't gain AMD any new customers.
 
$50 too high. This won't gain AMD any new customers.
IMO, $150 too high. If I wasn't motivated to buy the RTX 4070 Super at $600 or the 7900 GRE for $550 a year ago, why would I want to buy the 9070 XT for $600 now?
 
It's the same pricing game they played last generation, they want you to buy the more expensive parts so they jack up the price of lesser parts to make higher end parts look like a better deal.

This is partly true, but part of what you need to consider is not devaluing older parts such as the 7900 GRE/7800 XT/6800 XT. Had the 9070 been priced more fairly ($400-450 IMO, ~$500 to others), the price on those older GPUs would have fallen through the floor, destroying the new low-end market (9060, etc.). Keeping this level of performance at this price ensures the aftermarket value of the aforementioned cards stays at ~$400 rather than dropping to as low as $300 (on average). To me, that's the apparent reason, given it's essentially a price hike from a 7800 XT to the price of a 7900 GRE (which overall are very similar cards if you factor in overclocking). For this reason, I'm not a fan of this pricing. They could have kept it stagnant, they could have given a small cut, or they could have given a large cut (to compete with the 5060 Ti, which I know sounds a little crazy, but those are the moves AMD really needs to make if they want to gain market share). Instead, the price went up. Some will argue things like new RT and upscaling, but with a card like this, I don't think they'll be strong enough to take advantage of those features to a large extent. You really need a 7900 XT to capitalize on 1440p->4K raster, and this ain't that. We still don't even know if a 9070 XT is 'good enough' for demanding RT, and the lower-end card almost certainly is not. People can buy them and/or argue value for these features on such a product, but I would not.
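The "price ladder" effect described above can be illustrated with a quick performance-per-dollar comparison. The prices are the announced MSRPs; the relative performance numbers are made-up placeholders, not benchmark results:

```python
# Hypothetical perf/$ sketch of the price-ladder effect. The perf index
# values (100, 115) are illustrative assumptions, not measured data.

def perf_per_dollar(perf: float, price: float) -> float:
    """Relative performance delivered per dollar spent."""
    return perf / price

non_xt = perf_per_dollar(100, 549)  # RX 9070, assumed perf index 100
xt     = perf_per_dollar(115, 599)  # RX 9070 XT, assumed ~15% faster

print(f"9070:    {non_xt:.4f} perf/$")
print(f"9070 XT: {xt:.4f} perf/$")

# With only a $50 gap, the XT comes out ahead on perf/$, nudging buyers
# toward the pricier part. Repricing the non-XT at ~$450 inverts the ladder:
print(f"9070 @ $450: {perf_per_dollar(100, 450):.4f} perf/$")
```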

As for the 9070 XT, I'm glad people appear to be mostly content. I think it's again overpriced by a good 10%, but this is at the highest end of acceptable. I could write a whole spiel on why, but I'd prefer to wait for reviews to truly explain it. These are mid-range cards, not high-end cards. To me, $600 is a bit excessive for mid-range, and I do not think these cards will be able to do a lot of the things people are hoping for (which would require a 5080, give or take extra RAM). Had a card at this price come with faster RAM and/or higher power/clock potential, I would think it fair, but it does not. AMD apparently wants to charge even MORE for that, give or take extra RAM it *may* not need, and I think that's a little greedy.

It's possible they wait and the cards launching now drop to more where I think they should be by the time something like that launches.
In that case it would make sense, but I still think they're 'optimizing revenue' on this chip.
I think AMD needs to optimize more people using their cards, but that's JMO.
It's possible this will happen anyway, given nVIDIA's 12GB cards have very severe and apparent limitations (some of which are shared by the 6800 XT/7800 XT/7900 GRE/9070).
I would have preferred they fought them head-on to accentuate these differences, but I guess they figure they can get away with a small premium, which may be true.
 
$599 is a fine price for the XT if AIBs can stay close to that price, stock remains steady, and the performance presented holds up for third-party reviewers.

If you want a non-XT for $499 or less, you already know the drill: pretend it hasn't launched yet and will actually be launched 6 months from now.

N48 is a small die on a mature node, likely with a very low defect rate, and AMD doesn't even have that many 9070 cards produced and ready to go... so why give it an attractive price if it's going to sell out right away and stay out of stock for a long period of time? Better to price it too high to slow down sales, then drop the price once there's enough volume to absorb demand (same as the 7700 XT: its GCD was 200 mm², so an absurdly low defect rate, but why give it a good price when you can't make enough to satisfy demand at that price anyway?).
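The small-die/low-defect reasoning above can be sketched with the simple Poisson yield model, yield = exp(-defect_density × die_area). The defect density here is an illustrative assumption, not a published figure for N48 or TSMC's node:

```python
import math

# Rough die-yield sketch using the simple Poisson yield model.
# The defect density is an assumed placeholder, not vendor data.

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Expected fraction of defect-free dies under the Poisson model."""
    return math.exp(-defects_per_mm2 * area_mm2)

defect_density = 0.0005  # defects per mm^2, assumed for a mature node

small_die = die_yield(200, defect_density)  # ~7700 XT GCD-sized die
large_die = die_yield(600, defect_density)  # a big flagship-class die

print(f"200 mm^2 die yield: {small_die:.1%}")  # small dies: high yield
print(f"600 mm^2 die yield: {large_die:.1%}")  # big dies: noticeably worse
```

The exponential means yield falls off quickly with area, which is why a ~200 mm² die on a mature node wastes very few wafers.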

I've been looking to replace the 980Ti in my steambox for a while and the 9070XT, if done right, might displace my 6800XT in my main rig, which will then flow down and displace the 980Ti in my Steambox. Not likely, but the best chance of that happening in a long time.
 
The 12GB on the 5070 is a HUGE drawback. If the 9070 can perform at a comparable level in both raster AND RT, the 9070 can succeed (especially taking into account the huge number of issues with the Blackwell series generally).
Ehh, the 4070 and 4070 Super sold far better than the competing 16GB AMD cards (7800 XT and 7900 GRE, respectively). 4070-Ti didn't do as well, but that's because it was up against the 20GB 7900 XT (which wasn't popular either), and neither card was in a good place in the market. VRAM is only a concern for enthusiasts, not the general public.

And almost no games actually had a problem with 12GB of VRAM at 1440p. The extra bandwidth of the wider bus was usually what helped AMD cards at 1440p and 4K, and that's addressed with GDDR7: the 5070 has more bandwidth than either RX 9070 card. Plus, DLSS 4 and the Transformer model are more VRAM-friendly, and neural rendering/neural compression save VRAM space as well. I reckon we'll see a 5070 Super with 3GB chips for 18GB of VRAM next year, but I don't think 12GB will hurt the base model in any game at the target resolution.
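The capacity arithmetic here is simple (a sketch, assuming the standard 32-bit-per-module GDDR interface, one module per channel, no clamshell): a 192-bit bus takes six modules, so moving from 2GB to 3GB GDDR7 chips lands at 18GB:

```python
# GDDR6/GDDR7 modules each expose a 32-bit interface, so a card's module
# count is bus_width / 32 (assuming one module per channel, no clamshell).

def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    """Total VRAM in GB for a given bus width and per-module density."""
    return (bus_width_bits // 32) * module_gb

print(vram_gb(192, 2))  # 5070-style 192-bit bus, 2GB modules -> 12
print(vram_gb(192, 3))  # same bus with 3GB GDDR7 modules -> 18
print(vram_gb(256, 2))  # 9070-style 256-bit bus, 2GB modules -> 16
```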
 
The pricing is OK, but only OK. The 9070 should have been $499. OEMs will gouge the shit out of these.
If I can get an XT at MSRP, I'll definitely buy one that has a water block released for it. It would be a decent upgrade from my 6800 XT. But if they're hundreds above MSRP, I won't be touching one, as I'd consider that a sidegrade.
No reference cards means no waterblocks for a while longer, I guess. And then we wait to see which ones get waterblocks, but only the models that sell the most will get them. A Catch-22 thing. And the ones that sell the most will not be the models I'm interested in. And MSRP probably only at launch, then prices go higher. Just lose-lose situations, sigh.
 
Ehh, the 4070 and 4070 Super sold far better than the competing 16GB AMD cards (7800 XT and 7900 GRE, respectively). 4070-Ti didn't do as well, but that's because it was up against the 20GB 7900 XT (which wasn't popular either), and neither card was in a good place in the market. VRAM is only a concern for enthusiasts, not the general public.

And almost no games actually had a problem with 12GB of VRAM at 1440p. The extra bandwidth of the wider bus was usually what helped AMD cards at 1440p and 4K, and that's addressed with GDDR7: the 5070 has more bandwidth than either RX 9070 card. Plus, DLSS 4 and the Transformer model are more VRAM-friendly, and neural rendering/neural compression save VRAM space as well. I reckon we'll see a 5070 Super with 3GB chips for 18GB of VRAM next year, but I don't think 12GB will hurt the base model in any game at the target resolution.
Yes, because feature-wise, AMD was nowhere near Nvidia (DLSS vs FSR, RT, wattage). But this time around, the gap will probably be much smaller.
 
Not optimizing for old cards isn't gimping. Many have claimed Nvidia "gimped" their older cards, and sites like Hardware Canucks have proven this claim wrong many times.

Here's the secret sauce, ready for it? Nvidia consistently puts out an entire stack of cards that have some sort of improvement, even if minor. AMD has not done this since 2011 with the HD 7000 series. The RX 6000s were the first time AMD had a complete stack since GCN 1, and the RX 7000s let their midrange card sit in limbo for 9 months to clear out old inventory, meanwhile the market filled with RTX 4070s.

Consumers value consistency from a brand. It's why they become repeat customers. If a brand becomes inconsistent in their offerings, the consumer trust declines, and the longer this continues, the worse the effect.
On this we disagree. We actually don't, but you're looking at it wrong. It is indeed the secret sauce.

It is partly semantics, but important semantics. Every gen, on every card, nVIDIA holds it back a small amount in [x] way so the next iteration (cheaper for them) looks better. People need to learn the difference.
Sometimes it's compute limitations, sometimes RAM, sometimes features (software) that could run and/or run fine on older products but can't, given the artificial limitations put on older products.

This is literally their business model: to upsell, and to make people desire a constant upgrade, chasing the dragon this practice creates. You can disagree, but you are provably wrong.

AMD does not *typically* do this, and it has hurt them financially (as oftentimes it appears they are re-releasing similar-performing products), but it has also caused people who notice it to appreciate it.

You can't make a product more 'well-matched'; you can only sell it cheaper. You can do what nVIDIA does, and they do sell it the way you interpret it, by literal design, but I think it's scummy.

Again, this is because some people do not understand how the actual products work. They believe the marketing rather than understanding the logistics, which to me is very unfortunate.
 
I am on 6000 cards anyway (and prefer to play native) but here you go:


FP8 is the plausible reason.
I don't think a quick interview that happened last month is more relevant than their actual announcement today.
That's the same as Nvidia saying that the new DLSS MFG might come to Ada/4000 series, so pretty meaningless.

FP8 might be a reason, but the actual lack of dedicated GEMM units makes more sense. Doing this via their WMMA instructions delivers far less performance and may not produce good results.
 
Reputation has a lot of inertia; just look at the number of people on this site who post outdated info or are misled by NV marketing tactics (5070 = 4090?).

Hell we still see AMD=terribad drivers all the time here.

AMD will have to not only catch up to NV in RT and FSR4, but stay caught up for 2-3 generations before the tarnish is polished off their rep on that front.

Even if the 9070 matches the 5070 across the board on everything, people will still buy the 5070 5:1 or 10:1, just because of the reputation.

 
$599 for the XT is amazing (provided there are models sold at this MSRP). $549 for the non-XT, though? Why even bother?
It's there so you buy an XT, quite simply.

Gotta applaud this pricing. Well placed! Might consider a sidegrade here.

Reputation has a lot of inertia; just look at the number of people on this site who post outdated info or are misled by NV marketing tactics (5070 = 4090?).

Hell we still see AMD=terribad drivers all the time here.

AMD will have to not only catch up to NV in RT and FSR4, but stay caught up for 2-3 generations before the tarnish is polished off their rep on that front.

Even if the 9070 matches the 5070 across the board on everything, people will still buy the 5070 5:1 or 10:1, just because of the reputation.

View attachment 387228
The trajectory is there with RDNA, though. We have to be realistic: AMD isn't far off from Nvidia in any way, shape, or form.

- They have an effective architecture
- They have now optimized that and are building a small die on it with high efficiency
- There is a framework for a good upscaler now
- RT performance is in the usable range, to the same degree Nvidia's similar-performing cards are 'considered usable'
- The VRAM/perf/$ offering is relatively a lot better than Nvidia's, which keeps this card relevant at a reasonable price point for longer
- RDNA2-3 were excellent overall in terms of stability and driver quality. I would dare say drivers have shown themselves more stable than Nvidia's clusterfuck of hotfixes and early new-gen woes.

I have to say I was impressed with the tech deep dive on TPU and especially the CU efficiency improvements. This is a big move forward, even if the high end isn't covered.
 
I don't think a quick interview that happened last month is more relevant than their actual announcement today.
That's the same as Nvidia saying that the new DLSS MFG might come to Ada/4000 series, so pretty meaningless.

FP8 might be a reason, but the actual lack of dedicated GEMM units makes more sense. Doing this via their WMMA instructions delivers far less performance and may not produce good results.
If FSR4 ends up coming to my RDNA3 card, I win (even if I rarely, if ever, use FSR), even if the gains over native aren't as noticeable as on RDNA4. If it doesn't, why would I care?
At least this time there's a feasible enough technical reason for the older gens not to support the tech. The 7900 XT and XTX may actually be able to brute-force it, but would it be worth it?
 
Reputation has a lot of inertia; just look at the number of people on this site who post outdated info or are misled by NV marketing tactics (5070 = 4090?).

Hell we still see AMD=terribad drivers all the time here.

AMD will have to not only catch up to NV in RT and FSR4, but stay caught up for 2-3 generations before the tarnish is polished off their rep on that front.

Even if the 9070 matches the 5070 across the board on everything, people will still buy the 5070 5:1 or 10:1, just because of the reputation.

View attachment 387228
Honestly though, if you buy a 12GB 5070 at the same price as a 16GB card that is much faster, you need a doctor's appointment, or you're simply drunk. Giving away a whole tier just like that is ridiculous.

If FSR4 ends up coming to my RDNA3 card, I win (even if I rarely, if ever, use FSR), even if the gains over native aren't as noticeable as on RDNA4. If it doesn't, why would I care?
At least this time there's a feasible enough technical reason for the older gens not to support the tech. The 7900 XT and XTX may actually be able to brute-force it, but would it be worth it?
FSR is still a per-game affair. As long as there isn't a community and modding scene to just plonk that stuff into any game you want, it's still whateverland to me, and I totally ignore it in any purchase decision, just as I did with the previous iterations and DLSS. Even DLSS is still a per-game affair, let's face it. The moment you start 'relying' on it, you've sold your soul to the company that promises to offer it... but needs to keep wasting capital to do so. You don't need to be a rocket scientist to figure out that's a finite thing, not something that'll keep on giving just because you bought a GPU.

It comes down to this: when people sell me progress, it had better be actual progress, not some band-aid for graphics that needs constant TLC to even be available everywhere. Fuck that. You haven't improved jack shit; you're just wasting ungodly resources to get something reasonably playable. It reminds me of just one thing: the Chinese and their supposed node shrinks, which are on DUV and thus not economically viable at all, but hey, they can sell you the lie that they've moved forward. The proof is in the pudding, with all those devs 'relying' on upscalers to get somewhat playable frames on an engine that sucks monkey balls. I'm not paying a single goddamn dime for that clusterfuck. Ever. Devs had better take note, too, because this group won't get smaller as upscalers get ever more fragmented (what version, what games? etc.). A unification effort is on the horizon sooner rather than later, and ironically, the more this gets pushed, the faster it'll happen. That's when I become a believer.

The trajectory of FSR is fast becoming similar to DLSS's, which makes me an instant non-supporter. If you can deploy this tech GPU-agnostically (to a reasonable degree; say, all DX12 feature-level cards, for example), by all means. If not? GTFO: I don't need it, don't want it, won't use it. End of. History only repeats; we've been here before, and we know how it ends.
 
Haven't retailers had these cards since January?
Indeed. Let's hope they have already made and stockpiled enough not to allow for price hiking.
 
The 9070 should have been $60 cheaper ;)
 