
AMD Radeon RX 9070 Series Technical Deep Dive

Also, why bring up the 1070? It's not 2016 anymore, you have to think in terms of what you can buy now.
Are we factoring in diminishing returns?! Maybe comparing to something like neural rendering, whenever that rolls out, would be more apt? It's relatively toward the beginning of its technological arc (curve?!).

This is where I think Musk comes unstuck.
 
9070 XT vs 5070 Ti. That's it. What other comparison do you want in this segment?


So tell me, where is the 9000 series flagship?

Why are you attacking AMD for being greedy when you have Nvidia with a similar card for $150 more?

Also, why bring up the 1070? It's not 2016 anymore, you have to think in terms of what you can buy now.

So you tell me names are arbitrary, but you're comparing arbitrary namings?!? AMD just changed their naming and I guess it worked; it fools you, at least.
How does that make sense? You can only compare things if you have a metric to do it; comparing to the lowest-end or the fastest card eliminates the naming problem.

I'm not attacking AMD; it's even worse with Nvidia if you use the same comparison. What are you talking about?

What 9000 flagship, again with the naming?

Don't we use performance percentages here to compare cards? Why are we using names? Should TPU stop using performance percentages and compare the x's and 9's in the names instead?

Sure, let's not compare things unless they were made in the same year. What kind of logic is that?
 
How do you make a fair price-vs-performance comparison, then?

This is the only thing that makes sense to compare, you can't run away from it:
The 1070 Ti, adjusted for inflation, cost $600 back in 2016, but it was 33% slower than the flagship card. The 9070 XT costs the same $600 but it's 70% SLOWER than the flagship. You're all just normalising greed. It's like the sheep thanking the wolf for being eaten.

You're now paying more and getting much less than in 2016.
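For what it's worth, that inflation adjustment is just a CPI ratio. A minimal sketch, with approximate CPI values (assumptions for illustration, not official figures):

```python
# Inflation adjustment as used above: scale an old MSRP by a CPI ratio.
# CPI values are rough assumptions for illustration, not official figures.
CPI_THEN = 245.0   # assumed US CPI-U around the 1070 Ti's launch
CPI_NOW = 318.0    # assumed US CPI-U today

def adjust_for_inflation(price_then: float) -> float:
    """Convert a past price into today's dollars via the CPI ratio."""
    return price_then * (CPI_NOW / CPI_THEN)

# The GTX 1070 Ti launched at a $449 MSRP; adjusted, it lands near the $600 cited.
print(round(adjust_for_inflation(449)))  # ~583
```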

Again... compare the die sizes mate.
You're getting 43 square mm MORE than you did in 2016, for the same 600 USD.

You need to quickly get your head examined at this point. If you start comparing, do it right.
 
Don't make me do the math... It's not a justification for anything, and who said I'm buying?

Markets are what they are, at best we can try to nudge people to make a smarter choice. But you and I both know it doesn't work like that, for most.


Exactly. Different time, different market, the comparison doesn't really make sense. There was no x90.
There were the Titan cards, though.
I think you need a coffee. AMD gave us major progress on perf/$ here.
Hard disagree. "Major" would be >40% performance over last gen at the same price, or at least >30% plus some extra hardware. The 9070 XT's price is 20% higher than the 7800 XT's and its rumored performance is 50% higher, which brings the price/perf improvement down to 25%. The 9070 is even worse. In almost any other generation this would just be a regular release.

Even the 7800 XT got better price/perf from dropping the price $150 from the 6800 XT, this isn't going to move the needle in AMD's favor by a substantial degree.
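To make the arithmetic explicit, a quick sketch of the perf-per-dollar math (the +50% is the rumored figure, not a measurement):

```python
# Perf-per-dollar change from a price ratio and a performance ratio.
# 1.20 = +20% price (7800 XT -> 9070 XT), 1.50 = rumored +50% performance.
def perf_per_dollar_gain(price_ratio: float, perf_ratio: float) -> float:
    """Returns the relative perf/$ improvement, e.g. 0.25 == +25%."""
    return perf_ratio / price_ratio - 1

print(f"{perf_per_dollar_gain(1.20, 1.50):.0%}")  # 25%
```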
The 9070XT is a BETTER offering than what Pascal gave us ;) AMD just gives you 43 mm2 for free.
Nice bait.
 
No, it's not. The x90 is a new entry in Nvidia's lineup, and they even added an SKU to cover the gap and the increased disparity in shader counts between the top and bottom end. In other words, since Ampere we've gotten an x103 SKU in addition to the x102, and the shader counts are miles apart from the x104, a gap that was widened substantially since then, and widened further with Blackwell.
The GTX 280 was similar, but it was an 80-class GPU. Not even a Ti, a 90, or a Titan!
 
The GTX 280 was similar, but it was an 80-class GPU. Not a Ti, a 90, or a Titan!
Yes, and still about 40% smaller than the current 5090 that, according to some, we should compare everything under the sun with today.
So that still doesn't really work out well for saying the x90 we're looking at today is somehow business as usual. It clearly isn't; it's one of the largest GeForces ever.

But you really don't need the flagship comparisons here; they never work out. Are we going to spend half a page including the 690 as well, which was actually just 2x GK104? Come on.

It's really bloody simple. The (fair) cost of a chip is directly related to its DIE SIZE. So compare similarly sized dies on price/perf adjusted for inflation. And then... @Bomby569 really just confirmed the 9070 XT is a great offer.
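As a rough illustration of why die size dominates fair chip cost: larger dies mean fewer candidates per wafer and worse yields, so cost per good die climbs faster than area. A toy model; the wafer price and defect density are made-up assumptions, not foundry figures:

```python
import math

# Toy model: cost per good die from die size, wafer price, and defect density.
# All numbers are illustrative assumptions.
WAFER_COST_USD = 17000   # assumed price of a leading-edge 300mm wafer
DEFECT_DENSITY = 0.1     # assumed defects per cm^2

def dies_per_wafer(die_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Crude gross-die estimate: wafer area over die area, minus edge loss."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_mm2)
    return int(wafer_area / die_mm2 - edge_loss)

def yield_fraction(die_mm2: float, d0: float = DEFECT_DENSITY) -> float:
    """Classic Poisson yield model: exp(-expected defects per die)."""
    return math.exp(-d0 * die_mm2 / 100)  # /100 converts mm^2 to cm^2

def cost_per_good_die(die_mm2: float) -> float:
    return WAFER_COST_USD / (dies_per_wafer(die_mm2) * yield_fraction(die_mm2))

for size in (294, 357, 601, 750):
    print(f"{size} mm^2 -> ~${cost_per_good_die(size):.0f} per good die")
```

Roughly doubling the die area here more than doubles the cost per good die, which is the whole point of comparing similarly sized dies.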

The end.

Nice bait.
It's not, though. Or are you saying that a 4nm 357mm² die is cheaper to make than a 16nm 314mm² die?

It's brutally simple... the current offer is much better.
 
So you tell me names are arbitrary, but you're comparing arbitrary namings?!? AMD just changed their naming and I guess it worked; it fools you, at least.
How does that make sense? You can only compare things if you have a metric to do it; comparing to the lowest-end or the fastest card eliminates the naming problem.
I have a metric. Price and performance.

I'm not attacking AMD; it's even worse with Nvidia if you use the same comparison. What are you talking about?
Then let's at least give AMD a point on the fact that they didn't pull another Nvidia -$50 move. There's no need to be overly negative about everything.

What 9000 flagship, again with the naming?
No. Performance and price.

Don't we use performance percentages here to compare cards? Why are we using names? Should TPU stop using performance percentages and compare the x's and 9's in the names instead?

Sure, let's not compare things unless they were made in the same year. What kind of logic is that?
Let's use performance percentages, then. The MSRP of the 7900 GRE (which the 9070 XT is replacing) was $549. The 5090 performs at 229% of it, and 229% of $549 is about $1,257. There's your comparison to the "flagship" (if there is such a thing).
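Spelled out:

```python
# Scale the 7900 GRE's MSRP by the 5090's relative performance.
gre_msrp = 549
relative_perf = 2.29  # 5090 at 229% of the 7900 GRE's performance
print(f"${gre_msrp * relative_perf:,.0f}")  # $1,257
```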
 
It's not, though. Or are you saying that a 4nm 357mm² die is cheaper to make than a 16nm 314mm² die?

It's brutally simple... the current offer is much better.
I understand what you're saying, but it's also a comparison across brands. If we went by that logic the A770 is a killer deal.
 
I understand what you're saying, but it's also a comparison across brands. If we went by that logic the A770 is a killer deal.
IF the A770 also had performance and stability comparable to the other two brands per square millimetre of die space, then it would be.

That is also what Intel counted on when they released it.
Now look at the B580.
It's in a better state, and it immediately received a markup because, indeed, for its die size (and VRAM) it's a fantastic offer, and it also seems to perform more comparably to the rest.

The performance and stability between Nvidia and AMD are perfectly comparable, though. You should take a trip through the GPU database comparing various GPU die sizes over time. I think you might be surprised at how small dies used to be some 10-12 years ago. Anything over 400mm² was exceptionally large. x70 generally didn't exceed 300mm².

The GTX 670, for example, was a mere 294mm²! AND it had 15% of that disabled on top.
 
IF the A770 also had performance and stability comparable to the other two brands per square millimetre of die space, then it would be.

That is also what Intel counted on when they released it.
Now look at the B580.
It's in a better state, and it immediately received a markup because, indeed, for its die size (and VRAM) it's a fantastic offer, and it also seems to perform more comparably to the rest.

The performance and stability between Nvidia and AMD are perfectly comparable, though. You should take a trip through the GPU database comparing various GPU die sizes over time. I think you might be surprised at how small dies used to be some 10-12 years ago. Anything over 400mm² was exceptionally large. x70 was built on something not even an x60 can run on today.

The GTX 670, for example, was a mere 294mm²! AND it had 15% of that disabled on top.
Ehh... I dunno, the 980 Ti was 601mm²... kinda feel shortchanged by the 9070 XT now, actually...
 
So you tell me names are arbitrary, but you're comparing arbitrary namings?!? AMD just changed their naming and I guess it worked; it fools you, at least.
How does that make sense? You can only compare things if you have a metric to do it; comparing to the lowest-end or the fastest card eliminates the naming problem.

I'm not attacking AMD; it's even worse with Nvidia if you use the same comparison. What are you talking about?

What 9000 flagship, again with the naming?

Don't we use performance percentages here to compare cards? Why are we using names? Should TPU stop using performance percentages and compare the x's and 9's in the names instead?

Sure, let's not compare things unless they were made in the same year. What kind of logic is that?
I'm not even sure what you are trying to say by filling up this thread with your "logic". What I do know is that the GPU in your system is about four years old, still costs more than these 9070s, and performs about 50% worse, and you are doing your best to cry about it.
 
They've done it twice now (7800 vs 7700), but I don't understand why they're doing these $50 price gaps. It's not a meaningful difference, and nobody is going to buy the lower card at that price. Just drop it by another $50. You're going to have to within a month anyway, so why let everyone's reviews say it's a bad value first?
I think part of the idea is that it puts a fairly tight price ceiling on the 9070 SKUs, and allows for a wide range of higher MSRP AIB cards on the XT. Bit of a price ladder perhaps. $550 "entry point" becomes $600 "might as well for the extra juice" becomes $750+ Strix or whatever because you might as well invest in a premium card for the better/quieter cooler and sundry blinkenlights.

Curious to see how supply plays out. If there's a lot more supply of the 9070 and the 5070/5070 Ti remain inflated and unavailable, I could see plenty of new builders taking that option, especially if there are decent in-stock options at, or at least near, $550.
 
The 5070 Ti is cut down, though. Only 70 of 84 SMs are enabled on GB203.

The 5080 would be the direct comparison based on die size alone (both fully enabled, at similar mm²).
Oh... I thought it was the other way around, with the 5080 being a cut-down 5090... then that's another L for AMD, and they should have priced it even lower.
If I'm not mistaken, last time it was the RX 6800.

[Chart: performance per watt, 2560×1440]
That's only perf per watt
 
Oh... I thought it was the other way around, with the 5080 being a cut-down 5090... then that's another L for AMD, and they should have priced it even lower.

That's only perf per watt
AMD should have priced the 9070 XT below $600 because the 5080, which is $1,000, has a similar die size? Are you serious? :kookoo:
 
So many AMD and Nvidia fanboys here crying over "spilled milk"! You're all acting like children.

Nobody cares about market share or ANYTHING! Ordinary gamers buy what is best for their purchasing power. That's it!
They buy AMD or Nvidia depending on their real needs, not your hypothetical "X is better than Y, so I'm bigger, better, stronger than you" kind of mentality.
 
So many AMD and Nvidia fanboys here crying over "spilled milk"! You're all acting like children.

Nobody cares about market share or ANYTHING! Ordinary gamers buy what is best for their purchasing power. That's it!
They buy AMD or Nvidia depending on their real needs, not your hypothetical "X is better than Y, so I'm bigger, better, stronger than you" kind of mentality.
This ^^
 
So many AMD and Nvidia fanboys here crying over "spilled milk"! You're all acting like children.

Nobody cares about market share or ANYTHING! Ordinary gamers buy what is best for their purchasing power. That's it!
They buy AMD or Nvidia depending on their real needs, not your hypothetical "X is better than Y, so I'm bigger, better, stronger than you" kind of mentality.
Real people (outside of tech enthusiast circles) buy whatever's available at their price point, or just anything Nvidia because "it's the way it's meant to be played" or something.

You're right, nanometres, square millimetres, transistors and all that technobabble doesn't matter to them.
 
Ehh... I dunno, the 980 Ti was 601mm²... kinda feel shortchanged by the 9070 XT now, actually...
Can you elaborate?
The 980 Ti wasn't $599... and it sure wouldn't be today, adjusted for inflation, either. The MSRP was $649, nine and a half years ago. Additionally, it was the last generation on a very mature node (28nm), so yields were exceptional and it was a lot cheaper to make than near-cutting-edge 4nm.
 
As for performance, looking at the graphs AMD put out, and accounting for PR gimmicks, we give the RX 9070 XT a realistic chance at competing with the GeForce RTX 5070 series, slotting in somewhere between the RTX 5070 Ti and RTX 5070, while the RX 9070 could perform close to the RTX 5070 and the upcoming RTX 5060 Ti.

This reads like a very pessimistic scenario. Let's hope it's more like the 9070 XT very close to the 5070 Ti and the 9070 clearly above the 5070.
 
Die sizes compared:

[Chart: GPU die sizes]

Now justify it away for me based on evidence, go ahead.
GB202 is clearly lonely at the top, right? Not really a great point of comparison; it sits up there together with TU102, which was also priced rather high. It's not a justification for anyone or anything. You misread what I'm saying... I'm simply explaining why things get priced as they do/did.

You are free to think whatever you want, though. I do think emotions cloud your judgment here. The 9070 XT is priced fine and definitely along the lines most people expected and preferred. Similarly, in the current market it offers good bang for the buck.
 
All I read is that it's ~20% faster than the 7900 GRE. With that card averaging around 68 fps at 4K (based on e.g. this review), the 9070 XT lands at approximately an 80 fps average. Bring in FSR and I don't see many people needing anything beyond that card. So I believe AMD made the right decision, and hopefully it sticks to it and pushes prices down.

What I hope even more is that this will give us some nice mid-range cards later with the 9060 series - AT SENSIBLE PRICES.

I have a wide 3440x1440 screen (which is roughly 40% fewer pixels than 4K), and I am good with 60 fps averages and dips to 30-40 fps. So if we get 50% cheaper cards with 50% lower performance, or thereabouts, we're looking at $250-300 mainstream cards that can run 1440p and wide 3440x1440 screens at acceptable performance BEFORE counting on FSR and related tricks.
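A quick sanity check on those numbers (pixel counts only; real scaling is rarely perfectly linear in pixels):

```python
# Pixel counts: ultrawide 1440p vs 4K.
uw = 3440 * 1440    # 4,953,600 pixels
uhd = 3840 * 2160   # 8,294,400 pixels
print(f"ultrawide has {1 - uw / uhd:.0%} fewer pixels than 4K")  # ~40%

# The ~20% uplift over the 7900 GRE's ~68 fps 4K average:
print(f"{68 * 1.20:.0f} fps")  # ~82 fps, roughly the 80 fps figure above
```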

I am literally praying inside that this happens... That would be the first sanely priced midrange card since something like the GTX 1660 series (from 2019!).

Am I overly optimistic? Probably :-/ Could it still happen? Damn, I hope so!

Edit: typos, lol, probably more left over, sorry, phone typing
 
TL;DR - The short-lived greed will seriously hurt sales of the 9070 throughout the entire sales lifespan of the GPU, even when the price drops.
One of these cards will be the popular performer, and the other needs to be a razor-thin-margin product or a loss leader.
That's how I would price for market share. We'll see insane corrections if there's a panic, and that's likely. Be wary of it.
I'm not sure what the encoder has to do with my argument that the vanilla 5070 is overpriced.
I get that you're not a creator, but hardware encoding, especially the formats that matter right now and going forward, is a massive selling point.
If this encoder stacks up against or topples the 5000 series, that's GG no re for a very LARGE and anxious creator/editor market.
On GPUs the encoder hardware is often independent of the core. Sometimes there are calls for CUDA power, or maybe ROCm, here.

There is a lesser SKU. The 9060 is scheduled to ship next quarter. If it replaces the terrible, aging RX 6400s, that's compact compute too.
The rumor that the vanilla 9070 is missing some encode capability, like AV1, might hinder its sales further. DOA if true. Not sure, but we'll see.
I think at best it's just good. Does a "good" product drive market share for a company with ZERO mindshare outside of hardcore tech forums?
I mentioned yesterday that it's rough. Either one near $600 is DOA, but there are so many moving parts in this market that could make it worse.
We may be staring down a panic-buyer situation over missing/mispriced GRE cards and similar. Storage war or not, they will fly off the shelves.
what they call the "off-brand" of Radeon only undercuts Nvidia by $150 for a comparable GPU: if you have an Nvidia card in your system, you only have yourself to blame.
You absolutely have NOT been paying attention to the market in recent months. Price discovery is completely fuxx0red.
When you try to find consistent prices between countries, it's like capital controls are in effect. These markets are all different.
This is what happened to my plans of trying to pick up another 7900 XT.
No chance: the GRE has been MIA, and some 7800 XT units went for $1,000+.
I'm a big fan of shiny, quiet, affordable, and overclockable. I'll find it too.
Of course, unless it's a 4090/5090, you use CUDA, or some other reason prevents you from going AMD. Then by all means, choose Nvidia.
Fffhahahahaha~!
The 9070 XT costs the same $600 but it's 70% SLOWER than the flagship. You're all just normalising greed.
The 9070 XT is the flagship of this launch and possibly of all of RDNA4, and it's not even an upper-tier card.
RDNA4 silicon: 9070 XT (flawless dies?) -> 9070 (minor flaws) -> 9060 (worse bins) -> all other scraps???
All the worst silicon of this gen will possibly go to some iGPU on later specialty Ryzen products.
What I hope even more is that this will give us some nice mid-range cards later with the 9060 series - AT SENSIBLE PRICES.
The 9060 is obviously going to be scraps, and I'm willing to bet there's a ton of them to go around.
In that case, I hope it gets all the encoding hardware needed and sets the new minimum performance floor.
That should kick off the seriously needed bloodbath that burns Intel and Nvidia at the very bottom.
Intel will have to try something new, like listening to customers; and for Nvidia, anything below the 5070 is SOL.
 
Can you elaborate?
The 980 Ti wasn't $599... and it sure wouldn't be today, adjusted for inflation, either. The MSRP was $649, nine and a half years ago. Additionally, it was the last generation on a very mature node (28nm), so yields were exceptional and it was a lot cheaper to make than near-cutting-edge 4nm.
We're comparing price/performance using die sizes, for dies that were stable and performant, right? Just figured we'd take it to its logical conclusion. That $649 is about $875 adjusted for inflation, so a 45% increase in price for a 68% increase in die size means the 980 Ti was the better value. N4C is also mature; it's the third revision of TSMC's 5nm process and is cheaper to produce than their high-end nodes.

While we're comparing die sizes, a more relevant look would be at last gen's 7800 XT - the 9070s are a value regression compared to that card.
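Putting numbers to that comparison (die sizes and MSRPs as quoted in this exchange; the ~$875 figure is the inflation estimate above):

```python
# Die area per inflation-adjusted dollar, using the thread's numbers.
cards = {
    "GTX 980 Ti": (601, 875),  # mm^2, $649 MSRP ~= $875 in today's dollars
    "RX 9070 XT": (357, 600),  # mm^2, $599 MSRP rounded
}
for name, (die_mm2, usd) in cards.items():
    print(f"{name}: {die_mm2 / usd:.2f} mm^2 per dollar")
# ~0.69 vs ~0.60: more silicon per dollar back then, although per-mm^2
# wafer cost on 28nm vs 4nm differs hugely, as noted above.
```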
So many AMD and Nvidia fanboys here crying over "spilled milk"! You're all acting like children.

Nobody cares about market share or ANYTHING! Ordinary gamers buy what is best for their purchasing power. That's it!
They buy AMD or Nvidia depending on their real needs, not your hypothetical "X is better than Y, so I'm bigger, better, stronger than you" kind of mentality.
Damn, Intel fanboys getting no love :cry:
 