
AMD Radeon 9070 XT Rumored to Outpace RTX 5070 Ti by Almost 15%

These are still speculation, which means actual performance might be vastly different. It might be a glorified 7800 XT or a 7900 XTX murderer; who knows.

And, okay, they never planned on pricing it at 900 USD so what's the plan? Is there a plan?

Oh, and most importantly, it's crucial to strike at the proper time, and that time was up ages ago. With this "slightly worse and slightly cheaper" approach, AMD GPUs got outsold so obscenely I'm not sure it's even legal to post the numbers. Was RDNA a game changer? Not really; it didn't offer anything you couldn't already get by buying a Pascal GPU, a three-year-older architecture. Was RDNA2 a game changer? Far from it. RDNA3 was even less impressive, despite its new chiplet approach.

They must include killer features. Must have included, rather. Now it's a "boy who cried wolf" situation: no matter what they release, the vast majority won't buy it because they already "know" AMD GPUs suck.

AMD should have positioned the RX 7900 XT as a direct RX 6800 XT replacement at 650 bucks; instead they went green greedy goblin and charged $900 :kookoo:
 
Question 1: How? It will have extremely low memory bandwidth - the leaked GPU-Z screenshot shows a mediocre 644 GB/s. For comparison, the RTX 5090 reaches 1,792 GB/s!

Question 2: How, with only 16 GB? If VRAM doesn't matter, then why put so much on it instead of limiting the card to 10 GB (as on the RTX 3080)?
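For what it's worth, the bandwidth figure itself is just bus width times per-pin data rate. A minimal sketch, assuming the ~20 Gbps GDDR6 on a 256-bit bus from the leak and the 5090's 28 Gbps GDDR7 on a 512-bit bus (treat both as assumptions from the rumors discussed here):

```python
# Peak theoretical memory bandwidth in GB/s: (bus width in bits / 8) * per-pin rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Figures below are assumptions taken from the leaks/specs discussed in this thread.
print(bandwidth_gb_s(256, 20.1))  # ~643 GB/s -- in line with the ~644 GB/s GPU-Z leak
print(bandwidth_gb_s(512, 28.0))  # 1792 GB/s -- RTX 5090 class
```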


I don't understand your questions. Are you asking about the 9070 XT, or about a hypothetical 96 CU RDNA4 die, or about something else, or are your questions purely rhetorical?
 
Wow. This is very close to the 5080... With AIBs pushing a 330 W TDP, it'd come even closer; yet people still insist it should be priced at $500? Meanwhile, Nvidia is selling its overpriced hardware for $1,200–$1,400 with only an 8–10% performance gain. For real, even at $600 the 9070 XT would sell exceptionally well and be a strong win in the current market.

What's puzzling is that if they can extract this much performance from a die smaller than 400 mm², why not replicate the RDNA3 MCM approach and produce a 500–600 mm² GCD plus six MCDs to compete with the 5090? TDP? Nvidia is already pushing the game to 600 W, so who cares?
Perhaps Nvidia has secured all the available GDDR7 stock, or AMD sees greater profitability in allocating its TSMC capacity to Instinct accelerators (most likely).
As anyone in business knows: any fool can sell at a loss.
Would you sell a wafer for (say) 200 into DIY or 100 into OEM PCs when you can sell the same wafer to servers for 1,000?
If this gen doesn't sell well (it won't; there's far too much ingrained resistance to AMD), I can see them dropping out of the dGPU market altogether to concentrate on the server CPU/GPU/APU side, where they can at least turn some profit.
Still, at least we now have Intel, right?
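To put some rough numbers on that wafer argument: a minimal sketch with entirely hypothetical die sizes and per-die prices (the $200/$100/$1,000 figures are illustrative stand-ins, not real margins):

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic gross-dies-per-wafer estimate; ignores yield, scribe lines, edge exclusion."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

gaming = gross_dies_per_wafer(390)   # hypothetical ~390 mm^2 gaming die -> ~147 gross dies
print(gaming * 200)                  # wafer revenue at an assumed ~$200 net per DIY die
print(gaming * 100)                  # ...or ~$100 per die through OEM channels (assumed)
server = gross_dies_per_wafer(800)   # one big ~800 mm^2 accelerator die -> ~64 gross dies
print(server * 1000)                 # at an assumed ~$1,000 net per die, the same wafer earns far more
```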
 
I don't understand your questions. Are you asking about the 9070 XT, or about a hypothetical 96 CU RDNA4 die, or about something else, or are your questions purely rhetorical?

About how the RX 9070 XT could come close to the RX 7900 XTX, when its specs scream "slightly modified RX 7800 XT"?
That's a 50% gap!

[attached spec comparison screenshot]


Realistically, the 9070 XT should be no faster than this:

[attached performance chart]
 
AMD should have positioned the RX 7900 XT as a direct RX 6800 XT replacement at 650 bucks,
That, frankly, would've killed both the 7800 XT and the 7900 XTX, because the former isn't cheap enough at its $500 and the latter isn't fast enough to justify 50% on top of the price. The problem with RDNA3 isn't so much the prices, it's how little difference RDNA3 makes. More of the same isn't enough when you're behind. RDNA3, just like every other RDNA generation, is anything but revolutionary.

Imagine releasing RDNA3 alongside some feature that makes DLSS and other upscalers irrelevant. Or maybe some feature that delivers such a ridiculous image quality improvement that games look better at 1080p with AMD than they do at 1440p with NVIDIA. Or some ray tracing stuff that makes the RT experience on AMD better than on anything else. Or literally anything you can only do on AMD GPUs and actually want to do.

But alas.
 
Not if the RX 7800 XT were $349 and the XTX $799.
No point in releasing the RX 7600 then. I mean, I don't mind that, since 350 dollars would get me a GPU that can do 4K60 sans RT in most everything, with occasional upscaling.
 
Realistically, the 9070 XT should be no faster than this:
Frank Azor already debunked it, and that's not a rumor! Straight from AMD.

And where does your logic come from, Blackwell's (micropenis) uplift?
 
Those numbers just don't add up. The 5070 Ti is 5/6 of a 5080 with (almost) the same memory configuration; its gaming performance should land closer to the 4080/4080 Super, not marginally higher than the 4070 Ti Super.

Btw if a miracle happens and 9070 xt reaches 7900 xtx performance then....I want to believe!!!;)
 
About how the RX 9070 XT could come close to the RX 7900 XTX, when its specs scream "slightly modified RX 7800 XT"?
That's a 50% gap!


Realistically, the 9070 XT should be no faster than this:

— Doesn't the 4080 trade punches with the 7900 XTX even though it has much less bandwidth?
— Doesn't the 7600 XTX beat the Radeon VII?

The obvious answer is that newer architectures tend to make better use of the available bandwidth.
 
The obvious answer is that newer architectures tend to make better use of the available bandwidth.
Yeah, especially noticeable with the GTX 760 vs. the GTX 960: twice the bus width on the GTX 760, but it's still slower in everything.
 
And that is supposed to be exciting?



This comment is so over the top, it reads like sarcasm
Sadly, I think he (like many others in this and other forums and the rest of the internet) is really serious.
Please tell me in how many titles are you enabling ray tracing with your 3060 Ti? Do you have to enable DLSS as well to use it?

Is multi fake-frame generation already considered a feature in your Nvidia fanboy booklet?
See above.
I swear if I didn't know better, I would think these rumors and leaks are trying to hurt AMD by raising expectations to unrealistic levels. Then when the actual performance comes nowhere close, they get to beat AMD down.

Wait maybe they are!
Bingo!

It's setting up the playing field, and when the GPU is finally released... then boom!
"AMD sucks!" and "AMDoa" fly all over the net.

It's really tiresome.
 
Those numbers just don't add up. The 5070 Ti is 5/6 of a 5080 with (almost) the same memory configuration; its gaming performance should land closer to the 4080/4080 Super, not marginally higher than the 4070 Ti Super.

Btw if a miracle happens and 9070 xt reaches 7900 xtx performance then....I want to believe!!!;)
You didn't factor in that the 5080 has higher memory and core clocks than the 5070 Ti.
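Right; as a rough sanity check, here's the naive scaling math using the published SM counts, boost clocks, and memory speeds (treat the exact figures as assumptions, and note that real games never scale this linearly):

```python
# Naive throughput ratios from published specs (assumed accurate at time of writing).
sm        = {"RTX 5080": 84,   "RTX 5070 Ti": 70}     # SM counts
boost_mhz = {"RTX 5080": 2617, "RTX 5070 Ti": 2452}   # boost clocks
bw_gb_s   = {"RTX 5080": 960,  "RTX 5070 Ti": 896}    # 256-bit GDDR7 at 30 vs 28 Gbps

shader = (sm["RTX 5070 Ti"] * boost_mhz["RTX 5070 Ti"]) / (sm["RTX 5080"] * boost_mhz["RTX 5080"])
memory = bw_gb_s["RTX 5070 Ti"] / bw_gb_s["RTX 5080"]
print(f"shader throughput: {shader:.0%} of a 5080")  # ~78%, not the naive 5/6 (~83%)
print(f"memory bandwidth:  {memory:.0%} of a 5080")  # ~93%
```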
 
— Doesn't the 4080 trade punches with the 7900 XTX even though it has much less bandwidth?
— Doesn't the 7600 XTX beat the Radeon VII?

The obvious answer is that newer architectures tend to make better use of the available bandwidth.

The 7600 XTX Ultra Super has 57% faster pixel fillrate, and 68% more TFLOPS.

Which again proves one thing: AMD is very poor at this, and the only thing they achieve is terribly imbalanced GPU configurations.
 
About how the RX 9070 XT could come close to the RX 7900 XTX, when its specs scream "slightly modified RX 7800 XT"?
That's a 50% gap!


Realistically, the 9070 XT should be no faster than this:


Right. On one hand, we don't really have the 9070 XT specs, so this is all pure speculation.

AMD could have done a few things, at a 10,000 ft level:
- Beefed up their CUs to have higher IPC. RDNA3's CUs essentially performed exactly like RDNA2 CUs; you only got more performance when you increased the number of CUs.
- Increased the boost clock (one of RDNA3's biggest missteps was missing clock targets).
- Tweaked or rebalanced the Infinity Cache/L2 cache to make up for the loss of bandwidth (rough effective-bandwidth math below).

Ultimately, the thing is Nvidia is able to get 4080/5080 performance out of a ~380 mm², 256-bit piece of silicon on the N4 process, so (again, theoretically) there isn't a reason AMD cannot do the same (assuming Navi 48 is in fact ~380 mm²).
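On the cache point, a minimal sketch of why a fatter or higher-hit-rate on-die cache can offset a narrow bus; the hit rates and cache bandwidth below are made-up illustrations, not leaked RDNA4 numbers:

```python
def effective_bandwidth(vram_gb_s: float, hit_rate: float, cache_gb_s: float) -> float:
    """Toy model: DRAM only services cache misses, so a hit rate of h lets the memory
    system satisfy roughly vram/(1-h) worth of demand, capped by the cache's own bandwidth."""
    return min(vram_gb_s / (1 - hit_rate), cache_gb_s)

# Illustrative numbers only: 640 GB/s of GDDR6 plus a hypothetical 2.5 TB/s on-die cache.
print(effective_bandwidth(640, 0.50, 2500))  # 1280 GB/s of "felt" bandwidth at a 50% hit rate
print(effective_bandwidth(640, 0.65, 2500))  # ~1829 GB/s at 65% -- why raw GB/s isn't the whole story
```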
 
That, frankly, would've killed both the 7800 XT and the 7900 XTX, because the former isn't cheap enough at its $500 and the latter isn't fast enough to justify 50% on top of the price. The problem with RDNA3 isn't so much the prices, it's how little difference RDNA3 makes. More of the same isn't enough when you're behind. RDNA3, just like every other RDNA generation, is anything but revolutionary.

Imagine releasing RDNA3 alongside some feature that makes DLSS and other upscalers irrelevant. Or maybe some feature that delivers such a ridiculous image quality improvement that games look better at 1080p with AMD than they do at 1440p with NVIDIA. Or some ray tracing stuff that makes the RT experience on AMD better than on anything else. Or literally anything you can only do on AMD GPUs and actually want to do.

But alas.
Especially since they're providing the GPUs for the consoles. It would have been nice to have had some console-exclusive features on PC, available with AMD cards only.
 
AMD: "We will not release high end GPU next gen."
*nvidia releases a bunch of turds
AMD: "Guess we are competing in the high end after all."
 
— Doesn't the 4080 trade punches with the 7900 XTX even though it has much less bandwidth?
— Doesn't the 7600 XTX beat the Radeon VII?

The obvious answer is that newer architectures tend to make better use of the available bandwidth.

The last rumored specs put RDNA4 at a higher ratio of Infinity Cache to shader units than RDNA3 as well.
 
But..... will they have stock in abundance?
 
In this thread I'm amused to see how much of the speculation is based on different insights/opinions/prejudices/rumours, some plausible, some questionable. Comments close to trolling, conspiracy theories, wishful thinking and so on pass by, some close to creating myths.

Talking about myths, I'll just wait until TPU's very own "MythBuster", the one and only mister W1zzard ;), has had a go at these cards and publishes substantiated results, to see where they stand in the ever-changing GPU playing field.

Then it will all come down to the (consumer) price and how that compares to the competition, as well as the quality of the software (drivers/features) it is using.

All the tactical bright minds in the red, green and even blue team are well aware of that.

:lovetpu:
 
I swear if I didn't know better, I would think these rumors and leaks are trying to hurt AMD by raising expectations to unrealistic levels. Then when the actual performance comes nowhere close, they get to beat AMD down.

Wait maybe they are!
It really is starting to feel like that, isn't it? It's either AMD fans with big dreams accidentally hurting AMD, or Nvidia fans doing exactly as you describe. People saying it's a 4080 competitor and that they'll charge ~$500 for it are fully in la-la land. If it's at 4080 level, then it'll be something like $100 under 4080 pricing, not many hundreds of dollars under it!
 
AMD should have positioned the RX 7900 XT as a direct RX 6800 XT replacement at 650 bucks; instead they went green greedy goblin and charged $900 :kookoo:
Sure, AMD was the greedy one, while Nvidia tried to sell the 4080 12GB for $900, then re-labeled it as a 4070 Ti for $800, and then re-released it as a Super version. That's the real greed.
I swear if I didn't know better, I would think these rumors and leaks are trying to hurt AMD by raising expectations to unrealistic levels. Then when the actual performance comes nowhere close, they get to beat AMD down.

Wait maybe they are!
It really does seem on purpose at this point: the overhyping coming from the rumor mills and "leakers" just sets people up for disappointment, while the Nvidia diehards cheer for proprietary features and a refresh of the 40-series Supers.
 