
AMD Debuts Radeon RX 9070 XT and RX 9070 Powered by RDNA 4, and FSR 4

Everyone but the most diehard AMD fanboy knows they fumbled RDNA3 big time. They were coming off some momentum with RDNA2 and squandered it.

Hopefully the 9070 XT is good right out of the gate and everyone goes, "damn, that is what I'm talking about."
How was RDNA3 fumbled? The fact that they launched it in Germany?
 
How was RDNA3 fumbled? The fact that they launched it in Germany?

Right on cue.....

 
In other words: people are stupid. It's not logical, but it's fact.
Maybe. Keep in mind how many consumer choices we make all the time, on top of all the other life decisions that actually matter. It’s not impossible to make the most informed choice on any purchase, but to do it with every choice we make is incredibly time consuming. Make good decisions on the stuff that really matters. Choosing the right GPU for the ideal price for recreational gaming just isn’t critical for survival, unless you are spending beyond your means.
 
Looks great though, not seeing the problem :confused:
It was used as a bludgeon to make a negative point against AMD. How many people complained about RT on AMD that were not AMD users?
 
Does it though? If you have the infrastructure laid and the supply chain at the ready, they really don't cost much. Heck, sell X amount and it pays for the hardware that built it and for the guys on the software end, with a little profit on top, or a lot of profit depending on how efficient your operation is.

R&D certainly does cost though... that's the bulk of the spending. 2024 CapEx for NV turned out to be $2.4B (USD); this capital expenditure is actually higher than that of pretty much any other technology company except Intel. In a sense it is quite remarkable AMD went as far as they did with their relatively low CapEx, but they need to start investing back into advancing their products, not just their share price. More importantly, they need to pay their engineers better than NV to make them want to deal with the state of affairs, IMHO.

It was used as a bludgeon to make a negative point against AMD. How many people complained about RT on AMD that were not AMD users?

You don't need to drink spoiled milk to know that it's spoiled, let alone complain about it being spoiled. Your point being?
 
It was used as a bludgeon to make a negative point against AMD.
What? No... it is a natural progression.. looks good to me. Looks good to AMD too since they are embracing it..
R&D certainly does cost though...
I thought they were using their AI to plot the course :confused:
 
R&D certainly does cost though... that's the bulk of the spending. 2024 CapEx for NV turned out to be $2.4B (USD); this capital expenditure is actually higher than that of pretty much any other technology company except Intel



You don't need to drink spoiled milk to know that it's spoiled, let alone complain about it being spoiled. Your point being?
It will always be the same people bashing AMD and calling me out of touch, with a PC I use every day. Thanks for confirming why RT is such a focus. How is the 7000 series spoiled?
 
I thought they were using their AI to plot the course :confused:

Not yet, but I fear that might as well be the case someday. Engineering is one of the tasks that will eventually be largely relegated to machine learning models, with humans used only for correcting the overall floorplan. It'll be any day now...

It will always be the same people bashing AMD and calling me out of touch, with a PC I use every day. Thanks for confirming why RT is such a focus. How is the 7000 series spoiled?

I did not say it was spoiled; I proposed a counter to your universal dismissal of these complaints as coming from people who have never experienced the product themselves. People complaining about low RT performance on Radeon are not doing so to "bash", and they don't need to own one to attest that it is so. I mean, that's what reviews are for. If anything, a product's deficiencies justify choosing an alternative which does not suffer from them.
 
Everyone but the most diehard AMD fanboy knows they fumbled RDNA3 big time. They were coming off some momentum with RDNA2 and squandered it.

Hopefully the 9070 XT is good right out of the gate and everyone goes, "damn, that is what I'm talking about."
What did they fumble? What did they squander? I see solid products in both cases, paired with terrible marketing.

Well, ask yourself that question:

If AMD is slower in RT AND in raster compared to Nvidia, why is it "not logical" for AMD to be sold significantly cheaper than Nvidia? You already agreed that if the card is slower, it should be cheaper, so.....
It is cheaper than Nvidia at basically any price point.

Maybe. Keep in mind how many consumer choices we make all the time, on top of all the other life decisions that actually matter. It’s not impossible to make the most informed choice on any purchase, but to do it with every choice we make is incredibly time consuming. Make good decisions on the stuff that really matters. Choosing the right GPU for the ideal price for recreational gaming just isn’t critical for survival, unless you are spending beyond your means.
That is a fair assumption. Although, I would do my best to be as informed as possible if I was about to spend hundreds on something, especially if it's my hobby, therefore an important part of my life.
 
Don't spin this around: if it's anti-consumer when Nvidia does it, warranting years of rhetoric and scorn, it's anti-consumer when AMD does it. Where is all the outrage over AMD going back on their word and "betraying the trust of the loyal Radeon customer"? It's simple, really: it's pure hypocrisy. You people never once cared about it being closed-source software; you only ever cared that it was better and you couldn't use it.
Look, anyone who thinks that AMD is "pro-consumer" for altruistic reasons needs his head examined. AMD is a multi-billion-dollar globocorp, not your friend or mine. But it isn't incorrect to say that a disadvantaged market position frequently forces AMD to adopt a pro-consumer posture, at least relative to the competition. As far as the consumer is concerned, this might even seem like a distinction without a difference. That is, until AMD gains a market advantage.

All else being equal, open standards are better. There's nothing wrong with having rooted for FSR on that basis. And in fact, historically, open standards tend to win. VHS won out over Betamax. The CD won out over the Mini-Disc. The PC defeated the Macintosh. If we're discussing the GPU wars, Freesync was ultimately vindicated, and PhysX dwindled to obscurity.

(I still loved my Betamax, though.)

In this case, it just so happens that AMD's open standard couldn't compete with Nvidia's closed standard on image quality. But it got pretty close, and I'm glad they tried. Whatever else you want to say about the tech itself or the motivations behind its design, FSR represents a significant value-add for consumers of all shapes and sizes. I wish AMD luck on their new proprietary tech, but they're going to need more than "we've got DLSS too" to move the needle substantially on their position in the GPU market.

Those cards were just under the heat of competition. And they had their merits which made them successful products.

And the CPU question... yes, for a while, until Intel chips lost momentum and they started charging $300 for a 6 core Ryzen 5. Then the 12400F happened and suddenly even motherboards that had "technical limitations" supported it all right quick alongside a nice price cut.
Exactly right.
 
What did they fumble? What did they squander? I see solid products in both cases, paired with terrible marketing.

1. They showed performance that was well beyond what the actual products benchmarked at in reviews at launch. They had historically been pretty good about this, but then squandered it.
2. They priced multiple SKUs too high, leading to negative feedback and then their prices tanking. People can view this however they want, but even AMD admits this much.

Those are the two main things they did poorly, and AMD said so themselves: multiple products they released, specifically the 7600/7700 XT/7900 XT, were poorly received, and they would like to fix that. Which is a good thing.

Performance has also regressed over time: the best AMD card is now 50% slower in RT vs the 4080, and it loses to a card that was $200 cheaper and has been retired because even Nvidia wanted something better at its price point, which isn't a good look either. FSR hasn't really improved over RDNA3's lifetime.

And now they are in the awkward position of not improving raster for at least another generation, hoping to catch last-generation Nvidia products in the same tier in RT, while doing something their most diehard fans hate: potentially locking FSR4 to RDNA4. Although I think if they can make it work on RDNA3 and older, they will backtrack on this.

Hey, you forgot me in that picture!


@freeagent seems like he gets the most heat from my observation, but what I wouldn't do to be a fly on the wall for your PMs from users, lmao, especially the reports... Well, the mods in general, not just you, lol.
 
I will also eat my used shorts (no skidmarks here folks) if the RTX 5090 debuts at $1799. That would (technically) be fair pricing for a 32 GB GDDR7 monster.

RTX 5080 is hard to guess since the previous Super did come out at $1000 (so NVIDIA made themselves look "generous" :laugh:) but the predecessor was $1200. I would be surprised as hell if they do $1000 again.

Yeah I'm surprised (about the RTX 5080, not the RTX 5090):



Actually, I'm surprised about the 5070 Ti and 5070, as they were lowered $50. It would've been nice for them to be at $700 or $500, but meh. At least the RX 9070 XT would maybe be around $600 now.
 
1. They showed performance that was well beyond what the actual products benchmarked at in reviews at launch. They had historically been pretty good about this, but then squandered it.
2. They priced multiple SKUs too high, leading to negative feedback and then their prices tanking. People can view this however they want, but even AMD admits this much.

Those are the two main things they did poorly, and AMD said so themselves: multiple products they released, specifically the 7600/7700 XT/7900 XT, were poorly received, and they would like to fix that. Which is a good thing.

Performance has also regressed over time: the best AMD card is now 50% slower in RT vs the 4080, and it loses to a card that was $200 cheaper and has been retired because even Nvidia wanted something better at its price point, which isn't a good look either. FSR hasn't really improved over RDNA3's lifetime.

And now they are in the awkward position of not improving raster for at least another generation, hoping to catch last-generation Nvidia products in the same tier in RT, while doing something their most diehard fans hate: potentially locking FSR4 to RDNA4. Although I think if they can make it work on RDNA3 and older, they will backtrack on this.
Yeah, the 7600, 7700 XT and 7900 XT were badly priced at launch. But the 7800 XT and 7900 XTX were great.

Showing false benchmark results at launch is really poor, but people should know better than to believe marketing hype, and look for real reviews before buying. That doesn't affect the end product in any way, imo.

RT didn't improve because they basically used RDNA 2's RT engine. They spent all their R&D on chiplets and improving raster, neither of which paid off really well. But that still didn't make the end products bad, just not as much better than RDNA 2 as we'd hoped. Now, they're doing the opposite: they're trying to improve RT and the video engine while only doing minor fixes on raster, and backtracking on the chiplet design. I'm curious if it'll pay off; this is why I'm disappointed that we didn't get more detail.

Yeah I'm surprised (about the RTX 5080, not the RTX 5090):


Actually, I'm surprised about the 5070 Ti and 5070, as they were lowered $50. It would've been nice for them to be at $700 or $500, but meh. At least the RX 9070 XT would maybe be around $600 now.
Oh so they've been announced now? What a shame for AMD... The 5070 is coming and the 9070 XT is nowhere in sight.

Also, AI TOPS? That's how we measure GPU performance now? *Facepalm*
 
Yeah, the 7600, 7700 XT and 7900 XT were badly priced at launch. But the 7800 XT and 7900 XTX were great.

Showing false benchmark results at launch is really poor, but people should know better than to believe marketing hype, and look for real reviews before buying. That doesn't affect the end product in any way, imo.

RT didn't improve because they basically used RDNA 2's RT engine. They spent all their R&D on chiplets and improving raster, neither of which paid off really well. But that still didn't make the end products bad, just not as much better than RDNA 2 as we'd hoped. Now, they're doing the opposite: they're trying to improve RT and the video engine while only doing minor fixes on raster, and backtracking on the chiplet design. I'm curious if it'll pay off; this is why I'm disappointed that we didn't get more detail.


I forgot to add that the chiplet thing didn't work out... Honestly, the only reason the 7800 XT/7900 XTX looked OK was because the competing cards were massively hiked in price, especially the 4080. Even then, neither was better than the alternative, just cheaper, and both took a massive hit to RT performance that has grown over time.
 
I forgot to add that the chiplet thing didn't work out... Honestly, the only reason the 7800 XT/7900 XTX looked OK was because the competing cards were massively hiked in price, especially the 4080. Even then, neither was better than the alternative, just cheaper, and both took a massive hit to RT performance that has grown over time.
Cheaper at the same performance level means better in my books. And RT... Like I said, I don't care as long as I have to spend a grand to use it properly (which I won't).
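The "cheaper at the same performance" argument is really just a performance-per-dollar comparison. A minimal sketch, using made-up placeholder numbers (not real benchmark results for any card discussed here):

```python
# Hypothetical, illustrative figures only -- not measured benchmark data.

def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Average FPS delivered per dollar of launch price."""
    return avg_fps / price_usd

# Two imaginary cards with equal performance but different prices:
card_a = perf_per_dollar(avg_fps=100.0, price_usd=500.0)  # 0.2 FPS per dollar
card_b = perf_per_dollar(avg_fps=100.0, price_usd=600.0)  # lower value

# At equal performance, the cheaper card delivers more FPS per dollar.
assert card_a > card_b
```

Of course this ignores feature differences like RT and upscaling quality, which is exactly where the two sides of this argument diverge.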
 
Also, AI TOPS? That's how we measure GPU performance now? *Facepalm*
Are they really GPUs, or can they just play videogames as a byproduct :D
 
@freeagent
…welcome to 2007, when CUDA was released. I heard Crysis is gonna be mighty impressive, and I seriously can't wait for The Orange Box.
No, seriously, this isn’t new. The writing was on the wall for almost two decades.
 
Cheaper at the same performance level means better in my books. And RT... Like I said, I don't care as long as I have to spend a grand to use it properly (which I won't).

Regardless, my favorite RDNA3 card is the 7900 GRE when factoring in the US launch price. Hopefully the 9070 is a much better version of that, because the 5070 and 5070 Ti are cheaper than expected....
 
Great, but can it run Indiana Jones at Supreme settings??

:D
 
Regardless, my favorite RDNA3 card is the 7900 GRE when factoring in the US launch price. Hopefully the 9070 is a much better version of that, because the 5070 and 5070 Ti are cheaper than expected....
Yet, we won't know their true performance because Nvidia is sidetalking with some FG 4x bullshit that makes the cards look better than they actually are.

Just when I thought it couldn't get worse than the AMD keynote. *Sigh*
 
Now we know why AMD bailed on presenting RDNA 4 this afternoon. It’s bottom market trash compared to what Nvidia is showing now.

This Nvidia announcement isn’t a stomping, it’s a shredding. A year from now AMD will have 5% market share from their loyalists and that’s it.

Radeon is dead. Long live Radeon.
 