
AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

I don't think people do that. Well, not enough people to make a dent in any statistics anyway. I genuinely think one main reason AMD isn't selling better is stuff like this, which shows the AMD card being faster in everything not raytraced, but the article ends up recommending the Nvidia card because of RT support and because it's better at AI stuff (this was during the early AI boom). It's basic FOMO, especially today when everyone and their dog is a YouTuber-in-waiting, so you gotta get the card with that sweeeeeeet software support!
Non-enthusiasts tell me AMD cards are buggy and don't run well, which is simply not the case. They don't even know what DLSS or ray tracing is. I don't know how AMD can overcome this myth. AMD cards run great.
 
Non-enthusiasts tell me AMD cards are buggy and don't run well, which is simply not the case. They don't even know what DLSS or ray tracing is. I don't know how AMD can overcome this myth. AMD cards run great.
I use both AMD and Nvidia cards depending on the application, and both serve me well apart from some random (driver) crashes on both sides; nothing that stands out as a deal breaker.
 
Since it will most likely be using a 4nm or 3nm process and running at such high clocks at this power draw (330W), my guess is it would perform about the same as an RTX 4080.
N4P. And no, it won't perform about the same as a 4080. Not on average, anyway. There might be some outliers.
 
330W and a small die size sounds good. Let's hope it's a small monolithic die, not a chiplet....

It would be strange if a next-gen GPU with a 330W TDP were slower than the RX 7900 XT. This rumor suggests ~RTX 4070 Ti Super performance levels.
 
I don't think people do that. Well, not enough people to make a dent in any statistics anyway. I genuinely think one main reason AMD isn't selling better is stuff like this, which shows the AMD card being faster in everything not raytraced, but the article ends up recommending the Nvidia card because of RT support and because it's better at AI stuff (this was during the early AI boom). It's basic FOMO, especially today when everyone and their dog is a YouTuber-in-waiting, so you gotta get the card with that sweeeeeeet software support!
Yep, Tom's was always like this. An article about gaming cards, and the only chart in full color that someone can see without a magnifying glass is the Stable Diffusion test.
But don't worry, Tom's has changed recently. The article you posted is from 2023, a totally different period. In the latest tests that I have seen from Tom's, AMD always gets better numbers, even if this means their tests contradict tests from all other sites. Now I don't know if Tom's fixed its testing (after 25 years) and everyone else keeps doing it wrong, making AMD products look worse.
I am also pretty sure that Intel's financial problems are just a coincidence here and have nothing to do with that change.
 
N4P. And no, it won't perform about the same as a 4080. Not on average, anyway. There might be some outliers.
I do think it will perform between an RX 7900 XT and an RTX 4080, at least when running at 1080p and 1440p resolutions, and fall behind at 4K due to the lower memory bandwidth.
 
Non-enthusiasts tell me AMD cards are buggy and don't run well, which is simply not the case. They don't even know what DLSS or ray tracing is. I don't know how AMD can overcome this myth. AMD cards run great.
It's nothing new to me. People who aren't using AMD's latest GPUs just don't get it because of old driver issues they had in the past, or just rumors. Even in different forums I always hear somewhere that there are issues with AMD drivers, "that's why I will not buy AMD again"..... It's just cheap thinking in general, and it's not AMD's fault anymore.....
 
It wasn't more hostile compared to Intel's Arc, and see how things are now for them.
In the meantime, AMD will make it even more hostile, I bet, and keep complaining about how hostile it is. Sometimes you make your own fate.
Intel and Nvidia usually get favorable coverage for their products. Intel made it easier by putting reasonable prices on cards with more than 8GB of VRAM, and only someone who hates them wouldn't agree that the B580 and B570 are two products that the market and gamers needed. Intel being first to the party also means that AMD and Nvidia will have to rethink how they price their sub-$300 models. Intel's B series is welcome.
But be certain that while Intel got favorable reviews, an AMD graphics card at $250 with 12GB of VRAM that beats the B580 will not get a positive conclusion in most reviews, but a lukewarm one pointing out everything negative, real or not, that the average gamer today associates with AMD graphics cards.
 
Yep, Tom's was always like this. An article about gaming cards, and the only chart in full color that someone can see without a magnifying glass is the Stable Diffusion test.
But don't worry, Tom's has changed recently. The article you posted is from 2023, a totally different period. In the latest tests that I have seen from Tom's, AMD always gets better numbers, even if this means their tests contradict tests from all other sites. Now I don't know if Tom's fixed its testing (after 25 years) and everyone else keeps doing it wrong, making AMD products look worse.
I am also pretty sure that Intel's financial problems are just a coincidence here and have nothing to do with that change.

It was an example of what the general consensus is: GeForce cards are universally better than AMD cards, even if the AMD cards are faster and cheaper.
 
It's nothing new to me. People who aren't using AMD's latest GPUs just don't get it because of old driver issues they had in the past, or just rumors. Even in different forums I always hear somewhere that there are issues with AMD drivers, "that's why I will not buy AMD again"..... It's just cheap thinking in general.
It's a monolithic die. The chiplet version of RDNA4, which was to be the basis for their high-end cards, had packaging issues and was apparently cancelled.
 
I don't think people do that. Well, not enough people to make a dent in any statistics anyway. I genuinely think one main reason AMD isn't selling better is stuff like this, which shows the AMD card being faster in everything not raytraced, but the article ends up recommending the Nvidia card because of RT support and because it's better at AI stuff (this was during the early AI boom). It's basic FOMO, especially today when everyone and their dog is a YouTuber-in-waiting, so you gotta get the card with that sweeeeeeet software support!

Again, that's factoring in ray tracing games, DLSS upscaling, and AI workloads. If you have no interest in any of those, AMD's RX 6950 XT comes out ahead, but those are three pretty major topics in the GPU world these days.

That's just you trying to force your point of view; the review is actually well structured and goes over several areas of competition between team green and team red. And it's true that raster is not enough anymore. Do you know how I know this? Because AMD just told us so and will improve on that; their words.
It should have won on price, but they explained it, and it's not their fault: AMD just waits for Nvidia to price their cards first. Maybe they could have learned something from Intel on RT and pricing.
 
Non-enthusiasts tell me AMD cards are buggy and don't run well, which is simply not the case. They don't even know what DLSS or ray tracing is. I don't know how AMD can overcome this myth. AMD cards run great.
Yeah, I don't know what the deal is. I have been using AMD cards for over a decade and they seem to work just fine, even if they aren't the fastest in the segment. My Mac Pro 2010 is running an old Polaris RX 480 and it's still good too, so it's not like I've had a card fail after a while. The last GPU I had die on me was so long ago that I can't remember the name (Radeon VE, maybe?), and really none of them have been all that problematic. I have way more random issues with W11 than anything, and that's an Intel+NVIDIA setup, so go figure.

I know this article is about the 9070, but I'm curious what sort of leap will translate to the 9060. The 7600 was really not much better than the 6650, and I'd like to see what AMD has for the segment where power is limited to a single 8 pin. Are we tapped out there?
 
But be certain that while Intel got favorable reviews, an AMD graphics card at $250 with 12GB of VRAM that beats the B580 will not get a positive conclusion in most reviews
Which card is that?
 
Which card is that?
AMD isn't going to announce just ONE card, the 9070 XT. We will probably also get a 9060, for example. Let's see its price and specs. It might sell close to the B580 and probably be faster. The question is whether AMD will keep it at 8GB and offer a 16GB version at over $300, like it did with the 7600 and 7600 XT. Hope they follow Intel's example.
 
The 9070 XT has to at least beat the 7900 XT in raster and touch the 4080 in RT.
If not, I'll go Nvidia.
 
Board power on my 4070 Ti can hit 400W, so this isn't too bad.
 
Can't wait for the next gen GPUs to be available in stock at MSRP... not.
 
What's the incentive not to just buy a 7900 XT today for $650 if you don't care about having DLSS? All leaks point to the 9070 XT likely rivaling the 7900 GRE in rasterization. Even if it costs around $600, you don't lose much, if anything. Sure, you get worse RT performance (who really cares?), but you get 15-20% better rasterization performance, 4 gigs more RAM, and even a bit lower TDP. The point I'm trying to make is that the 9070 XT really should not cost more than $500 in order not to be DOA. Just my 2 cents.
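To put that value argument in numbers, here's a rough price-per-performance sketch. All figures are the leaked/assumed numbers from the post above (a $650 7900 XT, a hypothetical $600 9070 XT at roughly 7900 GRE raster, ~15% behind), not benchmarks:

```python
# Rough price-per-performance comparison using the rumored figures above.
# "raster" is normalized raster throughput (7900 XT = 100).
# These numbers are speculation from leaks, not measured results.
cards = {
    "RX 7900 XT": {"price": 650, "raster": 100},
    "RX 9070 XT (rumored)": {"price": 600, "raster": 85},  # ~7900 GRE level
}

for name, c in cards.items():
    perf_per_dollar = c["raster"] / c["price"]
    print(f"{name}: {perf_per_dollar:.3f} raster points per dollar")
```

On these assumed numbers the 7900 XT actually comes out ahead on raster per dollar, which is exactly why a $600+ price tag looks hard to justify without an RT or efficiency win.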
 
That's just you trying to force your point of view; the review is actually well structured and goes over several areas of competition between team green and team red. And it's true that raster is not enough anymore. Do you know how I know this? Because AMD just told us so and will improve on that; their words.
It should have won on price, but they explained it, and it's not their fault: AMD just waits for Nvidia to price their cards first. Maybe they could have learned something from Intel on RT and pricing.

"We'll have to see what AMD has to offer with a future RX 7800 or 7700 series card that's designed to compete directly with the RTX 4070. How much VRAM will it have, how will it perform, and how much power will it require? We don't know and it might be another couple of months before we find out. But for now, in the $600 price bracket, in our view the RTX 4070 is the best option available.

Overall Winner: Nvidia RTX 4070"
 
Non-enthusiasts tell me AMD cards are buggy and don't run well, which is simply not the case. They don't even know what DLSS or ray tracing is. I don't know how AMD can overcome this myth. AMD cards run great.
I've run AMD cards in my desktop since 2019. I didn't have any more issues with them than I did with my laptops with Nvidia cards. I've had issues with the laptop I bought last year, but those are on Intel. The i9-13900HX in my laptop was giving me all sorts of issues, issues that later mirrored the CPU failures I saw in their 13th/14th gen desktop CPUs. So they replaced the unit with a new one, and it worked flawlessly for a year but died completely two weeks ago. Suffice it to say, I searched high and low for a high-end laptop with an AMD CPU. It was way too difficult, but I managed to find one with a 7945HX3D. Whoops, went off topic. Sorry, I'm still salty about the high-end paperweight that used to be my laptop.
 
I hear the problem the Radeon division is having atm is that they hoped for much better performance from the RDNA4 arch. Now they're in full damage-control mode, as they're caught between TSMC's high wafer costs and the AMD group not allowing them to take a loss on the product. So it might well be a silly $649 price tag for the 9070 XT at first, as some leakers suggest, just to be DOA, and then after the fiscal year ends heavily discounted by the blessings of Lisa.
 
I hear the problem the Radeon division is having atm is that they hoped for much better performance from the RDNA4 arch. Now they're in full damage-control mode, as they're caught between TSMC's high wafer costs and the AMD group not allowing them to take a loss on the product. So it might well be a silly $649 price tag for the 9070 XT at first, as some leakers suggest, just to be DOA, and then after the fiscal year ends heavily discounted by the blessings of Lisa.
The high-end RDNA4 dies, like RDNA3's, were supposed to be chiplet based. The packaging for RDNA4 was more complex than RDNA3's; it would have allowed the use of multiple GPU dies. There was some sort of issue with the design that would have caused a significant delay, so they canned it to focus on RDNA5, aka UDNA1. What is getting billed as the 9070 XT is likely half of what would have been the 9090 XT.
 
330 Watt mid-range GPU.

GG AMD.
 