Thursday, December 26th 2024

AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

AMD's upcoming Radeon RX 9070 XT graphics card can boost its engine clock up to 3.10 GHz, a new leak that surfaced on ChipHell says. Depending on the board design, its total board power can reach up to 330 W, the leak adds. The GPU should come with a very high base frequency for the engine clock: the leaker claims a 2.80 GHz base frequency (which can be interpreted as the Game clock), with the GPU boosting up to 3.10 GHz when power and thermals permit. The RX 9070 XT will be the fastest graphics card from AMD to be based on its next-generation RDNA 4 graphics architecture. The company isn't targeting the enthusiast segment with this card, but rather the performance segment, where it is expected to go up against NVIDIA's GeForce RTX 5070 series.

RDNA 4 is expected to introduce massive generational gains in ray tracing performance, as AMD is rumored to have significantly reworked its ray tracing hardware to reduce the performance cost of enabling it. As it stands, however, the "Navi 48" silicon that the RX 9070 XT is based on is still a performance-segment chip, succeeding "Navi 32" and "Navi 22," with a rumored compute unit count of 64, or 4,096 stream processors. Performance-related rumors swing wildly. One set of rumors says the card's raster graphics performance is in the league of the RX 7900 GRE but with ray tracing performance exceeding that of the RX 7900 XTX; another set says it beats the RX 7900 XT in raster performance and sneaks up on the RTX 4080. We'll know for sure in about a month's time.
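For a rough sense of scale, the rumored figures work out to the following theoretical FP32 throughput (a back-of-the-envelope sketch using the standard shaders × 2 FLOPs per clock × clock formula; RDNA 3-style dual-issue, which can double this figure in some workloads, is ignored):

```python
# Theoretical FP32 throughput implied by the rumored RX 9070 XT specs.
# FLOPS = stream processors x 2 FLOPs/clock (FMA) x clock speed.
compute_units = 64
stream_processors = compute_units * 64      # 64 SPs per RDNA compute unit
boost_clock_ghz = 3.10                      # rumored boost clock

tflops = stream_processors * 2 * boost_clock_ghz / 1000
print(f"{stream_processors} SPs @ {boost_clock_ghz:.2f} GHz ~= {tflops:.1f} TFLOPS FP32")
```

That lands at roughly 25.4 TFLOPS at the claimed boost clock, comfortably above the RX 7800 XT's rated figure despite the identical shader count.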
Sources: ChipHell Forums, HXL (Twitter), VideoCardz

102 Comments on AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

#51
Bomby569
Frick"We'll have to see what AMD has to offer with a future RX 7800 or 7700 series card that's designed to compete directly with the RTX 4070. How much VRAM will it have, how will it perform, and how much power will it require? We don't know and it might be another couple of months before we find out. But for now, in the $600 price bracket, in our view the RTX 4070 is the best option available.

Overall Winner: Nvidia RTX 4070"
So your past argument is the same: don't review reality; they should just review based on things that don't exist.
#52
Krit
3valatzy: Isn't good, either - the negatives are:
1. High temperatures;
2. Large heatsinks which don't fit in all PC cases;
3. Higher power supply requirements; spikes could go up to 500-600 W.
And where is the most important thing, noise output? o_O
#53
marios15
The GPU market has been really sad since the pandemic + AI boom.
We went from a new GPU architecture with lower power or higher performance at similar prices every 12-18 months, to new GPUs (not necessarily a new architecture) every 18-36 months with the same or lower performance - but the new GPUs can render 720p much faster, so let's increase the prices.

There are no more new GPUs released for $100-250 from AMD or NVIDIA... you still get the same 480/1060 performance from 8 years ago though (or is it 290X performance from 11 years ago?)


Edit: just did a quick check on 2008 vs 2016 vs 2024
9800 GTX -> 1080 Ti = 11x
1080 Ti -> 4090 = 3.3x


Everything else is just brainwashed kids and marketing bots
#54
3valatzy
Krit: And where is the most important thing, noise output? o_O
They make enormous heatsinks which keep the cards relatively quiet. But that size means compatibility issues in some smaller (micro-ATX, mini-ITX, etc.) cases.

About the spikes: that's for the cards alone; if you add the CPU, things get dark.


www.techpowerup.com/review/xfx-radeon-rx-7900-xtx-magnetic-air/39.html
#55
Visible Noise
Krit: Copy paste from nvidia!
What? Are you telling me you don't find that shocking? Let's see what you have said about power draw in the past.
Krit: I don't like the high-end RX 7900 XT and XTX because of high power consumption. Even in a well-ventilated case it will still be loud, especially in summer.

~ RX 7900 XT performance at 250w TDP would be great.
Krit: 350 W power draw, that's a lot for a mid-range GPU; it could also mean that the actual architectural updates may not be so great.
Krit: The critical thing will be how big the performance/RT jump from the RX 7000 series is, and efficiency/power draw is also very important.
Textbook example of cognitive dissonance.
#56
Dr. Dro
Legacy-ZA: Meh, doubt it. AMD could have clawed away so much market share from nGreedia if they had priced their previous-generation GPUs well; however, they too decided to price gouge their customers. The only light at the end of the tunnel I see is Intel, strange as that may sound.

Anyhoo, 3 GHz clocks are awesome; hope we see those on the new nGreedia GPUs too. :)
I disagree. Navi 31 was clearly designed to go after the RTX 4090, and it utterly failed to do this. Despite its flagship SKU's 4080-like performance, it still very much has the bill of materials of a card designed to match the 4090. They have been selling it at next to no margin to make up for that, that is all. The result is that they simply cannot lower prices any further and still break even on their investment.
#57
RedelZaVedno
marios15: Edit: just did a quick check on 2008 vs 2016 vs 2024
9800 GTX -> 1080 Ti = 11x
1080 Ti -> 4090 = 3.3x
Adding price increase:

9800 GTX (299 USD) -> 1080 Ti (699 USD) = 2.34x (+134%)
1080 Ti (699 USD) -> 4090 (1,599 USD) = 2.29x (+129%)

So we're getting the same price increase yet nearly 4x less rasterization performance increase.
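A quick sanity check of those numbers, and what they mean for performance per dollar (a minimal sketch; the prices and performance multipliers are the rough figures quoted above, not benchmark data):

```python
# Price vs. performance multipliers for the generational jumps above.
pairs = {
    "9800 GTX -> 1080 Ti": {"price": (299, 699), "perf": 11.0},
    "1080 Ti -> 4090": {"price": (699, 1599), "perf": 3.3},
}

for name, d in pairs.items():
    old_price, new_price = d["price"]
    price_x = new_price / old_price    # price multiplier
    value_x = d["perf"] / price_x      # change in performance per dollar
    print(f"{name}: price x{price_x:.2f}, perf x{d['perf']:.1f}, perf/$ x{value_x:.2f}")
```

Performance per dollar improved about 4.7x over the 2008-2016 span but only about 1.4x over 2016-2024, which is exactly the collapse being complained about here.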
Dr. Dro: I disagree. Navi 31 was clearly designed to go after the RTX 4090, and it utterly failed to do this. Despite its flagship SKU's 4080-like performance, it still very much has the bill of materials of a card designed to match the 4090. They have been selling it at next to no margin to make up for that, that is all. The result is that they simply cannot lower prices any further and still break even on their investment.
Now think of RDNA 4 being only half the intended die size because the chiplet approach failed, yet TSMC is probably charging AMD for the full die size. Ouch, that's gotta hurt Lisa and the Radeon execs if true :eek:
#58
3valatzy
Dr. Dro: I disagree. Navi 31 was clearly designed to go after the RTX 4090, and it utterly failed to do this.
It failed immediately when it was decided to cut the monolithic chip into several smaller parts, thus leaving at least 10% of the performance on the table.
Navi 31 was simply a very arrogant, very ambitious attempt that went against basic physics and general GPU design rules.
#59
Dr. Dro
RedelZaVedno: Now think of RDNA 4 being only half the intended die size because the chiplet approach failed, yet TSMC is probably charging AMD for the full die size. Ouch, that's gotta hurt Lisa and the Radeon execs if true :eek:
AMD buys wafers wholesale from TSMC and decides how they're allotted across its product portfolio; it is common knowledge that EPYC gets most of the attention, followed by Ryzen and then Radeon. But in general, yes, a larger processor which requires more components will have a higher bill of materials. The AD103 is a small chip next to Navi 31, and since it has a narrower memory bus and a lower power footprint, that means fewer memory chips, less expensive/complex VRM circuitry, etc. Couple this lower BoM with a higher sale price and Nvidia has its cake and eats it too. Sure, their software production value is way higher and so are the expenses, but it's more than enough to make up for it.
3valatzy: It failed immediately when it was decided to cut the monolithic chip into several smaller parts, thus leaving at least 10% of the performance on the table.
Navi 31 was simply a very arrogant, very ambitious attempt that went against basic physics and general GPU design rules.
No. Chiplets reduce total cost because they increase yield. Besides, N31 chips with bad memory controllers or shader partitions can easily be sold as the 7900 XT and 7900 GRE.
#60
3valatzy
Dr. Dro: No. Chiplets reduce total cost because they increase yield. Besides, N31 chips with bad memory controllers or shader partitions can easily be sold as the 7900 XT and 7900 GRE.
No. Chiplets lead to market share loss, because the gaming benchmarks show lower FPS.
#61
sbacc
It's extremely baffling how people always dismiss the very idea that media have a bias (positive or negative) toward certain brands or lines of products. Bias is everywhere; if it weren't the case, do you think people would have financed and launched a service like "Ground News"?

Bias in media doesn't mean it will automatically result in lies or misinformation; it could be just omissions, or over-accentuation of whatever merits or demerits a certain product has.

To give an example from the console space: the Xbox controller uses AA batteries, which is old technology and therefore "bad," but you can also put in rechargeable AA batteries, which means the controller is less disposable than Sony's when the battery dies in the hands of non-tech-savvy people. But in reality, the huge majority of media will always stop at "Xbox controller uses AA batteries = old and bad." This is undoubtedly bias.

Saying bias is only tinfoil hat talk is a lie.
#62
Cheeseball
Not a Potato
I hope the 330W TGP is for SPIKES and not the average power target because the 7900 XTX is 355W and 7900 XT is 300W, with spikes at nearly 400W and 450W.

If it is indeed the TGP and we take into account generational improvements, my guess is that this will at least match the 7900 XTX in rasterization performance and, of course, be better at ray tracing.
#63
joseLopez
My Christmas wish is that the 9070 XT matches the 7900 XT in rasterization, and in RT matches the NVIDIA 4080 or even better. And all at a good price with low power consumption.
:):toast::respect:
#64
AcE
With these numbers (4,096 shaders, updated shaders, more frequency) it could land around 7900 GRE to 7900 XT performance. I also expect the RT performance to be much better, maybe a bigger improvement than last time. Lastly, the price will decide, as price-to-performance is everything in the mid range.
#65
ModEl4
The variation between the performance claims isn't so great if you think about it.
Let us suppose that the Time Spy score wasn't optimized and isn't indicative of the 9070 XT's performance. Regarding performance, I will double down on what I said earlier: the reference 9070 XT will achieve at most 4070 Ti Super 4K raster performance (in ray tracing it will be at least 10% slower). Let's also suppose there will be 9070 XT partner board designs with +10% higher clocks vs. reference (these will achieve only a +7.5% difference, since memory bandwidth is the same between them and the design is bandwidth-limited anyway, with 128 RBs, such high clocks, and only 20 Gbps memory ICs). Then for 4K raster performance we get something like the table below. As you can see, with the new info (6,144 cores instead of 6,400) I assume the RTX 5070 will have only a 10% performance advantage over the RTX 4070 Super, but in that case it is logical to assume the RTX 5070 will stay at the same $599 price. Like I said before, the reference 9070 XT in this best-case scenario must be $549 and no higher. 4K raster performance below:

Card                 4K raster   Price
RTX 4080 Super       110
RX 9070 XT (OC)      100         $599
RX 7900 XT           95
RTX 4070 Ti Super    93
RX 9070 XT (ref)     93          $549
RTX 5070             87          $599
RTX 4070 Super       78.3
RX 7800 XT           72.5

Edit: added the $549 price for the reference RX 9070 XT to the table.
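The +10% clocks -> +7.5% performance assumption above can be expressed as a simple weighted-scaling model (a toy sketch; the 75% compute-bound share is an assumption implied by the numbers quoted, not a measured figure):

```python
# Toy model: on a bandwidth-limited design, only the compute-bound share
# of the workload scales with core clock; memory stays at 20 Gbps on both
# reference and OC boards, so the bandwidth-bound share doesn't move.
COMPUTE_BOUND_SHARE = 0.75   # assumption implied by the +7.5% claim

def perf_gain(clock_gain: float) -> float:
    """Estimated performance gain for a given core clock gain."""
    return clock_gain * COMPUTE_BOUND_SHARE

print(f"+10% clock -> +{perf_gain(0.10):.1%} performance")   # +7.5%
```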
#66
Frick
Fishfaced Nincompoop
Bomby569: So your past argument is the same: don't review reality; they should just review based on things that don't exist.
The point is that both of those cards could be had for the same price, and the general consensus was that the 4070 was the better purchase, and I disagree with that. The ultimate point is that articles like that are the main reason people tend to think AMD cards are just inferior.
#67
Neo_Morpheus
Bomby569: It wasn't more hostile compared to Intel's Arc, and see how things are now for them. In the meantime AMD will make it even more hostile, I bet, and keep complaining about how hostile it is. Sometimes you make your own fate.
Some examples:

Intel releases broken CPUs; people keep buying them because it's Intel.

Yet Zen 5 was nailed to the cross with rusty nails even though the issue was due to Winblows.

Intel GPUs have horrible drivers; people don't even mention that, since only AMD has horrible drivers.

Three or so articles of leaks and rumors about the upcoming RDNA 4 GPUs, and 99% of the comments are negative and hostile towards AMD.

But I guess that those things are figments of our imagination.
Frick: It was an example of what the general consensus is: GeForce cards are universally better than AMD cards, even if the AMD cards are faster and cheaper.
I no longer believe in most of today’s reviewers.

To me, they are bribed influencers.

Granted, places like LTT might not have a choice but to take those bribes, just because of how many employees they have and how high their salaries are.

Same for Tom’s and many others.
#68
Krit
Visible Noise: What? Are you telling me you don't find that shocking? Let's see what you have said about power draw in the past.
It depends on how it undervolts. For example, my RX 7900 XT with a TDP of 320 W was terrible at UV and could not undervolt at all without performance loss. The RX 7800 XT was the complete opposite: I managed to undervolt it to 230 W without performance loss, but that's only -10% power, which is the max limit. Probably a chiplet architecture limit.
#69
eidairaman1
The Exiled Airman
Cheeseball: I hope the 330W TGP is for SPIKES and not the average power target because the 7900 XTX is 355W and 7900 XT is 300W, with spikes at nearly 400W and 450W.

If it is indeed the TGP and we take into account generational improvements, my guess is that this will at least match the 7900 XTX in rasterization performance and, of course, be better at ray tracing.
There are 3080s that allow a TDP of 330 W due to info in the BIOS.
#70
3valatzy
sbacc: It's extremely baffling how people always dismiss the very idea that media have a bias (positive or negative) toward certain brands or lines of products. Bias is everywhere; if it weren't the case, do you think people would have financed and launched a service like "Ground News"?
Bias in media doesn't mean it will automatically result in lies or misinformation; it could be just omissions, or over-accentuation of whatever merits or demerits a certain product has.

Saying bias is only tinfoil hat talk is a lie.
It's AMD's own responsibility to have a vertical ecosystem: their own stores, their own benchmark centres, their own media responsible for marketing the product portfolio.
Missing all of these, or any of them, leads to where AMD is now: spiraling downward.
#71
Bomby569
Frick: The point is that both of those cards could be had for the same price, and the general consensus was that the 4070 was the better purchase, and I disagree with that. The ultimate point is that articles like that are the main reason people tend to think AMD cards are just inferior.
Please go ahead and read what they wrote about the price, and what the situation was when they wrote it. You want so badly to drive your point home that you didn't even read it.
Neo_Morpheus: Some examples:

Intel releases broken CPUs; people keep buying them because it's Intel.

Yet Zen 5 was nailed to the cross with rusty nails even though the issue was due to Winblows.

Intel GPUs have horrible drivers; people don't even mention that, since only AMD has horrible drivers.
People bought broken CPUs because they didn't know they were broken; didn't you follow the events?
Intel Arc's first gen sold next to nothing; again, are you not following what is happening?
#72
Krit
To me, ~RTX 4070 Ti/Super performance is good enough for a midrange GPU; the way bigger question is how much they will ask for it. Because right now the RX 7900 XT costs ~670€, and technically it is more expensive to produce.
#73
Vayra86
BlaezaLite: If the 9070 XT can draw 330 watts, surely it must be more powerful than a 7900 XT? I know none of this is fact yet; I'm hoping it is at least worth being released.
Well, if the shader count is 4,096 then I'm not sure how it would need to clock to surpass the 7900 XT... The 7900 XT has 5,376 shaders (31% more), and it also clocks to 2,800 MHz just fine. If this clocks to 3,100 MHz there's 300 MHz to be had; that's about +10% at the same wattage as a 7900 XT.
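A quick check of those ratios under naive linear scaling (a sketch using the rumored figures above; shaders x clock is an upper bound that real GPUs rarely reach):

```python
# Naive raw-throughput comparison: stream processors x clock.
shaders_9070xt, clock_9070xt = 4096, 3.10   # rumored specs, GHz
shaders_7900xt, clock_7900xt = 5376, 2.80   # RX 7900 XT, GHz

ratio = (shaders_9070xt * clock_9070xt) / (shaders_7900xt * clock_7900xt)
print(f"9070 XT / 7900 XT raw throughput: {ratio:.2f}x")   # ~0.84x
```

On raw shader throughput alone the rumored card comes up short of the 7900 XT, so any win would have to come from per-CU architectural gains.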
Krit: To me, ~RTX 4070 Ti/Super performance is good enough for a midrange GPU; the way bigger question is how much they will ask for it. Because right now the RX 7900 XT costs ~670€, and technically it is more expensive to produce.
500,-, maybe 549,- is my guess
Cheeseball: I hope the 330W TGP is for SPIKES and not the average power target because the 7900 XTX is 355W and 7900 XT is 300W, with spikes at nearly 400W and 450W.

If it is indeed the TGP and we take into account generational improvements, my guess is that this will at least match the 7900 XTX in rasterization performance and, of course, be better at ray tracing.
The 330 W is just the TDP it will target, so spikes will be similar to the 7900 XT with a +15% power target, most likely. Not exactly problematic. There is no way they'll hit 7900 XTX performance; it has 96 compute units vs. this one's 64, if we believe the rumors.
#74
Krit
marios15: Edit: just did a quick check on 2008 vs 2016 vs 2024
9800 GTX -> 1080 Ti = 11x
1080 Ti -> 4090 = 3.3x

Everything else is just brainwashed kids and marketing bots
RedelZaVedno: Adding price increase:

9800 GTX (299 USD) -> 1080 Ti (699 USD) = 2.34x (+134%)
1080 Ti (699 USD) -> 4090 (1,599 USD) = 2.29x (+129%)

So we're getting the same price increase yet nearly 4x less rasterization performance increase.
That's the no. 1 problem we have these days. If we looked at the 7900/8800 series from ~2006, the situation would be even worse! (Golden-age generations increased performance by more than 100% in lots of cases, in just one generation!)
#75
Cheeseball
Not a Potato
Vayra86: The 330 W is just the TDP it will target, so spikes will be similar to the 7900 XT with a +15% power target, most likely. Not exactly problematic. There is no way they'll hit 7900 XTX performance; it has 96 compute units vs. this one's 64, if we believe the rumors.
Ah, right, I forgot it was rumored to have only 64 CUs. But a 330 W TGP for only 64 CUs? Unless they're pulling an RTX 3060-to-4060 (where the 4060 is still ~10% better for much less power due to the newer arch), that much power draw doesn't seem to correlate with the number of CUs it may have; the rough per-CU numbers below make the point. The 7900 GRE has 80 CUs at 260 W and the 7900 XT has 84 at 300 W. I don't think AMD is just going to throw efficiency out the window just because of higher game/boost clocks.
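Putting rough per-CU board power on those figures (a quick sketch; total board power also covers memory and VRMs, so this overstates what the CUs themselves draw):

```python
# Board power per compute unit for the cards mentioned above.
cards = {
    "RX 7900 GRE": (80, 260),   # (compute units, board power in W)
    "RX 7900 XT": (84, 300),
    "RX 9070 XT": (64, 330),    # rumored figures
}

for name, (cus, watts) in cards.items():
    print(f"{name}: {watts / cus:.2f} W per CU")
```

That jump from roughly 3.3-3.6 W per CU on RDNA 3 to over 5 W per CU is exactly the mismatch being questioned here.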

Perhaps RDNA4 is a big architectural improvement where 64 CUs can now do just as much work as 80 or 72 CUs at slightly higher clocks? Hopefully this is the case.