Thursday, December 26th 2024

AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

AMD's upcoming Radeon RX 9070 XT graphics card can boost its engine clock up to 3.10 GHz, according to a new leak that surfaced on ChipHell. Depending on the board design, its total board power can reach up to 330 W, the leak adds. The GPU should come with a very high base frequency for the engine clock: the leaker claims a 2.80 GHz base frequency (which can be interpreted as the Game clock), with the GPU boosting up to 3.10 GHz when power and thermals permit. The RX 9070 XT will be the fastest graphics card from AMD based on its next-generation RDNA 4 graphics architecture. The company isn't targeting the enthusiast segment with this card, but rather the performance segment, where it is expected to go up against NVIDIA's GeForce RTX 5070 series.

RDNA 4 is expected to introduce massive generational gains in ray tracing performance, as AMD is rumored to have significantly reworked its ray tracing hardware to reduce the performance cost of enabling it. However, as it stands, the "Navi 48" silicon that the RX 9070 XT is based on is still a performance-segment chip, succeeding "Navi 32" and "Navi 22," with a rumored compute unit count of 64, or 4,096 stream processors. Performance-related rumors swing wildly. One set of rumors says the card's raster graphics performance is in the league of the RX 7900 GRE but with ray tracing performance exceeding that of the RX 7900 XTX; another says it beats the RX 7900 XT in raster performance and sneaks up on the RTX 4080. We'll know for sure in about a month's time.
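As a rough sanity check on those rumored figures, here is a minimal back-of-envelope sketch. The 64 stream processors per CU and the FMA-counted-as-two-operations convention are assumptions based on how recent RDNA parts are organized, not confirmed "Navi 48" specifications:

```python
# Back-of-envelope FP32 throughput from the rumored RX 9070 XT figures.
# Assumptions (not confirmed "Navi 48" specs): 64 stream processors per CU,
# one FMA per stream processor per clock, counted as 2 FP32 operations.
compute_units = 64
sp_per_cu = 64
boost_clock_ghz = 3.10

stream_processors = compute_units * sp_per_cu              # 64 x 64 = 4,096
tflops_fp32 = stream_processors * 2 * boost_clock_ghz / 1000

print(f"{stream_processors} stream processors")
print(f"~{tflops_fp32:.1f} TFLOPS FP32 at the rumored 3.10 GHz boost clock")
```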
Sources: ChipHell Forums, HXL (Twitter), VideoCardz

102 Comments on AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

#76
Vya Domus
DavenLatest leak says 9070xt within 5% of 4080.
Not possible according to the shader count; one or the other is wrong.
Dr. DroNavi 31 was clearly designed to go after the RTX 4090
It clearly wasn't: different die sizes and process nodes. Not to mention more space is wasted when using chiplets; they simply aren't comparable.
#77
Darmok N Jalad
Neo_MorpheusSome examples:

Intel releases broken CPUs; people keep buying them because it's Intel.

Yet Zen 5 was nailed to the cross with rusty nails even though the issue was due to Winblows.

Intel GPUs have horrible drivers; people don't even mention that, since only AMD has horrible drivers.

3 or so articles about leaks and rumors about the upcoming RDNA4 GPUs, and 99% of the comments are negative and hostile towards AMD.

But I guess that those things are figments of our imagination.

I no longer believe in most of today’s reviewers.

To me, they are bribed influencers.

Granted, places like LTT might not have a choice but to take those bribes just because of how many employees are there and how much their salaries are.

Same for Tom’s and many others.
I have been around long enough to remember the Tom's article that reviewed how CPUs would perform when you intentionally removed the HSF while it's running. The Intel CPU would throttle; the Athlon CPU would burn up. A seemingly pointless article, but one that painted Intel in a better light at a time when AMD was actually competitive for the first time. Also, we can't forget the CTS Labs "Zen flaws" release. When Intel was taking hits for all its flaws, suddenly a "research company" spilled the beans on AMD vulnerabilities, too. Legit issues or not, it had all the looks of a shell company established to produce a hit piece, and they have published nothing else.
#78
marios15
Neo_MorpheusSome examples:

Intel releases broken CPUs; people keep buying them because it's Intel.

Yet Zen 5 was nailed to the cross with rusty nails even though the issue was due to Winblows.

Intel GPUs have horrible drivers; people don't even mention that, since only AMD has horrible drivers.

3 or so articles about leaks and rumors about the upcoming RDNA4 GPUs, and 99% of the comments are negative and hostile towards AMD.

But I guess that those things are figments of our imagination.

I no longer believe in most of today’s reviewers.

To me, they are bribed influencers.

Granted, places like LTT might not have a choice but to take those bribes just because of how many employees are there and how much their salaries are.

Same for Tom’s and many others.
Darmok N JaladI have been around long enough to remember the Tom's article that reviewed how CPUs would perform when you intentionally removed the HSF while it's running. The Intel CPU would throttle; the Athlon CPU would burn up. A seemingly pointless article, but one that painted Intel in a better light at a time when AMD was actually competitive for the first time. Also, we can't forget the CTS Labs "Zen flaws" release. When Intel was taking hits for all its flaws, suddenly a "research company" spilled the beans on AMD vulnerabilities, too. Legit issues or not, it had all the looks of a shell company established to produce a hit piece, and they have published nothing else.
Nothing generates clicks like a thumbnail and a title in the form of "[insert great product] sucks - and here's why" or "THE BIG FLAW WITH [insert popular product]"

or "AMD Radeon RX 9070 XT Boosts up to 3.10 GHz"(but that's barely +5% VS 6000 series)

Even though AMD announced that they pulled out of the high end, and even if their marketing department won't advertise this as a mid-range card, the narrative is "9070 XT to compete with the 5090" - then you get 40% lower performance and it's a disappointment, even if it's priced accordingly.

Or it's another "poor volta" moment
#79
wolf
Better Than Native
marios15Edit: just did a quick check on 2008 vs 2016 vs 2024
9800 GTX -> 1080Ti =11x
1080Ti -> 4090 = 3.3x
Why did you choose to use the 9800 GTX, which was a die-shrunk and tweaked 8800 GTX, and not the GTX 280, which also released in 2008? It would have still served your point with a 7.87x uplift to a 1080 Ti.
Dr. DroNavi 31 was clearly designed to go after the RTX 4090 and it utterly failed to do this
Yeah, it absolutely was: the total die area, the BOM as you say. At best I'd say they wanted to at least split the difference between a 4080 and a 4090, and it still fell a good 15% short.
#80
freeagent
Oh yeah... they are totally making a 4070 Ti...

#81
TheinsanegamerN
Legacy-ZAMeh, doubt it, AMD could have clawed away so much market share from nGreedia if they priced their previous-generation GPUs well; however, they too decided to price gouge their customers. The only light at the end of the tunnel I see is Intel, strange as that may sound.

Anyhoo, 3 GHz clocks are awesome, hope we see those on the new nGreedia GPUs too. :)
I know math is hard, and throwing out silly names like "ngreedia" must be the peak of comedy, but you do know that when your margins are 2-3%, you can't just cut prices willy-nilly and stay in business for long, right?

When it comes to GPUs, people just outright refuse to believe that inflation is real.
wolfWhy did you choose to use the 9800GTX that was a die shrunk and tweaked 8800GTX and not the GTX280 which also released in 2008? it would have still served your point with a 7.87x uplift to a 1080Ti.

Yeah it absolutely was, the total die area, the BOM as you say, at best I'd say they wanted to at least split the difference between a 4080 and 4090 and then still fell a good 15% short.
This is why I wish Anandtech hadn't gone to crap; a circuit analysis of an MCD would be fascinating. I'd bet good money that if the chips were on the same 5 nm node as the GPU, and you didn't need those MCM interconnects, the memory controllers would be a lot smaller and the total die size would be a lot closer to the 4080.
#82
Dr. Dro
TheinsanegamerNThis is why I wish Anandtech hadn't gone to crap; a circuit analysis of an MCD would be fascinating. I'd bet good money that if the chips were on the same 5 nm node as the GPU, and you didn't need those MCM interconnects, the memory controllers would be a lot smaller and the total die size would be a lot closer to the 4080.
Thankfully the smartest folks that used to be part of Anandtech's editorial staff now publish at Chips and Cheese, which is a much higher level technical publication instead.
Vya DomusIt clearly wasn't: different die sizes and process nodes. Not to mention more space is wasted when using chiplets; they simply aren't comparable.
It is painfully obvious; there is no technicality that will work around the fact that this graphics card was designed to achieve much greater heights than it actually did. It is a very complex design and the overall die area is exceptionally large, so let's not pretend that AMD is several nodes behind or that the chiplets are at fault or anything.

The decisions they made to implement things like a 384-bit interface with 12 memory chips, or the power target (because as we have come to know, it scales miserably past its default 330 to 350 W range, even if you pump 475 W+ into it), none of these are taken lightly when designing a graphics card. They are very much aware of their product's strengths and weaknesses, yet the end result is that both it and the 7900 XT visibly had to be subjected to pre-launch price cuts once they saw that they stood no chance in hell against the RTX 4090 despite being late, especially with the horribly broken launch drivers (which they knew were a problem at the time).

I guarantee you, if the RTX 4080 had come first by itself and the 4090 had been delayed by no more than a month, the 7900 XTX would have had a $1,499 MSRP (justifying itself against the 4080 by having 24 GB over 16 GB), and they would have placed the 7900 XT at $1,199 while telling people they could still get 20 GB and almost as much performance, still sounding like it was a good deal.
#84
Vya Domus
Dr. DroIt is painfully obvious; there is no technicality that will work around the fact that this graphics card was designed to achieve much greater heights than it actually did.
There actually isn't a single technical detail that proves it was: lower transistor count, smaller die size, lower TDP and shader count, etc. AMD never expected it to compete with whatever Nvidia was going to sell as its flagship; I'll remind you that both the 4090 and the 4080 were more expensive than the 7900 XTX.
#85
eidairaman1
The Exiled Airman
AusWolf330 W? Yikes!:fear:
A 3080 Ti has that TDP in the BIOS code. I just gave someone the Gainward and Palit BIOSes to use on a PNY card, since all three use the same PCB and the Gainward has the same exact cooler as the PNY.
#86
AusWolf
eidairaman1A 3080 Ti has that TDP in the BIOS code. I just gave someone the Gainward and Palit BIOSes to use on a PNY card.
I'd still like to see lower values by default without any tinkering. I guess I'll have a reference model, then (fingers crossed that it'll be available in the UK).
#87
Mr_Engineer
CheeseballPerhaps RDNA4 is a big architectural improvement where 64 CUs can now do just as much work as 80 or 72 CUs at slightly higher clocks? Hopefully this is the case.
Yes probably exactly that. 64 CUs, new arch, faster clocks, more cache, and maybe more shading units/cores per CU.
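For a rough sense of why that is at least plausible on paper, here is a hedged back-of-envelope sketch. The 64 SPs per CU, the FMA-counts-as-two convention, and the ~2.25 GHz game clock assumed for an 80-CU RDNA 3 part are my own assumptions, not figures from this thread:

```python
# Compare raw FP32 throughput of a rumored 64-CU part at 3.10 GHz with a
# hypothetical 80-CU part at ~2.25 GHz. All inputs are assumptions.
def fp32_tflops(cus: int, clock_ghz: float, sp_per_cu: int = 64) -> float:
    """Theoretical FP32 TFLOPS, counting one FMA per SP per clock as 2 ops."""
    return cus * sp_per_cu * 2 * clock_ghz / 1000

print(f"64 CUs @ 3.10 GHz: ~{fp32_tflops(64, 3.10):.1f} TFLOPS")
print(f"80 CUs @ 2.25 GHz: ~{fp32_tflops(80, 2.25):.1f} TFLOPS")
# Raw throughput comes out similar, so any real generational gain would have
# to come from per-clock (architectural) improvements rather than CU count.
```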
#88
Vayra86
Mr_EngineerYes probably exactly that. 64 CUs, new arch, faster clocks, more cache, and maybe more shading units/cores per CU.
Sounds like a fantasy though, that.
#89
DemonicRyzen666
marios15The GPU market is really sad since the pandemic+AI boom.
We went from a new GPU architecture with lower power or higher performance and similar prices every 12-18 months, to new GPUs (not necessarily a new arch) every 18-36 months with the same or lower performance, but the new GPUs can render 720p much faster, so let's increase the prices

There are no more new GPUs released for $100-250 from AMD or Nvidia... you still get the same 480/1060 performance from 8 years ago though (or is it 290X performance from 11 years ago?)


Edit: just did a quick check on 2008 vs 2016 vs 2024
9800 GTX -> 1080Ti =11x
1080Ti -> 4090 = 3.3x


Everything else is just brainwashed kids and marketing bots
That's a much larger time frame from 9800 GTX to the 1080 Ti.
You would have to go from the GTX 680 to the 1080 Ti to match the same time frame as the 1080 Ti to the RTX 4090. It's basically about the same 3.3x increase.

The main difference is the price: the GTX 680 had a price of $499 and the 1080 Ti had a price of $699. That's only 40% more.
Then you go from the 1080 Ti at $699 to the RTX 4090, which is 2.2 times more expensive. Even with inflation, the Nvidia RTX cards are all horribly priced and always have been since their introduction.
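For what it's worth, here is a quick sketch of that arithmetic. The launch MSRPs and the rough inflation factors below are assumed ballpark figures for illustration, not numbers taken from this thread:

```python
# Nominal launch-price ratios plus a rough inflation adjustment.
# MSRPs and CPI factors to 2022 dollars are assumed approximations.
msrp = {"GTX 680": 499, "GTX 1080 Ti": 699, "RTX 4090": 1599}
cpi_to_2022 = {"GTX 680": 1.28, "GTX 1080 Ti": 1.20, "RTX 4090": 1.00}

print(f"GTX 680 -> 1080 Ti: {msrp['GTX 1080 Ti'] / msrp['GTX 680']:.2f}x the launch price")
print(f"1080 Ti -> RTX 4090: {msrp['RTX 4090'] / msrp['GTX 1080 Ti']:.2f}x the launch price")

for card, price in msrp.items():
    print(f"{card}: ${price} at launch, roughly ${price * cpi_to_2022[card]:.0f} in 2022 dollars")
```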

The RX 9070 is only showing to come close to the RX 7900 GRE, but with a 20% increase in RT fps, while having the same total CU count. AMD did the same thing with RDNA3: they gave 20% more RT cores and barely made it faster than their RDNA2 by cheaping out on shaders and RT cores. Meanwhile, every single one of Nvidia's generations has increased both shaders and RT cores by far more than 20%.
The card is horribly overpriced at $649, or even $449. That card is so slow compared to the 5000 series that it should be a short release of 6 to 8 months; there is almost no reason to release it at all.
The price should be $350; it's last-generation-tier performance and it will still be behind.
#90
Frick
Fishfaced Nincompoop
Bomby569please go ahead and read what they wrote about the price, what was the situation when they wrote that. You want so much to drive your point you didn't even read it.
"AMD hasn't officially dropped the price of the RX 6950 XT to $599, at least as far as we're aware — the AMD store lists the reference card at $699 (and it's out of stock). But Newegg, Amazon, and others have regularly had RX 6950 XT cards priced in the $599–$699 range for several months, basically since the RX 7900 XT launched. Will such cards continue to exist, now that the RTX 4070 launch is over? Perhaps until they're all sold, or until a lower tier RX 7800- or 7700-series card comes along with similar performance (and hopefully at a better price).

And that's the problem. One card has an official $599 MSRP and should remain at that price point going forward, the other got a price promotion to hit $599 and those cards seem to have disappeared now. The best we can find is $619 at present. Sure, it's "only" $20 extra, but it's also for roughly equivalent performance across our gaming suite, and again we favor Nvidia's GPU across a larger suite of testing that factors in DLSS and AI.

If you can find an RX 6950 XT for the same price as an RTX 4070, it's worth considering, but that's not a decisive win. And in fact, since you'll also use about 100–150 watts more power while gaming with the RX 6950 XT, depending on how much you play games, the ongoing cost of owning the RX 6950 XT could also be higher."
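To put a number on the power-draw point from that quote, here is a minimal sketch; the daily gaming hours and the electricity price are assumptions for illustration only:

```python
# Rough yearly cost of the ~100-150 W gaming power gap mentioned in the quote.
extra_watts = 125        # midpoint of the quoted 100-150 W difference
hours_per_day = 2        # assumed average gaming time
usd_per_kwh = 0.15       # assumed electricity price

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"~{extra_kwh_per_year:.0f} kWh extra per year, "
      f"about ${extra_kwh_per_year * usd_per_kwh:.0f} per year at ${usd_per_kwh}/kWh")
```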

I genuinely don't know what we're arguing about anymore.
#91
Dr. Dro
FrickI genuinely don't know what we're arguing about anymore.
I stopped following some time ago, tbh. These always devolve into some victim complex where a pitiful one is facing a great evil, followed by a stream of self-reassuring posts and brand-loyalty remarks; it's amusing at first, but it gets old fast.
#92
Squared
Why is AMD making a 0-class GPU? The lowest class last generation was 6 (7600).
#93
Macro Device
SquaredWhy is AMD making a 0-class GPU?
Because they started realistically anticipating the sales.

No one but about a couple dozen ludomaniacs wants a 330 W midrange krankenachtung that's pretending to be a GPU whilst being open about being an AMD product. Catching up with upper midrange Ada in terms of RT performance could have been an achievement three years ago when Ada GPUs didn't exist. In this day and age, this is like a postman finally delivering you a fax machine. Cool but you ordered that a lifetime ago.

This is a loop of underdelivery. AMD promise 7th heaven but what you see on shelves is shitposting. Then they don't get good revenue and once again promise sky high quality but then the product/feature comes out a couple years too late and it's still worse than what competition had readily available when AMD only were running an ad campaign. Disregard facts, pretend it's a part of a genius plan, get set, repeat.

What AMD need is to become pounders, not smacktalkers. Or to sell their graphics division to someone who ACTUALLY cares.
#94
AusWolf
Macro DeviceBecause they started realistically anticipating the sales.

No one but about a couple dozen ludomaniacs wants a 330 W midrange krankenachtung that's pretending to be a GPU whilst being open about being an AMD product. Catching up with upper midrange Ada in terms of RT performance could have been an achievement three years ago when Ada GPUs didn't exist. In this day and age, this is like a postman finally delivering you a fax machine. Cool but you ordered that a lifetime ago.

This is a loop of underdelivery. AMD promise 7th heaven but what you see on shelves is shitposting. Then they don't get good revenue and once again promise sky high quality but then the product/feature comes out a couple years too late and it's still worse than what competition had readily available when AMD only were running an ad campaign. Disregard facts, pretend it's a part of a genius plan, get set, repeat.

What AMD need is to become pounders, not smacktalkers. Or to sell their graphics division to someone who ACTUALLY cares.
I don't think RT is that important as long as games are largely built around consoles, and consoles are built largely around AMD. You can see that in most games being very conservative with forced RT options and rather treating it as an optional feature.
#95
Macro Device
AusWolfas long as games largely build around consoles, and consoles build largely around AMD.
Sony explicitly threatened to ditch AMD if the latter won't improve RT. Console gaming also demands RT, but AMD don't deliver on this, so that is the reason why console games ain't exactly the pinnacle of rays and bounces. RT is going to become more and more popular in game development regardless of what you or I think about it.

Not my point anyway. My point is that whatever AMD made feature-wise in the last decade was at least a year later than the green equivalent and never caught up in quality. FSR from almost 2025 is still worse than DLSS from 2020. "But it's brand agnostic and open source and shiet" - why should I care if it... doesn't improve my experience, or does, but similar performance at similar quality can be achieved on a cheaper green GPU, because DLSS P beats FSR Q not only in speed but also in quality in almost all games? RT from almost 2025 is still worse than on an RTX 3090, a 2020 GPU. Professional workload performance is still worse, too. AMD frame generation is still worse than what NVIDIA had on day 0. It's the same with everything.
#96
AusWolf
Macro DeviceSony explicitly threatened to ditch AMD if the latter won't improve RT. Console gaming also demands RT, but AMD don't deliver on this, so that is the reason why console games ain't exactly the pinnacle of rays and bounces. RT is going to become more and more popular in game development regardless of what you or I think about it.

Not my point anyway. My point is that whatever AMD made feature-wise in the last decade was at least a year later than the green equivalent and never caught up in quality. FSR from almost 2025 is still worse than DLSS from 2020. "But it's brand agnostic and open source and shiet" - why should I care if it... doesn't improve my experience, or does, but similar performance at similar quality can be achieved on a cheaper green GPU, because DLSS P beats FSR Q not only in speed but also in quality in almost all games? RT from almost 2025 is still worse than on an RTX 3090, a 2020 GPU. Professional workload performance is still worse, too. AMD frame generation is still worse than what NVIDIA had on day 0. It's the same with everything.
I can't really add much to the topic other than the fact that I don't give a rat's arse about DLSS or FSR, let alone frame generation, which is buggy as hell most of the time and shoots your input lag into the sky every time, except when you're working with a high enough base frame rate, in which case you don't need FG in the first place.

The fact that AMD fails to deliver on bullshit gimmicks invented by Nvidia to artificially divide the market bears no significance in my eyes.
#97
DaemonForce
john_That's just an excuse for not making enthusiast cards. Enthusiast cards also sell midrange and low-end cards.
Hello? NOT an appropriate position. AMD is making a massive pivot with RDNA+CDNA unification, returning to the kind of product stack that was the best standard. They have let us know this is the last stop before the big change. So who is expected to buy the new range of cards? We don't even have a handle on that. Could be guys like me on RX480/580 that buy once every 5-10 years, try to escape as it gets long in the tooth then run into problems on any new shift. Maybe it's for people from the nVidia camp that have been in it too long and want to test the AMD waters. We all have the idea the next flagship is going to be a really small die and then the next round of enthusiast cards are set to ship much later under UDNA, which will most likely have MUCH larger dies that are likely comparable to the 6950XT the way AMD has been doing packaging the last few generations. Supposedly the kind of cut we see between Pascal and Volta (Tesla).

Imagine calling for enthusiast cards on a new process with last gen's memory and speeds while some very serious changes are in the works for hardware and software on a distant scheduled release that is likely going to be double or triple the die size (mm²) with the promise of significantly less headache. What I'm saying is the 8000/9000 rollout is likely going to be the redheaded stepchild of the bunch and then BAM we get something completely insane like Tesla V100 die sizes. The small chip will be really cool but if it's the designated capstone of this era, that party is going to be really short lived.
john_RDNA 4 is probably a fixed RDNA 3 plus the RT improvements SONY demanded from them to remain a customer. AMD's only new problem is Intel, but I think they will try to ignore Intel for now.
This is not even a guess. RDNA3 is a complete product stack and RDNA4 is some supposed "bugfix" even though there's no new information on features. Everybody that has bought a 7000 series card has already settled in with whatever features/issues, and they're good for the next couple of years if not the rest of the decade. Those guys are all set. Weird miner products like the BC-250 have already exposed the reality of PS5 hardware, and we're still in this really stupid Mexican standoff with GPUs as none of the three companies are producing anything that competes with one another. We don't have enough information on how this goes, and it gets shakier with each dumb leak thread. Guess we'll have to wait until next week to be sure.
Frick"AMD hasn't officially dropped the price of the RX 6950 XT to $599, at least as far as we're aware — the AMD store lists the reference card at $699 (and it's out of stock). But Newegg, Amazon, and others have regularly had RX 6950 XT cards priced in the $599–$699 range for several months
$500-550ish *with water block
The people that waited on 6800XT-6950XT price drops have been snackin GOOD these past several months. The guys picking up 6000 series mid-range are all happy.
I would consider it too but there's this nagging voice telling me it's cheaper to go with newer product and for creator features I don't need, like AV1 encode and a buffer over 16GB.
I'll never need these on desktop since 1080p144 is my max but the moment I hop into VR or some 3D kit it's an immediate jump like dual 4K. There's no ignoring it anymore.
But...A lot can happen in a year. We could see weird price hikes and drops here and there, maybe a whole new semiconductor or new manufacturing technology. We gonn' find out.
#98
Super XP
These upcoming GPUs should be priced at no more than $499 USD. Anything higher and AMD will have issues trying to capture new market share, IMO. They might end up losing market share. RDNA3 was already overpriced, hence the current market share situation.

If AMD wants to gain, cater to the mainstream market for this GPU launch.
Bomby569price will be the most important factor here, and we all know AMD will decide it after Nvidia releases theirs, and that's where AMD's problem is.
I agree, as they priced themselves out of the market with RDNA3. And as we see today, they continue to lose market share against Ngreedia. As I mentioned before, the top-of-the-line mainstream RDNA4 GPU, the RX 9070 XT, must not exceed $499 USD. If it does, AMD is not interested in gaining market share.
BTW, I hate the 9070 naming scheme. They should have stuck with 8700 XT or 9700 XT.
#99
Soul_
TumbleGeorgeThis seems to be forced much north of the efficiency region.
When was the last time you bought a GPU that was tuned to run in the "efficiency" region? Gone are those days.
#100
AcE
Dr. DroI stopped following some time ago, tbh. These always devolve into some victim complex where a pitiful one is facing a great evil, followed by a stream of self-reassuring posts and brand-loyalty remarks; it's amusing at first, but it gets old fast.
You’re also firmly part of those users. ;) Ironic.
Macro DeviceSony explicitly threatened to ditch AMD if the latter won't improve RT.
Source for that? (X) never happened.