Thursday, December 26th 2024
AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W
AMD's upcoming Radeon RX 9070 XT graphics card can boost its engine clock up to 3.10 GHz, according to a new leak that surfaced on ChipHell. Depending on the board design, its total board power can reach up to 330 W, the leak adds. The GPU should come with a very high base frequency, with the leaker claiming 2.80 GHz (which can be interpreted as the Game clock), and the GPU boosting itself up to 3.10 GHz when power and thermals permit. The RX 9070 XT will be the fastest graphics card from AMD based on its next-generation RDNA 4 graphics architecture. The company isn't targeting the enthusiast segment with this card, but rather the performance segment, where it is expected to go up against NVIDIA's GeForce RTX 5070 series.
RDNA 4 is expected to introduce massive generational gains in ray tracing performance, as AMD is rumored to have significantly reworked its ray tracing hardware to reduce the performance cost of ray tracing. However, as it stands, the "Navi 48" silicon that the RX 9070 XT is based on is still a performance-segment chip, succeeding "Navi 32" and "Navi 22," with a rumored compute unit count of 64, or 4,096 stream processors. Performance-related rumors swing wildly. One set of rumors says that the card's raster graphics performance is in the league of the RX 7900 GRE, but with ray tracing performance exceeding that of the RX 7900 XTX; another set says it beats the RX 7900 XT in raster performance and sneaks up on the RTX 4080. We'll know for sure in about a month's time.
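As a rough sanity check, the leaked shader count and boost clock imply a peak FP32 throughput in the mid-20s of TFLOPS. A minimal sketch, assuming the standard 64 stream processors per RDNA compute unit and counting an FMA as two operations (all figures below come from the leak, not confirmed specs):

```python
# Back-of-the-envelope FP32 throughput from the leaked RX 9070 XT figures.
# All inputs are rumored values from the ChipHell leak, not confirmed specs.
compute_units = 64
sp_per_cu = 64                 # assumed standard RDNA layout: 64 stream processors per CU
boost_clock_ghz = 3.10         # leaked maximum boost clock
ops_per_sp_per_clock = 2       # one FMA counted as two floating-point operations

tflops = compute_units * sp_per_cu * boost_clock_ghz * ops_per_sp_per_clock / 1000
print(f"{compute_units * sp_per_cu} stream processors -> ~{tflops:.1f} TFLOPS FP32 at boost")
# ~25.4 TFLOPS at the leaked boost clock. RDNA 3's dual-issue SIMDs would double
# the headline number, but sustained throughput depends on the workload.
```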
Sources:
ChipHell Forums, HXL (Twitter), VideoCardz
102 Comments on AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W
or "AMD Radeon RX 9070 XT Boosts up to 3.10 GHz"(but that's barely +5% VS 6000 series)
Even though AMD announced that they pulled out of the high end, and even if their marketing department won't advertise this as a mid-range card, the narrative is "9070 XT to compete with the 5090" - then you get 40% lower performance and it's a disappointment, even if it's priced accordingly
Or it's another "Poor Volta" moment
When it comes to GPUs, people just outright refuse to believe that inflation is real. This is why I wish AnandTech hadn't gone to crap; a circuit analysis of an MCD would be fascinating. I'd bet good money that if the chips were on the same 5 nm node as the GPU, and you didn't need those MCM interconnects, the memory controllers would be a lot smaller and the total die size would be a lot closer to the 4080's.
The decisions they made to implement things like a 384-bit interface with 12 memory chips, or the power target (because as we have come to know, it scales miserably past its default 330 to 350 W range, even if you pump 475 W+ into it), are not taken lightly when designing a graphics card; they are very much aware of their product's strengths and weaknesses. Yet the end result is that both it and the 7900 XT visibly had to be subjected to pre-launch price cuts once they saw that they stood no chance in hell against the RTX 4090 despite being late, especially with the horribly broken launch drivers (which they knew were a problem at the time).
I guarantee you, if the RTX 4080 had come first by itself and the 4090 had been delayed for no more than a month, the 7900 XTX would have had a $1,499 MSRP (justifying itself against the 4080 by having 24 GB over 16 GB), and they would have placed the 7900 XT at $1,199 while telling people they could still get 20 GB and almost as much performance, still sounding like a good deal.
You would have to go from the GTX 680 to the 1080 Ti to match the same time frame as the 1080 Ti to the RTX 4090. That's roughly a 3.3× performance increase.
The main difference is the price: the GTX 680 launched at $499 and the 1080 Ti at $699. That's only 40% more.
Then, going from the 1080 Ti at $699, the RTX 4090 is roughly 2.3 times more expensive. Even with inflation, the NVIDIA RTX cards are all horribly priced and always have been since their introduction.
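For what it's worth, the ratios check out. Here is a quick sketch of the arithmetic, where the launch MSRPs are public figures but the cumulative CPI multipliers are rough assumptions on my part:

```python
# Quick check of the launch-price ratios discussed above.
# MSRPs are public launch prices; CPI multipliers are rough assumptions.
gtx_680 = 499       # 2012 launch MSRP
gtx_1080_ti = 699   # 2017 launch MSRP
rtx_4090 = 1599     # 2022 launch MSRP

print(f"680 -> 1080 Ti: {gtx_1080_ti / gtx_680:.2f}x the price")    # ~1.40x
print(f"1080 Ti -> 4090: {rtx_4090 / gtx_1080_ti:.2f}x the price")  # ~2.29x

# Inflation adjustment (assumed approximate cumulative US CPI factors):
cpi_2012_to_2024 = 1.37  # assumption: ~37% cumulative inflation, 2012 -> 2024
cpi_2017_to_2024 = 1.28  # assumption: ~28% cumulative inflation, 2017 -> 2024
print(f"GTX 680 in 2024 dollars: ~${gtx_680 * cpi_2012_to_2024:.0f}")     # ~$684
print(f"1080 Ti in 2024 dollars: ~${gtx_1080_ti * cpi_2017_to_2024:.0f}") # ~$895
# Even adjusted for inflation, the 4090's $1,599 sits well above either figure.
```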
The RX 9070 is only showing to barely come close to the RX 7900 GRE, with about a 20% increase in RT fps, while having the same total CU count. AMD did the same thing with RDNA3: they gave 20% more RT cores and barely made it faster than RDNA2 by cheaping out on shaders and RT cores. Meanwhile, every single one of NVIDIA's generations has increased both shaders and RT cores by far more than 20%.
The card is horribly overpriced at $649, or even at $449. That card is so slow compared to the 5000 series that it should be a short release of 6 to 8 months; there is almost no reason to release it at all.
The price should be $350. It's last-generation-tier performance, and it will still be behind.
"And that's the problem. One card has an official $599 MSRP and should remain at that price point going forward, the other got a price promotion to hit $599 and those cards seem to have disappeared now. The best we can find is $619 at present. Sure, it's "only" $20 extra, but it's also for roughly equivalent performance across our gaming suite, and again we favor Nvidia's GPU across a larger suite of testing that factors in DLSS and AI.
If you can find an RX 6950 XT for the same price as an RTX 4070, it's worth considering, but that's not a decisive win. And in fact, since you'll also use about 100–150 watts more power while gaming with the RX 6950 XT, depending on how much you play games, the ongoing cost of owning the RX 6950 XT could also be higher."
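That ongoing-cost point is easy to sanity-check. A minimal sketch, assuming the midpoint of the quoted 100–150 W gap, an assumed $0.15/kWh electricity rate, and an assumed two hours of gaming per day:

```python
# Rough estimate of the extra electricity cost of a ~125 W higher gaming draw.
# The 125 W figure is the midpoint of the quoted 100-150 W range; the rate
# and hours-per-day values are assumptions - adjust for your own situation.
extra_watts = 125
rate_per_kwh = 0.15   # assumed electricity price, $/kWh
hours_per_day = 2     # assumed daily gaming time

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate_per_kwh
print(f"~{kwh_per_year:.0f} kWh/year extra -> about ${cost_per_year:.0f}/year")
# ~91 kWh/year, roughly $14/year under these assumptions; heavier play time or
# higher rates scale that linearly, eating into a small upfront price advantage.
```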
I genuinely don't know what we're arguing about anymore.
No one but about a couple dozen ludomaniacs wants a 330 W midrange krankenachtung that's pretending to be a GPU whilst being open about being an AMD product. Catching up with upper midrange Ada in terms of RT performance could have been an achievement three years ago when Ada GPUs didn't exist. In this day and age, this is like a postman finally delivering you a fax machine. Cool but you ordered that a lifetime ago.
This is a loop of underdelivery. AMD promise 7th heaven but what you see on shelves is shitposting. Then they don't get good revenue and once again promise sky high quality but then the product/feature comes out a couple years too late and it's still worse than what competition had readily available when AMD only were running an ad campaign. Disregard facts, pretend it's a part of a genius plan, get set, repeat.
What AMD need is to become pounders, not smacktalkers. Or to sell their graphics division to someone who ACTUALLY cares.
Not my point anyway. My point is that whatever AMD made last decade, feature-wise, was at least a year later than the green equivalent and never caught up in quality. FSR from almost 2025 is still worse than DLSS from 2020. "But it's brand agnostic and open source and shiet" - why should I care if it doesn't improve my experience, or does, but similar performance at similar quality can be achieved on a cheaper green GPU, because DLSS P beats FSR Q not only in speed but also in quality in almost all games? RT from almost 2025 is still worse than on the RTX 3090, a 2020 GPU. Professional workload performance is still worse, too. AMD frame generation is still worse than what NVIDIA had on day 0. It's the same story with everything.
The fact that AMD fails to deliver on bullshit gimmicks invented by Nvidia to artificially divide the market, bears no significance in my eyes.
Imagine calling for enthusiast cards on a new process with last gen's memory and speeds while some very serious changes are in the works for hardware and software on a distant scheduled release that is likely going to be double or triple the die size (mm²), with the promise of significantly less headache. What I'm saying is the 8000/9000 rollout is likely going to be the redheaded stepchild of the bunch, and then BAM, we get something completely insane like Tesla V100 die sizes. The small chip will be really cool, but if it's the designated capstone of this era, that party is going to be really short-lived.
This is not even a guess. RDNA3 is a complete product stack and RDNA4 is some supposed "bugfix," even though there's no new information on features. Everybody that has bought the 7000 series has already settled in with whatever features/issues, and they're good for the next couple of years if not the rest of the decade. Those guys are all set.
Weird miner products like the BC-250 have already exposed the reality of PS5 hardware, and we're still in this really stupid MX standoff with GPUs, as none of the three companies are producing anything that competes with one another. We don't have enough information on how this goes, and it gets shakier with each dumb leak thread. Guess we'll have to wait until next week to be sure.
$500-550ish *with water block
The people that waited on 6800XT-6950XT price drops have been snackin GOOD these past several months. The guys picking up 6000 series mid-range are all happy.
I would consider it too, but there's this nagging voice telling me it's cheaper to go with a newer product, with creator features I don't need, like AV1 encode and a buffer over 16 GB.
I'll never need these on desktop since 1080p144 is my max, but the moment I hop into VR or some 3D kit, it's an immediate jump to something like dual 4K. There's no ignoring it anymore.
But... a lot can happen in a year. We could see weird price hikes and drops here and there, maybe a whole new semiconductor or a new manufacturing technology. We gonn' find out.
If AMD wants to gain, it should cater to the mainstream market for this GPU launch. I agree, as they priced themselves out of the market with RDNA3, and as we see today, they continue to lose market share to Ngreedia. As I mentioned before, the top-of-the-line mainstream RDNA4 GPU, the RX 9070 XT, must not exceed $499 USD. If it does, AMD is not interested in gaining market share.
BTW, I hate the 9070 naming scheme. They should have stuck with 8700 XT or 9700 XT.