Thursday, December 26th 2024

AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W
AMD's upcoming Radeon RX 9070 XT graphics card can boost its engine clock up to 3.10 GHz, according to a new leak that surfaced on ChipHell. Depending on the board design, its total board power can reach up to 330 W, the leak adds. The GPU should also come with a very high base frequency for the engine clock, with the leaker claiming 2.80 GHz (which can be interpreted as the Game clock), and the card boosting up to 3.10 GHz when power and thermals permit. The RX 9070 XT will be the fastest graphics card from AMD based on its next-generation RDNA 4 graphics architecture. The company isn't targeting the enthusiast segment with this card, but rather the performance segment, where it is expected to go up against NVIDIA's GeForce RTX 5070 series.
RDNA 4 is expected to introduce massive generational gains in ray tracing performance, as AMD is rumored to have significantly reworked its ray tracing hardware to reduce the performance cost of the feature. However, as it stands, the "Navi 48" silicon that the RX 9070 XT is based on is still a performance-segment chip, succeeding "Navi 32" and "Navi 22," with a rumored compute unit count of 64, or 4,096 stream processors. Performance-related rumors swing wildly. One set of rumors says that the card's raster graphics performance is in the league of the RX 7900 GRE, but with ray tracing performance exceeding that of the RX 7900 XTX; another set says it beats the RX 7900 XT in raster performance and sneaks up on the RTX 4080. We'll know for sure in about a month's time.
Sources:
ChipHell Forums, HXL (Twitter), VideoCardz
167 Comments on AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W
You would have a point if the tables were flipped and over 90% of the dGPU market share was not agreeing with me, but alas. I also have first-hand experience of the kind you could not hope to recount.
RT is becoming more and more baked into gaming. Of course it's still very, very far from ideal, but it's leagues more powerful than SSR/baked lighting/whatnot. I won't be surprised if every single AAA title of 2030 doesn't allow you any pure raster and has non-PT Cyberpunk/Alan Wake-level RT as its basic mode, with ultra settings going far beyond that.
And when most gamers don't own a 7900 XTX-level GPU, native resolution performance won't be good, so you've got to resort to some sort of upscaling. And no matter how much we hate the fact that games are poorly optimised and devs just expect you to tick the box anyway, FSR does this job worse. End of story.
P.S. You can use both DLSS and FSR at 100% scaling, so you play at true native resolution using more advanced AA than naked TAA, and you know what, FSR is so far behind that it's even better to play 1080p@DLAA than 1440p@FSR100. Not in all games, but in most of them.
There's no right answer to that which applies to everyone.
It's also that upscaling is marketed as something that improves your experience by adding performance (which is exactly what lowering graphics settings does, too), and not as something that blurs your image by rendering at a lower resolution. People don't know what upscaling is - they just think it's free performance, when in reality, no performance is free.
Speaking from the perspective of someone who thinks upscaling is great: I can take an imperceptible loss in clarity (I don't see this blur) and trade it for increased visuals, sometimes even a generational difference in visuals. There's a reason people use phrases like "free fps", because it can absolutely feel that way. Will that hold true for everyone? Of course not. Never mind our own tastes, everyone's setup is unique too. I don't think anyone is wrong or 'stupid' to game the way they do, but I get the impression that, at least relative to this forum, I give people a bit more credit than being the easily influenced sheep some (not necessarily you specifically) call them.
My screen is 1080p, right? Just plain 1920x1080. Recent games are made with 4K in mind and textures are optimised for that resolution. I enable virtual super resolution (usually at 3072x1728 or 3200x1800, because I can't tell them apart from 4K and it's too taxing to run full 4K anyway), then apply some upscaling (usually XeSS at 59%, aka "Quality") and get a vastly superior static image with only forgivable dynamic artifacts compared to just staying at native 1080p. Yes, I can see the advantages of going 3K on a 1080p display.
This is a much more powerful tool than it appears at first glance.
They also help a lot with games where 120+ FPS is REALLY what the doctor ordered but there's no way to achieve it otherwise.
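A rough sketch of the resolution math behind that setup (a minimal example; the ~59% per-axis scale factor is taken from the post above, and exact numbers vary per game):

# Rough sketch of the VSR + upscaling math described above (illustrative only).
vsr_w, vsr_h = 3200, 1800                      # virtual super resolution target
scale = 0.59                                   # XeSS "Quality" per-axis scale, as quoted above
render_w, render_h = int(vsr_w * scale), int(vsr_h * scale)
print(render_w, render_h)                      # 1888 x 1062 internal render resolution
print(round((render_w * render_h) / (1920 * 1080), 2))  # ~0.97x the pixels of plain 1080p

So the internal render cost stays roughly at native-1080p levels while the output benefits from the higher-resolution reconstruction target.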
The 9070's bandwidth is 640 GB/s.
TBP is 330 W, probably for custom models, and it's not TDP.
It's an xx70-series card, not xx80.
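For what it's worth, 640 GB/s lines up with the commonly rumored 256-bit bus and 20 Gbps GDDR6; a quick sanity check (the memory configuration is a rumor, not a confirmed spec):

# Bandwidth from rumored memory config: bus width (bits) x speed per pin (Gbps) / 8.
bus_width_bits = 256      # rumored memory bus width
gbps_per_pin = 20         # rumored GDDR6 data rate
print(bus_width_bits * gbps_per_pin / 8)   # 640.0 GB/s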
Other than that, I agree. 4096 cores at ~3 GHz should perform similarly to 5120 cores at ~2.4 GHz, putting the 9070 on par with the 7900 GRE, unless there is some huge magic IPC gain lurking around somewhere.
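The shader math behind that comparison, assuming the usual 2 FLOPs per core per clock and ignoring dual-issue and any RDNA 4 IPC changes:

# Theoretical FP32 throughput: cores x 2 ops/clock x clock (GHz) / 1000 -> TFLOPS.
def tflops(cores, ghz):
    return cores * 2 * ghz / 1000

print(tflops(4096, 3.0))   # ~24.6 TFLOPS (rumored 9070 XT shader count and clock)
print(tflops(5120, 2.4))   # ~24.6 TFLOPS (a 7900 GRE-like configuration)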
Also, in TPU's relative performance numbers, the GRE is 78% faster than the RX 7600 XT (2,048 cores, max clocks at 2.75 GHz).
If we assume that the 9070 simply doubles the resources of the 7600 XT (cores/ROPs/cache etc.), that RDNA 4 has slightly higher IPC, and that it runs at 3 GHz, the card should perform at at least 200% of the RX 7600 XT, which puts it closer to the RTX 4070 Ti Super and even the RX 7900 XT.
This is how I believe the card may perform, at least at 1080p and 1440p; it'll likely fall behind at 4K.
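A rough version of that scaling estimate in numbers (the IPC uplift is a pure guess, and doubling resources rarely doubles real-world performance):

# Rough scaling estimate relative to the RX 7600 XT (all inputs are rumors or guesses).
gre_vs_7600xt = 1.78                  # TPU relative performance quoted above
clock_scaling = 3.0 / 2.75            # ~3 GHz vs the 7600 XT's ~2.75 GHz max clock
ipc_guess = 1.05                      # hypothetical small RDNA 4 IPC uplift
estimate = 2 * clock_scaling * ipc_guess   # doubled cores/ROPs/cache
print(round(estimate, 2))             # ~2.29x the 7600 XT, roughly RX 7900 XT territory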
We'll see about the IPC improvements (if any), and the promised better RT. I'm not entirely hopeless, just trying to look at it as realistically as possible.
I don't know how sensible they can make the 9070. But if it's relatively cheap and has a good power budget for what it's delivering, it could become popular. I'm not too convinced that that'll happen, but we'll see.
For me, the RX 7900 GRE is quite a boring GPU because the performance difference versus the RX 7800 XT is almost invisible. It would be a shame if the RX 9070 XT performs the same as the RX 7900 GRE (a hair-splitting boost).
If the RX 9070 XT performs something like the RX 7900 XT/RTX 4070 Ti Super at $540-580, then I would consider buying one. If the RX 9070 XT fails, then there are options on the used market: RX 7900 XTX or RTX 4070 Ti Super.
I"ll still go nvidia."
Corrected it for you to be more truthful
Steam says we don't play 2024 games in 2024
If you're too lazy to read even the headline, it's that most users in 2024 (85%) didn't play any games released in 2024.
What exactly does this say about gaming? In my world, the clear truth is that most people don't value RT because they literally cannot use it. That's pretty obvious, but you're welcome to debate the whole "if they had it, they'd use it" equivocation. Now...combine that with the B580 absolutely killing it over this holiday season, and the conclusion is pretty clear. What consumers want is games that are fun to play, not driven by any "message," and they want hardware that they don't have to sell a kidney for. It's my belief that if AMD can come in with the 9070 XT...or the 8800...or whatever the official name comes out being, they can beat Nvidia and get that market share back.
I'm sure that plenty of people will argue that the 5080 will trounce the 9070...and I don't care. I can rock 4K worth of pixels with 144 Hz performance on a 3080...let alone anything higher. It was priced at an MSRP of 699 USD. This was pre-pandemic, when Nvidia and AMD had to actually compete for market share instead of printing money with "AI" stickers being worth hundreds of dollars once slapped onto a GPU...and I think AMD has had that come-to-Jesus moment and decided to stop competing for the halo market, which cannot be profitable because there's so little volume.
You are welcome to hate AMD, Nvidia, and Intel. That said, this is about not accepting that they are charging egregious prices for generational gaps that are often not down to an investment in technology, but to the development of shiny new things that literally don't benefit 85% of the community...and that's a generous interpretation. If the halo 5% of that 15% is who can actually purchase high-end hardware, then you're looking at 0.75% of people. That's just silly to focus on, when development costs amortized over 2k cards instead of 100k cards require each card to carry 50 times the overhead. Remember kids, overhead costs can kill businesses and profitability faster than most other bad decisions.
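The arithmetic in that last bit is easy to sketch (the $50M development figure is hypothetical, just to illustrate the 50x relationship):

# Share of buyers: the halo 5% of the 15% who actually play current releases.
print(0.05 * 0.15)                      # 0.0075 -> 0.75% of the community
# Fixed development cost spread over unit volume (the $50M figure is hypothetical).
fixed_cost = 50_000_000
print(fixed_cost / 100_000)             # $500 of overhead per mainstream card
print(fixed_cost / 2_000)               # $25,000 per halo card -- 50x as much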