Thursday, December 26th 2024

AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

AMD's upcoming Radeon RX 9070 XT graphics card can boost its engine clock up to 3.10 GHz, says a new leak that surfaced on ChipHell. Depending on the board design, its total board power can reach up to 330 W, the leak adds. The GPU should come with a very high base frequency for the engine clock: the leaker claims a 2.80 GHz base frequency (which can be interpreted as the Game clock), with the GPU boosting up to 3.10 GHz when power and thermals permit. The RX 9070 XT will be the fastest graphics card from AMD based on its next-generation RDNA 4 graphics architecture. The company isn't targeting the enthusiast segment with this card, but rather the performance segment, where it is expected to go up against NVIDIA's GeForce RTX 5070 series.

RDNA 4 is expected to introduce massive generational gains in ray tracing performance, as AMD is rumored to have significantly reworked its ray tracing hardware to reduce the performance cost of ray tracing. However, as it stands, the "Navi 48" silicon that the RX 9070 XT is based on is still a performance-segment chip, succeeding "Navi 32" and "Navi 22," with a rumored compute unit count of 64, or 4,096 stream processors. Performance-related rumors swing wildly. One set of rumors says that the card's raster graphics performance is in the same league as the RX 7900 GRE but with ray tracing performance exceeding that of the RX 7900 XTX; another says it beats the RX 7900 XT in raster performance and sneaks up on the RTX 4080. We'll know for sure in about a month's time.
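For a rough sense of what those rumored figures imply, here is a back-of-the-envelope sketch. It assumes the usual RDNA layout of 64 stream processors per compute unit and 2 FP32 operations per stream processor per clock (dual-issue, if carried over from RDNA 3, would change the picture); the clocks are the ones from the leak.

```python
# Back-of-the-envelope FP32 throughput from the rumored "Navi 48" figures.
# Assumptions: 64 stream processors per compute unit, 2 FP32 ops per SP per clock.

def fp32_tflops(compute_units: int, clock_ghz: float,
                sp_per_cu: int = 64, flops_per_sp: int = 2) -> float:
    stream_processors = compute_units * sp_per_cu
    return stream_processors * flops_per_sp * clock_ghz / 1000  # GHz * ops -> TFLOPS

cus = 64                                  # rumored compute unit count
print(cus * 64)                           # 4096 stream processors
print(round(fp32_tflops(cus, 2.80), 1))   # ~22.9 TFLOPS at the 2.80 GHz "Game" clock
print(round(fp32_tflops(cus, 3.10), 1))   # ~25.4 TFLOPS at the 3.10 GHz boost clock
```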
Sources: ChipHell Forums, HXL (Twitter), VideoCardz

114 Comments on AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

#101
Dr. Dro
AcE: You're also firmly part of those users. ;) Ironic.

Source for that? (X) never happened.
Wrong. I am not a fan of any given brand. I simply make it known that I dislike AMD's current approach to graphics cards, and I will happily point out the inconvenient truths and the undesirables; I don't really have to justify myself.

You would have a point if the tables were turned and over 90% of the dGPU market share did not agree with me, but alas. I also have first-hand experience of the kind you could not hope to recount.
Posted on Reply
#102
AcE
Dr. Dro: Wrong. I am not a fan of any given brand.
Again you made a good joke. :) I’m gonna leave it at that.
Posted on Reply
#103
Macro Device
AusWolf: The fact that AMD fails to deliver on bullshit gimmicks invented by Nvidia to artificially divide the market bears no significance in my eyes.
Gimmicks or not, it's part of modern gaming. A part you can't just ignore. AMD could have declared all of it pure nonsense, but for that statement to have some ground under its feet and to bring some actual improvement to the gaming GPU market, AMD should also have created something noteworthy; revolutionary, even. Which never happened. AMD GPUs just barely outperform NVIDIA offerings at pure raster (not in all games, by the way) and lose miserably at everything else. Most notably, RT.

RT is becoming more and more baked into gaming. Of course it's still very, very far from ideal, but it's leagues more powerful than SSR/baked lighting/whatnot. I won't be surprised if every single AAA title of 2030 doesn't allow pure raster at all and has non-PT Cyberpunk/Alan Wake-level RT as its basic mode, with ultra settings going far beyond that.

And since most gamers don't own a 7900 XTX-level GPU, native resolution performance isn't good enough, and you've got to resort to some sort of upscaling. And no matter how much we hate the fact that games are poorly optimised and devs just assume you'll tick the box anyway, FSR does this job worse. End of story.

P.S. You can use both DLSS and FSR at 100% scaling, so you play at true native resolution with more advanced AA than naked TAA, and you know what, FSR is so far behind that it's better to play 1080p@DLAA than 1440p@FSR at 100%. Not in all games, but in most of them.
Posted on Reply
#104
wolf
Better Than Native
Dr. Dro: It is painfully obvious
But they're my favourite brand and it suits me better if I spin it differently!
Posted on Reply
#105
Event Horizon
I'll probably buy one if they've figured out how to eliminate coil whine.
Posted on Reply
#106
AusWolf
Macro Device: Gimmicks or not, it's part of modern gaming. A part you can't just ignore. AMD could have declared all of it pure nonsense, but for that statement to have some ground under its feet and to bring some actual improvement to the gaming GPU market, AMD should also have created something noteworthy; revolutionary, even. Which never happened. AMD GPUs just barely outperform NVIDIA offerings at pure raster (not in all games, by the way) and lose miserably at everything else. Most notably, RT.

RT is becoming more and more baked into gaming. Of course it's still very, very far from ideal, but it's leagues more powerful than SSR/baked lighting/whatnot. I won't be surprised if every single AAA title of 2030 doesn't allow pure raster at all and has non-PT Cyberpunk/Alan Wake-level RT as its basic mode, with ultra settings going far beyond that.

And since most gamers don't own a 7900 XTX-level GPU, native resolution performance isn't good enough, and you've got to resort to some sort of upscaling. And no matter how much we hate the fact that games are poorly optimised and devs just assume you'll tick the box anyway, FSR does this job worse. End of story.

P.S. You can use both DLSS and FSR at 100% scaling, so you play at true native resolution with more advanced AA than naked TAA, and you know what, FSR is so far behind that it's better to play 1080p@DLAA than 1440p@FSR at 100%. Not in all games, but in most of them.
Personally, I'd rather just disable RT and turn my graphics down a notch instead of relying on upscaling, but each to their own. I know using DLSS/FSR with ultra graphics is much more popular than using high or medium graphics at native res; I just can't understand why. I can live with a bit less detail on my shadows, but I can't stand a blurry image.
Posted on Reply
#107
wolf
Better Than Native
AusWolf: I just can't understand why.
Just personal preference, really: some people don't care for the bells and whistles and aren't particularly driven by high-end visuals; others are, and are willing to lean a little on upscaling to get that experience.

There's no right answer to that which applies to everyone.
Posted on Reply
#108
AusWolf
wolf: Just personal preference, really: some people don't care for the bells and whistles and aren't particularly driven by high-end visuals; others are, and are willing to lean a little on upscaling to get that experience.

There's no right answer to that which applies to everyone.
I'm sure it's personal preference for some people. But I also think that reviews have a lot to do with influencing common taste. Since reviews are done with maxed out graphics, I guess people assume that it's the only real way to play the game even if you have to rely on heavy upscaling for acceptable performance.

It's also that upscaling is marketed as something that improves your experience by adding performance (which is exactly what lowering graphics settings does, too), and not as something that blurs your image by rendering at a lower resolution. People don't know what upscaling is - they just think that it's free performance, where in reality, no performance is free.
Posted on Reply
#109
wolf
Better Than Native
@AusWolf I suppose we all make trade-offs; lowering settings to increase fps is a visual trade-off, just like upscaling is. Oftentimes I even do both.

Speaking from the perspective of someone who thinks upscaling is great: I can take an imperceptible loss of clarity (I don't see this blur) and trade it for increased visuals, sometimes even a generational difference in visuals. There's a reason people use statements like "free fps", because it can absolutely feel that way. Will that hold true for everyone? Of course not. Never mind our own tastes; everyone's setup is unique too. I don't think anyone is wrong or 'stupid' to game the way they do, but I get the impression that, at least relative to this forum, I give people a bit more credit than being the easily influenced sheep some (not necessarily you specifically) call them.
Posted on Reply
#110
AusWolf
wolf: @AusWolf I suppose we all make trade-offs; lowering settings to increase fps is a visual trade-off, just like upscaling is. Oftentimes I even do both.
I agree. It's just that the trade-off of lowering some visual settings seems much more acceptable to me than that of enabling upscaling.
wolf: Speaking from the perspective of someone who thinks upscaling is great: I can take an imperceptible loss of clarity (I don't see this blur) and trade it for increased visuals, sometimes even a generational difference in visuals. There's a reason people use statements like "free fps", because it can absolutely feel that way. Will that hold true for everyone? Of course not. Never mind our own tastes; everyone's setup is unique too. I don't think anyone is wrong or 'stupid' to game the way they do, but I get the impression that, at least relative to this forum, I give people a bit more credit than being the easily influenced sheep some (not necessarily you specifically) call them.
Feelings can be influenced. You may be presented with a lower-quality image, but if every major review site says that it's actually better because they analysed every pixel in a static image, you'll doubt yourself. This is where the distinction between an adult human being and a sheep lies. A real person can decide what they like for themselves and consider everything else a personal opinion. Sheep are influenced by the "influencers" (this is why I hate this word) and use the commonly accepted (that is: the loudest) public opinion as gospel that no one else must deviate from. Needless to say, I have no interest in listening to a single word that such people (sheep) have to say.
Posted on Reply
#111
Macro Device
AusWolf: I just can't understand why.
I personally use upscalers for the circus method.

My screen is 1080p, right? Just plain 1920x1080. Recent games are made with 4K in mind and textures are optimised for that resolution. I enable virtual super resolution (usually at 3072x1728 or 3200x1800, because I can't tell them apart from 4K and 4K is too taxing to run anyway), then apply some upscaling (usually XeSS at 59%, aka "Quality") and get a vastly superior static image with forgivable dynamic artifacts compared to just staying at native 1080p. Yes, I can see the advantages of going 3K on a 1080p display.

This is a much more powerful tool than it appears at first glance.

They also help a lot with games where 120+ FPS is REALLY what the doctor ordered but there's no way to achieve it otherwise.
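For a sense of the render cost behind that setup, here is a minimal sketch. It assumes, as stated above, that XeSS "Quality" renders at roughly 59% of the output resolution per axis; the exact rounding the upscaler uses may differ.

```python
# Rough internal render resolution for the "circus method" described above:
# VSR target resolution, upscaled from ~59% per axis (XeSS "Quality").

def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Approximate internal render resolution for a per-axis upscaler scale factor."""
    return round(out_w * scale), round(out_h * scale)

native_w, native_h = 1920, 1080
for vsr_w, vsr_h in [(3072, 1728), (3200, 1800)]:
    w, h = internal_res(vsr_w, vsr_h, 0.59)
    cost = (w * h) / (native_w * native_h)
    print(f"VSR {vsr_w}x{vsr_h} + 59% scale -> ~{w}x{h} rendered "
          f"({cost:.2f}x the pixels of native 1080p)")
```

So the GPU renders roughly the same number of pixels as plain 1080p, while the image is reconstructed at the higher VSR target and then downsampled to the display.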
Posted on Reply
#112
AusWolf
Macro Device: I personally use upscalers for the circus method.

My screen is 1080p, right? Just plain 1920x1080. Recent games are made with 4K in mind and textures are optimised for that resolution. I enable virtual super resolution (usually at 3072x1728 or 3200x1800, because I can't tell them apart from 4K and 4K is too taxing to run anyway), then apply some upscaling (usually XeSS at 59%, aka "Quality") and get a vastly superior static image with forgivable dynamic artifacts compared to just staying at native 1080p. Yes, I can see the advantages of going 3K on a 1080p display.

This is a much more powerful tool than it appears at first glance.

They also help a lot with games where 120+ FPS is REALLY what the doctor ordered but there's no way to achieve it otherwise.
Together with VSR, I guess I can see some point in it.
Posted on Reply
#113
Dawora
Mr_Engineer: Since it will most likely be using a 4 nm or 3 nm process and running at such high clocks at this power draw (330 W), my guess is it would perform about the same as an RTX 4080.
Low core count.
9070 bandwidth is 640 GB/s.
TBP of 330 W is maybe for custom models, and it's not TDP.

It's an xx70-series card, not xx80.
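That 640 GB/s figure would line up with a 256-bit memory bus and 20 Gbps GDDR6, which are rumored values rather than anything confirmed in this thread; a quick check under that assumption:

```python
# Quick check of where a 640 GB/s bandwidth figure could come from.
# Assumptions (not confirmed): a 256-bit memory bus and 20 Gbps GDDR6.

def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times effective data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

print(memory_bandwidth_gbs(256, 20.0))  # 640.0 GB/s
```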
Posted on Reply
#114
AusWolf
Dawora: Low core count.
9070 bandwidth is 640 GB/s.
TBP of 330 W is maybe for custom models, and it's not TDP.

It's an xx70-series card, not xx80.
70 and 80 are just arbitrary names, randomly made up. Totally not comparable across different vendors or even generations of the same vendor.

Other than that, I agree. 4096 cores at ~3 GHz should perform similarly to 5120 cores at ~2.4 GHz, putting the 9070 on par with the 7900 GRE, unless there is some huge magic IPC gain lurking around somewhere.
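For what it's worth, here is that comparison worked out with the numbers from the post; it is only stream processors times clock, ignoring IPC, memory, and architectural differences.

```python
# Shader-throughput comparison from the post above: stream processors x clock only.

def relative_throughput(stream_processors: int, clock_ghz: float) -> float:
    return stream_processors * clock_ghz

rx_9070_xt = relative_throughput(4096, 3.0)   # rumored Navi 48 configuration
rx_7900_gre = relative_throughput(5120, 2.4)  # 7900 GRE figures used in the post

print(rx_9070_xt, rx_7900_gre)   # 12288.0 12288.0
print(rx_9070_xt / rx_7900_gre)  # 1.0 -> parity before any IPC gains
```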
Posted on Reply