Thursday, December 26th 2024

AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

AMD's upcoming Radeon RX 9070 XT graphics card can boost its engine clock up to 3.10 GHz, according to a new leak that surfaced on ChipHell. Depending on the board design, its total board power can reach up to 330 W, the leak adds. The GPU should come with a very high base frequency for the engine clock, with the leaker claiming a 2.80 GHz base frequency (which can be interpreted as the Game clock), and the GPU boosting itself up to 3.10 GHz when power and thermals permit. The RX 9070 XT will be the fastest graphics card from AMD based on its next-generation RDNA 4 graphics architecture. The company isn't targeting the enthusiast segment with this card, but rather the performance segment, where it is expected to go up against NVIDIA's GeForce RTX 5070 series.

RDNA 4 is expected to introduce massive generational gains in ray tracing performance, as AMD is rumored to have significantly reworked its ray tracing hardware to reduce the performance cost of ray tracing. However, as it stands, the "Navi 48" silicon that the RX 9070 XT is based on is still a performance-segment chip, succeeding "Navi 32" and "Navi 22," with a rumored compute unit count of 64, or 4,096 stream processors. Performance-related rumors swing wildly. One set of rumors says the card's raster graphics performance is in the league of the RX 7900 GRE, but with ray tracing performance exceeding that of the RX 7900 XTX; another set says it beats the RX 7900 XT in raster performance and sneaks up on the RTX 4080. We'll know for sure in about a month's time.
Sources: ChipHell Forums, HXL (Twitter), VideoCardz

167 Comments on AMD Radeon RX 9070 XT Boosts up to 3.10 GHz, Board Power Can Reach up to 330W

#151
john_
DaemonForce: Not everything fits in your stupid tech bubble.
Can you clarify whether you're talking specifically about MY stupid tech bubble, and if so, explain what MY stupid tech bubble is?
Posted on Reply
#152
kapone32
john_: $1000-$2000 for a high-end graphics card is not tiny, except if we are talking about the RX 6400 and GTX 1630. Now if you own a Ferrari, parked in a villa, eating caviar every day, yep, the cost of a GPU is tiny indeed.
In that case you obviously don't have a reason to be crying about pricing, but considering that most people are not rich, the least someone can do is object to prices going up and the stagnation in performance that we see in the sub-$800 market.
I have the same approach in my classroom. When a student of mine, having scored 20 out of 20, rushes to say "the exam was easy," I turn to him and say "congrats, but don't speak for the whole class."
This is why the 3060 laptop was the best thing to buy during Covid, and as such had an impact you could see on sales charts. Today the Steam Deck can be as cheap as $343 Canadian, in a world where the cheapest dGPUs are $270 for AMD and $400 for Nvidia. There is another generation of APUs coming as well. We will see what happens in a few days.
Posted on Reply
#153
Mr_Engineer
kapone32: Did not Nvidia buy the entire lot of the next node?
That's probably Apple.
Posted on Reply
#154
ToxicTaZ
So to sum things up, the RX 9070 XT at 330 W uses more power and is slower than the two-year-old RTX 4080 16GB at 320 W.

Glad I bought the RTX 4080 Super 16GB (320 W).

MSI RTX 4080 Super 16G SUPRIM X

So AMD is going to advertise the 9070 XT (330 W) as having 7900 XTX (355 W) performance while using less power at a lower price point? But you're actually getting performance that trades blows with the 7900 XT at best.

Long live the king, the high-end RX 7900 XTX.

Is that what I'm hearing?

Not to worry, because Nvidia will be doing the same thing: the RTX 5070 Ti 16GB (300 W) will offer RTX 4080 16GB (320 W) performance with less power at a lower price.

What happened to the days of a 50% performance increase? Now it's 25% at best? Lack of competition? Guess we'll wait and see what happens.

Cheers
Posted on Reply
#155
DaemonForce
john_: Can you clarify whether you're talking specifically about MY stupid tech bubble, and if so, explain what MY stupid tech bubble is?
Why do you need an invitation?
The hype around "bleeding edge" tech because +5% performance over last gen's hero or whatever is competitive.
Market shift for the next big hype thing because community interest = $$$$$.
Casual gaming communities that keep getting co-opted by bad actors (read: NOT customers) over shit politics.
The same actors that remove historic treasure games to serve you slop or completely wipe out hardware for several months to mine buttcoins.
Any of that sound familiar yet?

How about the vendors that manipulate the market with low product and artificial prices while price discovery is on its head?
Or the response to it by letting others break rank to create fake inflation and a race to the bottom? Not awake yet? I can't help you.

This isn't about halo sales or overclocking this and that to hit 200+ gigashits per second.
The RX 580 was an example to describe the mood; it was my point of entry, moving on from 2013 into 2017 and bearing witness to it first hand.
I didn't see any $1000 stickers on these cards until later, but at $500 USD it may as well have been, because it was either sold out or at scalper prices.
It's still at scalper prices. MSRP was like $260. Now it's worse than what I had and has become the Chinese chopping block special™, on top of being old and troublesome scrap.
Beware and be aware. Even with Player 3 (Intel) jumping in, we have not moved on from this problem.
We will encounter it very suddenly just when things are starting to look normal.
It's always the same few noisy tourists (that in no shape or form belong in PC gaming) and vendors trying to draw them in with miner/AI garbage.
Start gatekeeping them. HARD. I never see these in brick and mortar shops so the best we can do is handle e-tailers.
Hold their feet to the fire. Or we're going to be spinning our wheels for another year.
ToxicTaZ: So to sum things up, the RX 9070 XT at 330 W uses more power and is slower than the two-year-old RTX 4080 16GB at 320 W.
Waiting for the details to clear up but yeah, doesn't sound good.
ToxicTaZ: So AMD is going to advertise the 9070 XT (330 W) as having 7900 XTX (355 W) performance while using less power at a lower price point? But you're actually getting performance that trades blows with the 7900 XT at best.
I'm okay with 7900 XT performance if it comes at a better price point and in a better package. Otherwise I will try yet again to pick up a 7900 XT that works.
If there are features on the 9070 XT that obsolete the 7900 XT, then I'll go with it, just because I need the insurance of improved features.
Being a creator, let alone a VR creator, on dated AMD hardware is a very troublesome position.
ToxicTaZ: What happened to the days of a 50% performance increase? Now it's 25% at best? Lack of competition? Guess we'll wait and see what happens.
These companies no longer compete with each other, which is the entire problem with the incoming product (so far). 7 Days.
Posted on Reply
#156
john_
DaemonForce: Not awake yet? I can't help you.
Please, avoid helping me in the future.
Posted on Reply
#157
AusWolf
ToxicTaZ: So to sum things up, the RX 9070 XT at 330 W uses more power and is slower than the two-year-old RTX 4080 16GB at 320 W.

Glad I bought the RTX 4080 Super 16GB (320 W).

MSI RTX 4080 Super 16G SUPRIM X

So AMD is going to advertise the 9070 XT (330 W) as having 7900 XTX (355 W) performance while using less power at a lower price point? But you're actually getting performance that trades blows with the 7900 XT at best.

Long live the king, the high-end RX 7900 XTX.

Is that what I'm hearing?

Not to worry, because Nvidia will be doing the same thing: the RTX 5070 Ti 16GB (300 W) will offer RTX 4080 16GB (320 W) performance with less power at a lower price.

What happened to the days of a 50% performance increase? Now it's 25% at best? Lack of competition? Guess we'll wait and see what happens.

Cheers
The article says "board power can reach up to 330 W".

It means that you can reach 330 W with something like a Red Devil if you overclock it through the roof. It doesn't mean that the bog standard 9070 XT will be a 330 W card.
Posted on Reply
#158
ToxicTaZ
AusWolf: The article says "board power can reach up to 330 W".

It means that you can reach 330 W with something like a Red Devil if you overclock it through the roof. It doesn't mean that the bog standard 9070 will be a 330 W card.
The AMD RX 9070 and RX 9070 XT are different cards that operate at different wattages.

Cheers
Posted on Reply
#159
AusWolf
ToxicTaZ: The AMD RX 9070 and RX 9070 XT are different cards that operate at different wattages.

Cheers
I edited my comment for you to be more accurate. Happy?
Posted on Reply
#161
AusWolf
Super XP: Who's excited ;):D:eek::lovetpu:
I am, but why is there a 7800 XT in the picture in front of an RDNA 4 background? :wtf:
Posted on Reply
#162
Sir Beregond
AusWolf: I am, but why is there a 7800 XT in the picture in front of an RDNA 4 background? :wtf:
Anything to drum up clicks.
Posted on Reply
#163
RJARRRPCGP
ToxicTaZ: So AMD is going to advertise the 9070 XT (330 W) as having 7900 XTX (355 W) performance while using less power at a lower price point? But you're actually getting performance that trades blows with the 7900 XT at best.
That sounds like ATi Radeon 9000 Pro vs. Radeon 8500, LOL.
Posted on Reply
#164
Dawora
ToxicTaZ: So to sum things up, the RX 9070 XT at 330 W uses more power and is slower than the two-year-old RTX 4080 16GB at 320 W.

Glad I bought the RTX 4080 Super 16GB (320 W).

MSI RTX 4080 Super 16G SUPRIM X

So AMD is going to advertise the 9070 XT (330 W) as having 7900 XTX (355 W) performance while using less power at a lower price point? But you're actually getting performance that trades blows with the 7900 XT at best.

Long live the king, the high-end RX 7900 XTX.

Is that what I'm hearing?

Not to worry, because Nvidia will be doing the same thing: the RTX 5070 Ti 16GB (300 W) will offer RTX 4080 16GB (320 W) performance with less power at a lower price.

What happened to the days of a 50% performance increase? Now it's 25% at best? Lack of competition? Guess we'll wait and see what happens.

Cheers
The 5070 Ti will be faster than the 4080, and it will be closer to the 4090.

The 5060 laptop GPU looks fine:
faster than the desktop 4060 Ti and the laptop 4070.

So the desktop 5060 is somewhere around the 4070, or very close.

Also, the 5060 Ti is somewhere around 3090 and 3090 Ti performance.

Posted on Reply
#165
TumbleGeorge
Dawora: The 5070 Ti will be faster than the 4080, and it will be closer to the 4090.

The 5060 laptop GPU looks fine:
faster than the desktop 4060 Ti and the laptop 4070.

So the desktop 5060 is somewhere around the 4070, or very close.

Also, the 5060 Ti is somewhere around 3090 and 3090 Ti performance.

Apologies, what is this:
Posted on Reply
#166
ToxicTaZ
Dawora: The 5070 Ti will be faster than the 4080, and it will be closer to the 4090.

The 5060 laptop GPU looks fine:
faster than the desktop 4060 Ti and the laptop 4070.

So the desktop 5060 is somewhere around the 4070, or very close.

Also, the 5060 Ti is somewhere around 3090 and 3090 Ti performance.

First off, the desktop RTX 4080 has double the performance of the RTX 4060 Ti. Laptop GPUs are always badly cut down from desktop ones.

I wouldn't base anything on that for two reasons: one, it says rumored, and two, it says laptop. Everybody knows laptop GPUs are way slower than desktop ones; there's no way that's the performance of a desktop 4080.

Both next-gen mid-range lineups from AMD and Nvidia are aiming for regular two-year-old desktop RTX 4080 performance.

A 25% performance increase, respectively... trading blows with...

RTX 5070 Ti = RTX 4080
RTX 5080 = RTX 4090
RTX 5090 = RTX 4090 +25%

That's the reality coming, unfortunately: no competition, considering AMD's best flagship, the RX 9070 XT (330 W), is slower than the two-year-old RTX 4080 (320 W).

With TDP and prices going up and only a 25% performance increase, is it worth your money?

Cheers

I based the RTX 5000 series' 25% performance increase on my MSI RTX 4080 Super 16G SUPRIM X (320 W). I'm curious where my card stands against next-gen.

Better wait a year for Nvidia Rubin!

RTX 6000 series Rubin Vera on 3nm
DLSS 5
Double VRAM - GDDR7X
PCIe 6.0 Ready
DisplayPort 2.2
HDMI 2.2

Nvidia will market the RTX 6000 series (Rubin) for two years, so expect to see an RTX 6000 Super series.

Patience is a virtue for those who wait for Rubin SUPER.

Cheers
Posted on Reply
#167
AusWolf
ToxicTaZ: RTX 5080 = RTX 4090
RTX 5090 = RTX 4090 +25%
Nope. The 5090 has double everything compared to the 5080 (shaders, memory bus, memory quantity, etc.), so it'll be the 5080 +100%.
Posted on Reply