
AMD Radeon RX 9070 XT Leaked Listing Reveals Pricing Details

1x 8-pin = 150 W
3x 8-pin = 450 W
PCIe slot = 75 W
Total = 525 W
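
As a quick sanity check, here is a minimal sketch of that budget, assuming the usual PCIe figures of 75 W from the slot and 150 W per 8-pin connector (the same numbers as above):

```python
# Theoretical board-power budget from the PCIe slot plus 8-pin connectors.
PCIE_SLOT_W = 75    # power deliverable through the x16 slot
EIGHT_PIN_W = 150   # power per 8-pin PEG connector

def board_power_budget(num_8pin: int) -> int:
    """Maximum spec power for a card with the given number of 8-pin connectors."""
    return PCIE_SLOT_W + num_8pin * EIGHT_PIN_W

for pins in (1, 2, 3):
    print(f"slot + {pins}x 8-pin = {board_power_budget(pins)} W")
# slot + 1x 8-pin = 225 W
# slot + 2x 8-pin = 375 W
# slot + 3x 8-pin = 525 W
```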


AMD mentions the reference model at 330 W, so please tell me, in a logical way, why and what kind of OC they would achieve with an extra ~200 W. That 3x 8-pin configuration doesn't make any sense.

Maybe it helps with keeping the PCIe slot at a more stable voltage. On top of that, PowerColor is known to push the limits even if the gain is terrible.
 
I have to sound like a broken record, as I've already tried to explain in 2-3 other threads why the 3x 8-pin makes sense for some AIB variants of the 9070 XT.

If we accept that the reference 9070 XT is 330 W, then the top OC AIB variants, likely with dual VBIOS, could be configured from ~280 W up to 420 W.

For example, the reference 7900 XTX is 355 W.
My 7900 XTX, between the two onboard VBIOSes and Adrenalin's -10%/+15% power limit slider, can be configured from 316 W up to 467 W.
That's a ~150 W range.

PCIe + 2x 8-pin = 375 W
PCIe + 3x 8-pin = 525 W

So tell us now why any GPU vendor would make the stupid move of equipping a 9070 XT with only 2x 8-pin if they want to give the GPU a 400-420 W limit on the performance VBIOS for those who want it.
Having a 525 W limit on the connectors doesn't mean you will use it all.
It's stupid to push the 375 W configuration to its limits.
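
To make the arithmetic behind that explicit, here is a small sketch. The two VBIOS base limits are back-calculated from the 316 W / 467 W figures above and the -10%/+15% Adrenalin slider, so treat them as illustrative assumptions rather than confirmed values:

```python
# Adrenalin power-limit slider range used in the example above: -10% .. +15%.
SLIDER_MIN, SLIDER_MAX = 0.90, 1.15

# Connector budgets from the PCIe spec figures (75 W slot, 150 W per 8-pin).
BUDGET_2X8PIN = 75 + 2 * 150   # 375 W
BUDGET_3X8PIN = 75 + 3 * 150   # 525 W

def slider_range(vbios_limit_w: float) -> tuple[float, float]:
    """Configurable board-power window for a given VBIOS power limit."""
    return vbios_limit_w * SLIDER_MIN, vbios_limit_w * SLIDER_MAX

# Illustrative dual-VBIOS 7900 XTX limits (~351 W quiet, ~406 W performance),
# back-calculated so that -10% of the quiet VBIOS gives ~316 W and +15% of the
# performance VBIOS gives ~467 W.
for label, limit_w in (("quiet VBIOS", 351), ("performance VBIOS", 406)):
    low, high = slider_range(limit_w)
    print(f"{label}: {low:.0f}-{high:.0f} W, fits 2x 8-pin spec: {high <= BUDGET_2X8PIN}")
# quiet VBIOS: 316-404 W, fits 2x 8-pin spec: False
# performance VBIOS: 365-467 W, fits 2x 8-pin spec: False
```

Under those assumptions, even the quieter limit at +15% already exceeds the 375 W that PCIe + 2x 8-pin allows, which is the whole point of the third connector.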

It's called common sense!
Common sense = pulling half a kW through a 4096 shader GPU to you?

Mkay

I think at 330 W it's already pretty generous with power for this level of chip... It'll be interesting to see if there even is notable OC headroom (that is actually stable).
 
Common sense = pulling half a kW through a 4096 shader GPU to you?

Mkay

I think at 330 W it's already pretty generous with power for this level of chip... It'll be interesting to see if there even is notable OC headroom (that is actually stable).
I don't know if it was here or in another thread, but someone posted a screenshot of a 9070 XT running 330 W with an 80 °C hotspot. If it can do that with a dual-slot cooler, I'll be more than happy.
 
And that is probably on a bench with no airflow, so it might run cooler in a case.
 
And that is probably on a bench with no airflow, so it might run cooler in a case.
Probably not in my case. I'm on micro-ATX with a solid shroud over the PSU compartment (I don't know why that is).
 
Probably not in my case. I'm on micro-ATX with a solid shroud over the PSU compartment (I don't know why that is).

:banghead: :kookoo:
 
Common sense = pulling half a kW through a 4096 shader GPU to you?

Mkay

I think at 330 W it's already pretty generous with power for this level of chip... It'll be interesting to see if there even is notable OC headroom (that is actually stable).
I didn't say 500 W, but maybe 400-430 W. That's very different.
And you can't even do 400 W with 2x 8-pin.
XTXs didn't go past 460-470 W, with only one (stupid) exception of a 550 W liquid-cooled VBIOS.

And why are you ignoring the dedicated FSR/AI/RT hardware that RDNA4 is supposed to have?
It's not just about shaders anymore... Those other units don't run on thin air; they need power.

I've read a lot of opinions about the 3x 8-pin connectors.
One that does not make sense at all is that they reused older PCBs that already had 3 connectors.
If they did that, and this GPU doesn't "need" 3 even when OCed, they could have left the third unpopulated. Simple...

We don't even know the true capabilities of this entirely new architecture, and we're already playing engineer over what it can do at stock, under OC, or what it needs.
It's a little absurd to me.

I am not saying that performance will increase proportionally with a high OC, that never happens, but on the other hand it's impossible to know right now at what point the perf/power curve declines significantly.

Let's first see legit reviews covering all aspects of gaming... raster, RT, and upscaling, and then we can criticize all we want.
Whether we like it or not, those features are here, and NVIDIA is raining all over the place with them.
When you have a small market share and are trying to gain some more, you don't have any choice but to follow.
Even when you're late.
 
I would advise AMD to price the cards to break even with R&D and their manufacturing costs at TSMC. Anything higher, and they risk becoming the next Matrox, 3dfx, S3, or Cirrus Logic.

For anyone wondering who the fu..k these companies are, exactly.

Matrox still make cards, I believe, and they still update drivers. They just dropped out of the gaming market in the early 2000s. Happy to be put right on that if I am mistaken.

If you want to really hit the classics, try Number Nine graphics cards; now there was some promise!
 
Matrox still make cards, I believe, and they still update drivers. They just dropped out of the gaming market in the early 2000s. Happy to be put right on that if I am mistaken.
Yes, they're right here.

Fun fact: I could much more easily get a Matrox card in the UK right now than an Intel one (if you don't consider the price). :D
 
What is certain is that AMD will try to focus on QHD instead of 4K (I gave my performance expectation at 4K based on 2750 MHz; it seems the reference boost will probably be at least 2850 MHz, based on the leaked boost clock of the Gigabyte Gaming OC, 2970 MHz, if the leaks are true).
The RX 9070 XT will have 4 Shader Engines while AD103 has 7 GPCs; as you go down in resolution, that favors AMD due to latency. This means that at QHD the reference RX 9070 XT will probably be a little bit faster than the RX 7900 XT / 4070 Ti Super based on these clocks.
 
Yes, they're right here.

Fun fact: I could much more easily get a Matrox card in the UK right now than an Intel one (if you don't consider the price). :D
The hell, 488 pounds for a 512 MB video card with DVI display ports? That's like 600 dollars. These are "government contract" video cards :D
 
The 9070 XT will be priced like a 7900 GRE, but with the performance of a 7900 XT or XTX in some cases.
It's fine if this means 7900 GRE performance is the raster floor for the 9070 XT. Otherwise I may try my luck again with the 7900 XT.
I'm so generationally behind that almost anything would be good, but I can only be romanced by improvements to latency, the encoder, and a 3 GHz boost.
There are so many weird parts to my checklist that make AMD the only real competition. They should not be sitting on improvements.
 
The hell, 488 pounds for a 512 MB video card with DVI display ports? That's like 600 dollars. These are "government contract" video cards :D
I've heard it's got some professional gobbledegook that makes it suitable for industrial use, like for massive LED walls and such; that's why it's so expensive. As you can tell, I didn't really dig into the subject. :oops:
 
:)

Joking aside, I will admit that it is looking like a good GPU, but yes, I'm really curious as to why some models are including 3 power connectors.

Hopefully, we will finally get some real answers soon, instead of all these rumors and leaks.

Overclockers
 
$550 certainly won't be the retail price in the EU. For the equivalent of $550 I would buy it (although I am not sure which card from the 7000 series it is supposed to be comparable with), but I highly doubt it will go for that price. That's what the 7800 XT sells for over here these days.
 
NVIDIA GeForce RTX 5070 MSRP: $549

(leaked) AMD RX 9070 MSRP: $549

TPU writers: "eAsilY UnDErcuT"

I had to check a few times to make sure I hadn't gone dyslexic and read $459 as $549.
 

Have to wonder if part of the reason for the whole CES palaver was not wanting to undercut their board partners...
 
$550 certainly won't be the retail price in the EU. For the equivalent of $550 I would buy it (although I am not sure which card from the 7000 series it is supposed to be comparable with), but I highly doubt it will go for that price. That's what the 7800 XT sells for over here these days.
$550 MSRP usually translates to £550 retail here in the UK. We'll see how long that holds considering events around the world.
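
As a rough illustration of why that roughly 1:1 mapping tends to work (the exchange rate here is my assumption, not a figure from the thread; UK prices include 20% VAT while US MSRPs exclude sales tax):

```python
# US MSRPs exclude sales tax; UK shelf prices include 20% VAT.
USD_PER_GBP = 1.27   # assumed exchange rate
UK_VAT = 0.20

def uk_retail_estimate(us_msrp_usd: float) -> float:
    """Very rough UK shelf-price estimate from a US MSRP."""
    return us_msrp_usd * (1 + UK_VAT) / USD_PER_GBP

print(f"£{uk_retail_estimate(550):.0f}")  # ~£520, so roughly £550 once retailer margin is added
```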
 
$550 MSRP usually translates to £550 retail here in the UK. We'll see how long that holds considering events around the world.
That's OK if it's real, I guess. For a new card. And if it's a little more power efficient than its 7000-series equivalent, I guess I'm interested.
 