Saturday, January 11th 2025

AMD Radeon RX 9070 XT Leaked Listing Reveals Pricing Details

If the recent RDNA 4 performance leaks are anything to go by, the AMD Radeon RX 9070 and RX 9070 XT GPUs are shaping up to be excellent mid-range contenders. That is, of course, if the pricing is sane enough. A subsequent leak revealed that RX 9070 XT AIB models will command a price tag of roughly $549, which would allow it to easily undercut the NVIDIA GeForce RTX 5070.

Now, a further leak has revealed a product listing of an RX 9070 XT by a retailer based in the Philippines. The variant in question is Gigabyte's Gaming OC model, with base and boost clocks of 2,400 and 2,970 MHz respectively. The card carries 16 GB of GDDR6 memory on a 256-bit memory bus, along with 4,096 shading units and 64 RT cores - nothing out of the ordinary.
Now let's get to the juicy bit - pricing. The RX 9070 XT Gaming OC is priced at 35,000 Pesos including 12% taxes, which translates to roughly $521 before taxes. Whether or not this is a good price will boil down to how well RDNA 4 performs against NVIDIA's offerings, and how NVIDIA's partners price their variants. Right now, the RX 9070 XT does seem to be priced well enough, but that is of course only if the listing is accurate. On that note, it is worth pointing out that there are a few typos in the listing, which does not exactly inspire confidence. That said, with the card being pre-release, the mistakes may just be unintentional - or not.
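For the curious, the currency math checks out. Below is a quick sketch of the conversion; the ~60 PHP per USD exchange rate is an assumption based on rates around that time, not a figure from the listing itself:

# Back out the pre-tax USD price from the Philippine listing.
price_php_incl_tax = 35_000        # listed price, includes 12% VAT
vat_rate = 0.12
php_per_usd = 60.0                 # assumed exchange rate, early 2025

price_php_pre_tax = price_php_incl_tax / (1 + vat_rate)   # 31,250 PHP
price_usd_pre_tax = price_php_pre_tax / php_per_usd       # ~520.83 USD
print(f"Pre-tax: {price_php_pre_tax:,.0f} PHP = ${price_usd_pre_tax:,.2f}")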
Source: VideoCardz

89 Comments on AMD Radeon RX 9070 XT Leaked Listing Reveals Pricing Details

#26
rv8000
evernessinceThe power consumption numbers don't make a lot of sense to me. The 7900 XT (a bit lower than this card's rumored performance) consumes around 320 W, and that's a chiplet-based GPU. Even if AMD made zero improvements to perf per watt architecturally, going monolithic would improve efficiency. We aren't getting the full picture of what's going on.
The leak was under a Furmark load, so it's highly unrealistic at stock. I doubt it's going to be significantly better, but I'd expect 290-300 W on the basic AIB cards, hopefully less.
Posted on Reply
#27
AusWolf
ZazigalkaThis is the leaked card. It says it's recognized as a 7800 XT, but it's Navi 4x, so it's the 9070 XT - it has to be, because the benchmark score was similar to the 7900 XTX.
TBP 330 W
On a slightly unrelated note, 80 °C hotspot at 330 W is pretty respectable - I wonder which model this was.
Posted on Reply
#28
Luminescent
I would advise AMD to price cards to break even with R&D and their manufacturing costs at TSMC. Anything higher, and they risk becoming the next Matrox, 3dfx, S3, or Cirrus Logic.

For anyone wondering who the fu..k these companies are, exactly.
Posted on Reply
#29
AusWolf
LuminescentI would advise AMD to price cards to break even with R&D and their manufacturing costs at TSMC. Anything higher, and they risk becoming the next Matrox, 3dfx, S3, or Cirrus Logic.

For anyone wondering who the fu..k these companies are, exactly.
R&D has already been done and paid for. How do you set a price to break even with that if you don't know how many cards you'll sell in advance?
Posted on Reply
#30
Darmok N Jalad
Price isn't really all that hard a concept. Price it for what you think it's worth. If people don't buy in enough volume, just lower the price. I don't run a company, but it's pretty simple logic. We all want to maximize the reward of our efforts, not just "break even."
Posted on Reply
#31
wolf
Better Than Native
ZazigalkaNVIDIA's planned obsolescence of the 3080 10G may very well backfire on them if the 9070 XT turns out to really deliver the performance I saw in the leaks. Good. It wouldn't if the 3080 12G had been the actual launch version.
There was a recent comparison between the 10 GB and 12 GB models showing no practical difference; by the time VRAM split the two, performance wasn't playable on the 12 GB card anyway. I'm extremely content with the performance mine has provided over 4+ years and counting, and at 4K too. It will live on in a second rig, cranking out frames and ageing like fine wine for years to come. Sometimes I truly believe people have unreasonable expectations for how cards should age.

I am desperately trying not to overhype myself, but with each passing day and rumor, it seems increasingly likely a 9070 XT will be my upgrade.
Posted on Reply
#32
Zazigalka
wolfThere was a recent comparison between the 10 GB and 12 GB models showing no practical difference; by the time VRAM split the two, performance wasn't playable on the 12 GB card anyway. I'm extremely content with the performance mine has provided over 4+ years and counting, and at 4K too. It will live on in a second rig, cranking out frames and ageing like fine wine for years to come. Sometimes I truly believe people have unreasonable expectations for how cards should age.

I am desperately trying not to overhype myself, but with each passing day and rumor, it seems increasingly likely a 9070 XT will be my upgrade.
It's starting to show now though.
Posted on Reply
#33
wolf
Better Than Native
ZazigalkaIt's starting to show now though.
That test was only about six weeks ago. I'm past the level of GPU power I want for 4K now anyway, so from where I sit, they got it just right.
Posted on Reply
#34
zo0lykas
ZazigalkaThis is the leaked card. It says it's recognized as a 7800 XT, but it's Navi 4x, so it's the 9070 XT - it has to be, because the benchmark score was similar to the 7900 XTX.
TBP 330 W
Yes, I've seen these leaks and info, and I still don't understand why they implemented three 8-pins... if two are more than enough.
Posted on Reply
#35
evernessince
zo0lykasYes, I've seen these leaks and info, and I still don't understand why they implemented three 8-pins... if two are more than enough.
If I had to guess, it could be to force people to use separate cables. Power supplies with daisy-chained PCIe 8-pin connectors are usually limited to two per cable (as people used to plug all three of a GPU's inputs from a single cable). By having three receptacles on the card, you ensure that the product is safer to operate for a larger number of people.

Designing power delivery is all about anticipating worst-case scenarios and eliminating them - that's good design, not riding up to the line of safety and hoping nothing goes wrong.
Posted on Reply
#36
Beermotor
ZazigalkaWell, 3x 8-pin is pretty much confirmed, except for the cheapest versions to save a penny. The card must be at least 300 W. The 7900 GRE has a TDP of 260 W, and even the beefiest Nitro+ has 2x 8-pin. The 9070 XT is definitely power hungry for mid-range.
The Sapphire Nitro+ 7900 XTX has 3x 8-pin PCIe plugs, but that monster can pull 400+ watts.
Posted on Reply
#37
Zazigalka
zo0lykasYes, I've seen these leaks and info, and I still don't understand why they implemented three 8-pins...
For 300+ W of power.
BeermotorThe Sapphire Nitro+ 7900XTX has 3x8-pin PCIe plugs but that monster can pull 400+ watts.
That's the XTX, with a 355 W TDP. The GRE has 260 W.
Posted on Reply
#38
lexluthermiester
KritSuch better cable management.
But not better safety. Cable management must ALWAYS take a back seat to safety.
Posted on Reply
#39
Zazigalka
More recent leaks suggest a price closer to $600, not $500. To me this is good news - it looks like the final price is not set in stone yet, and I suspect they may have something that beats the 4070 Ti/4070 Ti Super convincingly. If it's 4080 Super performance for $600-650, it's team red for me this gen.
Posted on Reply
#40
Luminescent
AusWolfR&D has already been done and paid for. How do you set a price to break even with that if you don't know how many cards you'll sell in advance?
R&D is constant and probably has a budget, like marketing; you constantly pay for it so you don't die.
They know how much it costs them to make a card, just like Sony knows how much a PlayStation costs to produce. If they get ideas and want to make a real profit while they have almost no market share, then they die in the desktop PC space and remain a cheap solution for Microsoft, Sony, and whatever mobile companies buy some IP from them.
AMD is so far behind, and I'm not talking about games: in the professional space, every plugin or piece of software I use works much better on CUDA.
Posted on Reply
#41
Zach_01
zo0lykas1x 8-pin = 150 W
3x 8-pin = 450 W
PCIe slot = 75 W
Total = 525 W

AMD mentions 330 W for the reference model, so please tell me in a logical way why, and what kind of OC they will achieve with the extra 200 W; that 3x 8-pin connection doesn't make any sense.
I have to sound like a broken record, as I've tried to explain in 2-3 other threads why the 3x 8-pin makes sense for some AIB variants of the 9070 XT; a sketch of the numbers follows below.

If we accept that the reference 9070 XT is 330 W, then the top OC AIB variants, likely with dual VBIOS, could be configured from ~280 W up to 420 W.

For example, the reference 7900 XTX is 355 W.
My 7900 XTX, between the two VBIOSes on board and Adrenalin's -10%/+15% power limit, can be configured from 316 W up to 467 W.
A ~150 W range.

PCIe + 2x 8-pin = 375 W
PCIe + 3x 8-pin = 525 W

So tell us now why any GPU vendor would make the stupid move of equipping a 9070 XT with only 2x 8-pin if they want to give the GPU a 400-420 W limit for those who want it with the performance VBIOS.
Having a 525 W limit on the connectors doesn't mean you will use it all.
It's stupid to push the 375 W configuration to its limits.

Common sense is called for!
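To put the thread's connector math in one place, here is a small sketch of the budgets being thrown around. The per-connector wattages are the standard PCIe figures quoted above; the 330 W reference limit and the -10%/+15% slider range are the rumored and Adrenalin values from this discussion, not confirmed specs:

# PCIe power budget arithmetic from the posts above.
PCIE_SLOT_W = 75       # PCIe x16 slot limit
EIGHT_PIN_W = 150      # per 8-pin PCIe connector

def connector_budget(num_8pin: int) -> int:
    """Maximum board power allowed by the slot plus N 8-pin connectors."""
    return PCIE_SLOT_W + num_8pin * EIGHT_PIN_W

print(connector_budget(2))            # 375 W
print(connector_budget(3))            # 525 W

# Rumored 330 W reference limit with Adrenalin's -10%/+15% slider:
reference_w = 330
print(f"{reference_w * 0.90:.1f} W")  # 297.0 W lower bound
print(f"{reference_w * 1.15:.1f} W")  # 379.5 W upper bound

Note that 330 W plus the +15% slider already lands above the 375 W 2x 8-pin budget, which is the crux of the argument for a third connector.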
Posted on Reply
#42
Dristun
Haha, I wonder if I could sell my 4080, get almost the same performance for less, and spend the money gained elsewhere. Hmm! It's not like any pretty path-traced games are coming until Max Payne 1+2 gets released.
Posted on Reply
#43
AusWolf
Darmok N JaladPrice isn't really all that hard a concept. Price it for what you think it's worth. If people don't buy in enough volume, just lower the price. I don't run a company, but it's pretty simple logic. We all want to maximize the reward of our efforts, not just "break even."
Exactly. Many people here on the forum forget that companies work for profit, not for market share.
LuminescentR&D is constant and probably has a budget, like marketing; you constantly pay for it so you don't die.
They know how much it costs them to make a card, just like Sony knows how much a PlayStation costs to produce. If they get ideas and want to make a real profit while they have almost no market share, then they die in the desktop PC space and remain a cheap solution for Microsoft, Sony, and whatever mobile companies buy some IP from them.
AMD is so far behind, and I'm not talking about games: in the professional space, every plugin or piece of software I use works much better on CUDA.
My question still stands: to know how much money you need to break even, you need to know how many cards you'll sell. How?
Zach_01I have to sound like a broken record, as I've tried to explain in 2-3 other threads why the 3x 8-pin makes sense for some AIB variants of the 9070 XT.

If we accept that the reference 9070 XT is 330 W, then the top OC AIB variants, likely with dual VBIOS, could be configured from ~280 W up to 420 W.

For example, the reference 7900 XTX is 355 W.
My 7900 XTX, between the two VBIOSes on board and Adrenalin's -10%/+15% power limit, can be configured from 316 W up to 467 W.
A ~150 W range.

PCIe + 2x 8-pin = 375 W
PCIe + 3x 8-pin = 525 W

So tell us now why any GPU vendor would make the stupid move of equipping a 9070 XT with only 2x 8-pin if they want to give the GPU a 400-420 W limit for those who want it with the performance VBIOS.
Having a 525 W limit on the connectors doesn't mean you will use it all.
It's stupid to push the 375 W configuration to its limits.

Common sense is called for!
Maybe not everyone wants to overclock? Maybe some cards come with a 100% or maybe 110% maximum power limit?
Posted on Reply
#44
Zach_01
I believe that going for very low profit makes some sense when you have almost no market share, but not giving cards away for free.
You still have to be able to keep paying for R&D.

How low you can go on price depends on the real cost of the GPU. For example, fab cost, die size, and yields (%) are very important to the final price, beyond the profit.
AusWolfExactly. Many people here on the forum forget that companies work for profit, not for market share.


My question still stands: to know how much money you need to break even, you need to know how many cards you'll sell. How?


Maybe not everyone wants to overclock? Maybe some cards come with a 100% or maybe 110% maximum power limit?
You mean:
100% = 330 W
110% = 366 W
...??

Could be, yes...
Or the reference 9070 XT is lower than 330 W.

Adrenalin's power limit, as we know, spans -10% up to +15% (usually?), and with PCIe + 2x 8-pin you can't have more than 375 W, so 326 W is the actual starting-point limit.

326 W + 15% = 374.9 W
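As a quick check of that back-of-the-envelope figure, here is the same arithmetic run backwards (a sketch only; the +15% slider ceiling is the assumption carried over from the posts above):

# Highest stock power limit that still fits the 375 W
# PCIe + 2x 8-pin budget after a +15% slider.
BUDGET_W = 375
SLIDER_MAX = 1.15

max_stock_limit = BUDGET_W / SLIDER_MAX
print(f"{max_stock_limit:.1f} W")    # 326.1 W
print(f"{326 * SLIDER_MAX:.1f} W")   # 374.9 W, just under the 375 W budget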
Posted on Reply
#45
Acuity
The 9070 XT will be priced like a 7900 GRE, but with the performance of a 7900 XT, or an XTX in some cases.
Posted on Reply
#46
AusWolf
Zach_01You mean:
100% = 330 W
110% = 366 W
...??

Could be, yes...
Or the reference 9070 XT is lower than 330 W.

Adrenalin's power limit, as we know, spans -10% up to +15% (usually?), and with PCIe + 2x 8-pin you can't have more than 375 W, so 326 W is the actual starting-point limit.

326 W + 15% = 374.9 W
Exactly. I think max and min power limits are controlled by the VBIOS.
Posted on Reply
#47
Zach_01
AusWolfExactly. I think max and min power limits are controlled by the VBIOS.
Yeah, I was thinking about that.
Some could maybe have a -10% to +10% configuration so they stay under 375 W.
Posted on Reply
#48
AusWolf
Zach_01Yeah, I was thinking about that.
Some could maybe have a -10% to +10% configuration so they stay under 375 W.
Yep - which is totally fine by me. :)
Posted on Reply
#49
friocasa
sepheronxIf the PowerColor variants are decently priced in Canada, I may replace my 3080 with one.
To gain around 30% performance?

The 3080 is still a great GPU.
Posted on Reply
#50
AsRock
TPU addict
zo0lykas1x 8-pin = 150 W
3x 8-pin = 450 W
PCIe slot = 75 W
Total = 525 W

AMD mentions 330 W for the reference model, so please tell me in a logical way why, and what kind of OC they will achieve with the extra 200 W; that 3x 8-pin connection doesn't make any sense.
Maybe it helps with keeping the PCIe slot at a more stable voltage. On top of that, PowerColor is known to push the limits even if the gain is terrible.
Posted on Reply