
AMD Radeon RX 9070 XT Leaked Listing Reveals Pricing Details

Except that if the design is not good, the heat from the PCB components is transferred to the PCIe power connectors.

RTX 4090 bad design:

[image attachment]


RX 6900 XT good design:

[image attachment]

 
The power consumption numbers don't make a lot of sense to me. The 7900 XT (a bit lower than this card's rumored performance) consumes around 320 W, and that's a chiplet-based GPU. Even if AMD made zero improvements to perf per watt architecturally, going monolithic would improve efficiency. We aren't getting the full picture of what's going on.

The leak was from a FurMark load, so it's highly unrealistic for stock use. I doubt it's going to be significantly better, but I'd expect 290-300 W on the basic AIB cards, hopefully less.
 
I would advise AMD to price cards to break even with R&D and their manufacturing costs at TSMC. Anything higher, and they risk becoming the next Matrox, 3Dfx, S3, Cirrus Logic.

For anyone wondering who the fu..k these companies are, exactly.
 
I would advise AMD to price cards to break even with R&D and their manufacturing costs at TSMC. Anything higher, and they risk becoming the next Matrox, 3Dfx, S3, Cirrus Logic.

For anyone wondering who the fu..k these companies are, exactly.
R&D has already been done and paid for. How do you set a price to break even with that if you don't know how many cards you'll sell in advance?
 
Price isn't really all that hard a concept. Price it for what you think it's worth. If people don't buy in enough volume, just lower the price. I don't run a company, but it's pretty simple logic. We all want to maximize the reward of our efforts, not just "break even."
 
Nvidia's planned obsolescence of the 3080 10G may very well backfire on them if the 9070 XT turns out to really deliver the performance I saw in the leaks. Good. It wouldn't have if the 3080 12G had been the actual launch version.
There was a recent comparison between the 10G and 12G models showing no practical difference; by the time VRAM separated the two, performance wasn't playable on the 12G card anyway. I'm extremely content with the performance mine has provided over 4+ years and counting, and at 4K too. It will live on in a second rig cranking out frames and ageing like fine wine for years to come. Sometimes I truly believe people have unreasonable expectations for how cards should age.

I am desperately trying not to overhype myself but with each passing day and rumor it seems increasingly likely a 9070XT will be my upgrade.
 
There was a recent comparison between the 10G and 12G models showing no practical difference; by the time VRAM separated the two, performance wasn't playable on the 12G card anyway. I'm extremely content with the performance mine has provided over 4+ years and counting, and at 4K too. It will live on in a second rig cranking out frames and ageing like fine wine for years to come. Sometimes I truly believe people have unreasonable expectations for how cards should age.

I am desperately trying not to overhype myself but with each passing day and rumor it seems increasingly likely a 9070XT will be my upgrade.
It's starting to show now though.
 
It's starting to show now though.
That test was like six weeks ago. I'm past the level of GPU power I want for 4K now anyway, so from where I sit, they got it just right.
 
Yes, I've seen these leaks and info, and I still don't understand why they implemented three 8-pin connectors if two are more than enough.

If I had to guess, it could be to force people to use at least two separate cables. Power supplies with daisy-chained PCIe 8-pin connectors are usually limited to two per cable (as people used to plug all three from a single cable into their GPU). By having three receptacles on the card, you ensure that the product is safer to operate for a larger number of people.

Designing power delivery is all about anticipating worst-case scenarios and eliminating them; that's good design. Not riding right up to the edge of safety and hoping nothing goes wrong.
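
As a rough back-of-the-envelope illustration of that worst-case thinking (my own placeholder numbers, not the actual board design: roughly 55 W drawn from the slot and a 420 W worst-case OC draw):

```python
# Rough illustration only: the spec allows 150 W per 8-pin connector and
# 75 W from the slot; the 55 W slot draw and 420 W worst case are assumptions.
def per_connector_load(board_power_w, n_8pin, slot_share_w=55):
    """Naively split whatever the slot doesn't supply across the 8-pin plugs."""
    return (board_power_w - slot_share_w) / n_8pin

for n in (2, 3):
    load = per_connector_load(420, n)
    print(f"{n}x 8-pin: {load:.1f} W per connector, {150 - load:.1f} W of margin vs. the 150 W rating")
```

With two connectors, that worst case already sits above the 150 W rating; with three, each plug keeps roughly 30 W of headroom.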
 
Well, 3x8-pin is pretty much confirmed, except on the cheapest versions to save a penny. The card must be at least 300 W. The 7900 GRE has a TDP of 260 W, and even the beefiest Nitro+ has 2x8-pin. The 9070 XT is definitely power hungry for mid-range.
The Sapphire Nitro+ 7900XTX has 3x8-pin PCIe plugs but that monster can pull 400+ watts.
 
More recent leaks suggest a price closer to 600 than 500. To me this is good news: it looks like the final price is not set in stone yet, and I suspect they may have something that beats the 4070 Ti / 4070 Ti Super convincingly. If it's 4080S performance for 600-650, it's team red for me this gen.
 
R&D has already been done and paid for. How do you set a price to break even with that if you don't know how many cards you'll sell in advance?
R&D is constant and probably has a budget, like marketing: you keep paying for it so you don't die.
They know how much it costs them to make a card, just like Sony knows how much a PlayStation costs to produce. If they get ideas and try to make an actual profit while they have almost no market share, then they die in PC desktop and remain a cheap solution for Microsoft, Sony, and whatever mobile companies buy some IP from them.
AMD is so far behind, and I'm not talking about games: in the professional space, every plugin or piece of software I use works much better on CUDA.
 
1x 8-pin = 150 W
3x 8-pin = 450 W
PCIe slot = 75 W
So in total = 525 W


AMD mentioned the reference model at 330 W, so please explain in a logical way why, and what kind of OC they would achieve with the extra ~200 W. That 3x8-pin connection doesn't make any sense.
I'm going to sound like a broken record, as I've already tried to explain in 2-3 other threads why 3x8-pin makes sense for some AIB variants of the 9070 XT.

If we accept that the reference 9070 XT is 330 W, then the top OC AIB variants, likely with dual VBIOS, could be configured from ~280 W up to 420 W.

For example, the reference 7900 XTX is 355 W.
My 7900 XTX, between the two onboard VBIOSes and Adrenalin's -10%/+15% power limit, can be configured from 316 W up to 467 W.
A ~150 W range.

PCIe + 2x8-pin = 375 W
PCIe + 3x8-pin = 525 W

So tell us: why would any GPU vendor make the stupid move of equipping a 9070 XT with only 2x8-pin if they want to give the GPU a 400-420 W limit on the performance VBIOS for those who want it?
Having a 525 W limit on the connectors doesn't mean you will use it all.
It's stupid to push a 375 W configuration to its limits.

Common sense is called for!
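
To put this post's numbers in one place, here's a minimal sketch of the budget math (assuming the leaked 330 W reference TBP, Adrenalin's usual -10%/+15% slider, and a hypothetical ~365 W performance-VBIOS base so that +15% lands near the 420 W figure above):

```python
# Minimal sketch of the connector-budget argument. The 330 W reference TBP
# and the 365 W "performance VBIOS" base are assumptions from the leaks and
# this thread, not confirmed specs.
SLOT = 75     # W available from the PCIe slot
PIN8 = 150    # W per 8-pin connector

def ceiling(n_8pin):
    """Total board-power budget for the slot plus n 8-pin connectors."""
    return SLOT + n_8pin * PIN8   # 375 W for 2x 8-pin, 525 W for 3x 8-pin

def adrenalin_range(base_w, low=-0.10, high=0.15):
    """Configurable TBP window given the usual -10%/+15% power-limit slider."""
    return base_w * (1 + low), base_w * (1 + high)

for base in (330, 365):            # reference vs. hypothetical OC VBIOS
    _, hi = adrenalin_range(base)
    for n in (2, 3):
        verdict = "fits" if hi <= ceiling(n) else "exceeds"
        print(f"{base} W base -> {hi:.1f} W at +15%: {verdict} the {ceiling(n)} W ({n}x 8-pin) budget")
```

Under those assumptions, even the 330 W reference figure nudges past 375 W at +15%, while anything aimed at 400-420 W clearly needs the 525 W (3x 8-pin) budget.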
 
Haha, I wonder if I could sell my 4080 and get almost the same performance for less and spend the money gained elsewhere. Hmm! Not like any pretty path traced games are coming until Max Payne 1+2 gets released.
 
Price isn't really all that hard a concept. Price it for what you think it's worth. If people don't buy in enough volume, just lower the price. I don't run a company, but it's pretty simple logic. We all want to maximize the reward of our efforts, not just "break even."
Exactly. Many people here on the forum forget that companies work for profit, not for market share.

R&D is constant and probably has a budget, like marketing: you keep paying for it so you don't die.
They know how much it costs them to make a card, just like Sony knows how much a PlayStation costs to produce. If they get ideas and try to make an actual profit while they have almost no market share, then they die in PC desktop and remain a cheap solution for Microsoft, Sony, and whatever mobile companies buy some IP from them.
AMD is so far behind, and I'm not talking about games: in the professional space, every plugin or piece of software I use works much better on CUDA.
My question still stands: to know how much money you need to break even, you need to know how many cards you'll sell. How?

I'm going to sound like a broken record, as I've already tried to explain in 2-3 other threads why 3x8-pin makes sense for some AIB variants of the 9070 XT.

If we accept that the reference 9070 XT is 330 W, then the top OC AIB variants, likely with dual VBIOS, could be configured from ~280 W up to 420 W.

For example, the reference 7900 XTX is 355 W.
My 7900 XTX, between the two onboard VBIOSes and Adrenalin's -10%/+15% power limit, can be configured from 316 W up to 467 W.
A ~150 W range.

PCIe + 2x8-pin = 375 W
PCIe + 3x8-pin = 525 W

So tell us: why would any GPU vendor make the stupid move of equipping a 9070 XT with only 2x8-pin if they want to give the GPU a 400-420 W limit on the performance VBIOS for those who want it?
Having a 525 W limit on the connectors doesn't mean you will use it all.
It's stupid to push a 375 W configuration to its limits.

Common sense is called for!
Maybe not everyone wants to overclock? Maybe some cards come with a 100% or maybe 110% maximum power limit?
 
I believe going for very low profit makes some sense when you have almost no market share, but that doesn't mean giving cards away for free.
You also have to remain able to pay for R&D.

How low you can go on price depends on the real cost of the GPU. For example, fab cost, die size, and yields (%) are very important to the final price, beyond the profit.
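
As a purely illustrative sketch of why die size and yield matter so much (every number below is a made-up placeholder, not a real TSMC quote or 9070 XT figure):

```python
# Placeholder numbers only: classic dies-per-wafer approximation plus yield.
import math

def dies_per_wafer(die_area_mm2, wafer_diam_mm=300):
    """Approximate usable dies on a circular wafer (edge loss included)."""
    r = wafer_diam_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diam_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost_usd, die_area_mm2, yield_frac):
    """Wafer cost spread over the dies that actually work."""
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_frac)

# Hypothetical: $12,000 wafer, ~350 mm^2 die, 80% yield
print(round(cost_per_good_die(12_000, 350, 0.80)))
```

With those placeholders a good die lands around $90, before memory, board, cooler, packaging, and margins get added on top, which is why the real cost puts a floor under how low the price can go.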

Exactly. Many people here on the forum forget that companies work for profit, not for market share.


My question still stands: to know how much money you need to break even, you need to know how many cards you'll sell. How?


Maybe not everyone wants to overclock? Maybe some cards come with a 100% or maybe 110% maximum power limit?
You mean:
100% = 330 W
110% = 366 W
…??

Could be, yes…
Or the reference 9070 XT is lower than 330 W.

Adrenalin's power limit, as we know, usually goes from -10% up to +15%, and with PCIe + 2x8-pin you can't have more than 375 W, so 326 W is the actual starting-point limit.

326 W + 15% = 374.9 W
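
A quick check of that arithmetic (assuming the usual +15% Adrenalin cap and the 375 W slot-plus-2x8-pin budget):

```python
# Highest base TBP that still stays under 375 W with a +15% power limit.
budget = 75 + 2 * 150              # PCIe slot + 2x 8-pin = 375 W
base = int(budget / 1.15)          # -> 326 W
print(base, round(base * 1.15, 1)) # -> 326 374.9
```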
 
The 9070XT will be priced like a 7900GRE, but with the performance of a 7900XT or XTX in some cases.
 
You mean:
100% = 330 W
110% = 366 W
…??

Could be, yes…
Or the reference 9070 XT is lower than 330 W.

Adrenalin's power limit, as we know, usually goes from -10% up to +15%, and with PCIe + 2x8-pin you can't have more than 375 W, so 326 W is the actual starting-point limit.

326 W + 15% = 374.9 W
Exactly. I think max and min power limits are controlled by the VBIOS.
 
Exactly. I think max and min power limits are controlled by the VBIOS.
Yeah, I was thinking about that.
Some could have a -10% to +10% configuration, maybe, to stay under 375 W.
 
Yeah, I was thinking about that.
Some could have a -10% to +10% configuration, maybe, to stay under 375 W.
Yep - which is totally fine by me. :)
 