Tuesday, January 28th 2025

AMD Denies Radeon RX 9070 XT $899 USD Starting Price Point Rumors

With the next-generation AMD Radeon RX 9000 series of GPUs, headed by the RX 9070 XT and RX 9070, surrounded by rumors, AMD officials are coming to the rescue. According to a Bulgarian retailer's disclosure, AMD's initial pricing strategy for the new cards caused concerns, given their reported performance levels. The RX 9070 XT was reportedly positioned at around $899, matching the price point of the RX 7900 XT, while the standard RX 9070 was said to carry a $749 price tag. To clarify the situation, AMD's Frank Azor took to the social media platform X and explained, "While we aren't going to comment on all the price rumors, I can say that an $899 USD starting price point was never part of the plan."

Earlier reports indicate AMD has distributed its first wave of RDNA 4 graphics cards to various partners and retailers globally. However, these companies are currently unable to sell the new GPUs, as AMD has apparently set a March timeline for their release. This information gained additional credibility when a retailer in Bulgaria provided insights into AMD's preliminary launch strategy for the RX 9000 series. The retailer demonstrated the PowerColor Red Devil RX 9070 XT, one of three RX 9070 XT models that PowerColor unveiled during CES. While several AMD board partners have completed their RX 9070 XT designs, they have not yet disclosed official specifications or retail prices. Until March, information on pricing strategy will remain limited.
Source: via Wccftech

239 Comments on AMD Denies Radeon RX 9070 XT $899 USD Starting Price Point Rumors

#226
AusWolf
redeye"From that perspective, I'm almost coming around to the idea that AMD is greedier than Nvidia."

What? Nvidia's gross margins are 50 percent higher than AMD's or Apple's. Look it up.
IMO it is not possible for a company to be greedier than a company that has a significantly higher gross margin…
Nvidia isn't greedy because it offers last gen performance for last gen price - no price increase is a positive thing, right? AMD is greedy because... well... the same. They should decrease prices, or offer more performance, obviously. :oops:

People will say anything to justify buying Nvidia. It boggles the mind.
Posted on Reply
#227
Wasteland
JustBenchingNo, they do not. Not for the past 10 years.
AusWolfRDNA 2 is the generation that put AMD back onto the map after GCN. You can disagree with me all you want, but I won't budge on this one.
Yeah, RDNA 2 was the last time AMD had a full stack with competitive perf and sensible pricing/timing, more or less from top to bottom--or it would have been, if not for the crypto-mining boom. RDNA 2 was also the first time in many years (since the 200 series in ~2013?) that AMD had a full stack with all of the above characteristics. Obviously I can't say how things would have played out without the crypto/COVID mess, but RDNA 2 sure looked like a genuine contender on paper, a serious product launch by a serious GPU company.

That was then. It feels like AMD is still punch drunk from that bit of bad luck in 2021. "We tried to do things properly for once, and God smote us, so let's do all sorts of random shit instead!" Of course, COVID wasn't the only bit of bad luck. Their chiplet strategy didn't pan out as well as they'd hoped, either.

I'm clearly more pro-AMD than JustBenching here. I do think AMD makes good products. Unfortunately, everything surrounding those products ranges from mediocre to dumpster fire. What's so frustrating about these discussions is that on the one hand, despite its second-place status, AMD is still one of the most high-tech organizations in the history of human endeavor, staffed with world class engineers. On the other hand, their public behavior is goofy enough to make you forget that fact on a near daily basis. It's as if someone handed a toddler the keys to a Ferrari.
redeyeWhat? Nvidia's gross margins are 50 percent higher than AMD's or Apple's. Look it up.
IMO it is not possible for a company to be greedier than a company that has a significantly higher gross margin…
The point is that Nvidia at least has the excuse that their consumer GPUs could have been sold for much more on the enterprise market. I don't quite subscribe to the popular notion that Nvidia's consumer GPUs represent "charity," but a case could be made. AMD has no such excuse.
Posted on Reply
#228
Vayra86
Jtuck9This sounds interesting. I vaguely remember hearing something about Forza using Ray Tracing for their audio.
Bwhahahaa ray traced audio. Imagine that. What's next, AI shoelaces?
Posted on Reply
#229
Jtuck9
Vayra86Bwhahahaa ray traced audio. Imagine that. What's next, AI shoelaces?
I'm trying to think of what game had the bug where the cables had a mind of their own and started tanking performance.
Posted on Reply
#230
AusWolf
WastelandYeah, RDNA 2 was the last time AMD had a full stack with competitive perf and sensible pricing/timing, more or less from top to bottom--or it would have been, if not for the crypto-mining boom. RDNA 2 was also the first time in many years (since the 200 series in ~2013?) that AMD had a full stack with all of the above characteristics. Obviously I can't say how things would have played out without the crypto/COVID mess, but RDNA 2 sure looked like a genuine contender on paper, a serious product launch by a serious GPU company.

That was then. It feels like AMD is still punch drunk from that bit of bad luck in 2021. "We tried to do things properly for once, and God smote us, so let's do all sorts of random shit instead!" Of course, COVID wasn't the only bit of bad luck. Their chiplet strategy didn't pan out as well as they'd hoped, either.
Chiplets are all about cost savings: when a chip turns out to be defective, you throw away a smaller die instead of a big one - nothing else. Whoever expected AMD to introduce chiplets into the GPU space and dominate Nvidia a million times over lives in la-la land. Sure, it didn't work out because they couldn't divide the compute die into smaller chunks, but at least they tried something new, which deserves a point in my book.
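To put rough numbers on that yield argument, here's a quick back-of-the-envelope sketch using a simple Poisson yield model. The defect density and die areas are illustrative assumptions (loosely inspired by Navi 31's roughly 300 mm² GCD and 37 mm² MCDs), not real foundry data, and it ignores packaging costs and the extra area spent on interconnect:

```c
#include <math.h>
#include <stdio.h>

/* Simple Poisson yield model: Y = exp(-A * D0).
   All numbers below are illustrative assumptions, not real foundry data. */
static double poisson_yield(double area_mm2, double defects_per_cm2)
{
    return exp(-(area_mm2 / 100.0) * defects_per_cm2);
}

int main(void)
{
    const double d0 = 0.1; /* assumed defect density, defects per cm^2 */

    double mono = poisson_yield(530.0, d0); /* hypothetical monolithic die */
    double gcd  = poisson_yield(304.0, d0); /* compute die (GCD-sized)     */
    double mcd  = poisson_yield(37.5,  d0); /* one cache/memory die (MCD)  */

    printf("monolithic ~530 mm^2 die yield: %4.1f%%\n", mono * 100.0);
    printf("~304 mm^2 GCD yield:            %4.1f%%\n", gcd * 100.0);
    printf("~37.5 mm^2 MCD yield:           %4.1f%%\n", mcd * 100.0);
    /* Smaller dies mean each defect scraps far less silicon. */
    return 0;
}
```

Even with made-up inputs, the direction is clear: the smaller the die, the less silicon each defect scraps (compile with gcc -lm).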
WastelandI'm clearly more pro-AMD than JustBenching here. I do think AMD makes good products. Unfortunately, everything surrounding those products ranges from mediocre to dumpster fire. What's so frustrating about these discussions is that on the one hand, despite its second-place status, AMD is still one of the most high-tech organizations in the history of human endeavor, staffed with world class engineers. On the other hand, their public behavior is goofy enough to make you forget that fact on a near daily basis. It's as if someone handed a toddler the keys to a Ferrari.
What I don't understand is if we have people like yourself saying that AMD makes good products, then why on Earth do we have to talk about their "public behaviour"? Yes, they're goofy, yes they're awkward on stage, their marketing is f*ed up beyond measure, but who the heck cares if the product is good? I pay to be able to play games, not to watch a CEO waffle some shit in front of a bunch of journalists.
WastelandThe point is that Nvidia at least has the excuse that their consumer GPUs could have been sold for much more on the enterprise market. I don't quite subscribe to the popular notion that Nvidia's consumer GPUs represent "charity," but a case could be made. AMD has no such excuse.
That's not much of a case. The consumer and enterprise markets are entirely different. Besides, AMD has enterprise stuff as well. I've heard their Instinct MI stuff isn't half bad (I don't know, I've just heard).
Posted on Reply
#231
JustBenching
WastelandYeah, RDNA 2 was the last time AMD had a full stack with competitive perf and sensible pricing/timing, more or less from top to bottom--or it would have been, if not for the crypto-mining boom. RDNA 2 was also the first time in many years (since the 200 series in ~2013?) that AMD had a full stack with all of the above characteristics. Obviously I can't say how things would have played out without the crypto/COVID mess, but RDNA 2 sure looked like a genuine contender on paper, a serious product launch by a serious GPU company.

That was then. It feels like AMD is still punch drunk from that bit of bad luck in 2021. "We tried to do things properly for once, and God smote us, so let's do all sorts of random shit instead!" Of course, COVID wasn't the only bit of bad luck. Their chiplet strategy didn't pan out as well as they'd hoped, either.

I'm clearly more pro-AMD than JustBenching here. I do think AMD makes good products. Unfortunately, everything surrounding those products ranges from mediocre to dumpster fire. What's so frustrating about these discussions is that on the one hand, despite its second-place status, AMD is still one of the most high-tech organizations in the history of human endeavor, staffed with world class engineers. On the other hand, their public behavior is goofy enough to make you forget that fact on a near daily basis. It's as if someone handed a toddler the keys to a Ferrari.


The point is that Nvidia at least has the excuse that their consumer GPUs could have been sold for much more on the enterprise market. I don't quite subscribe to the popular notion that Nvidia's consumer GPUs represent "charity," but a case could be made. AMD has no such excuse.
I wouldn't say they were competitive from top to bottom with RDNA 2, but they surely had some really compelling options. The 6800 (non-XT) especially was a killer product. It cost more than the 3070, but I'd gladly pay that for 2x the VRAM and faster raster. Mining killed their momentum, though.
Posted on Reply
#232
Jtuck9
AusWolfChiplets are all about cost savings: when a chip turns out to be defective, you throw away a smaller die instead of a big one - nothing else. Whoever expected AMD to introduce chiplets into the GPU space and dominate Nvidia a million times over lives in la-la land. Sure, it didn't work out because they couldn't divide the compute die into smaller chunks, but at least they tried something new, which deserves a point in my book.
I can't seem to find the article I read, but AMD were mentioning how it provides a lot of flexibility to tailor chips (or configurations) to use cases. Now that they seemingly have communication between CCUs on the Halo chip, with DeepSeek taking advantage of parallelization (something apparently CUDA can't do), and with neural rendering on the horizon, I'm interested to see what they have planned for their next architecture.
Posted on Reply
#233
Wasteland
AusWolfWhat I don't understand is if we have people like yourself saying that AMD makes good products, then why on Earth do we have to talk about their "public behaviour"? Yes, they're goofy, yes they're awkward on stage, their marketing is f*ed up beyond measure, but who the heck cares if the product is good? I pay to be able to play games, not to watch a CEO waffle some shit in front of a bunch of journalists.
We all have an interest in healthy competition. I want AMD to stop stepping on its own dick. What I don't understand is why you'd defend them when they make avoidable errors. "B-But Nvidia is bad too" doesn't move me at this point. I have dozens of posts shitting on Nvidia. I'm a Linux cultist, after all. But AMD is not some battered wife or tiny baby in need of coddling. It is a multi-billion-dollar high-tech company with some of the world's most brilliant engineers, who by the way are probably more annoyed than I am with corporate/marketing's mistakes.
AusWolfThat's not much of a case. The consumer and enterprise markets are entirely different. Besides, AMD has enterprise stuff as well. I've heard their Instinct MI stuff isn't half bad (I don't know, I've just heard).
AMD uses different chips for enterprise vs consumer. Nvidia does not. That is the point. AMD will be merging their architectures into the upcoming UDNA. Hopefully that will be successful.
JustBenchingI wouldn't say they were competitive from top to bottom with RDNA 2, but they surely had some really compelling options. The 6800 (non-XT) especially was a killer product. It cost more than the 3070, but I'd gladly pay that for 2x the VRAM and faster raster. Mining killed their momentum, though.
Yeah, banger of a card. I have one. There still isn't a compelling upgrade option for it, and it looks like there won't be for some years to come. Not that I'm in the market anyway. I don't play demanding games often enough to care. Frankly every time I look at the AAA space, I wonder why anyone does.
Posted on Reply
#234
AusWolf
Jtuck9I can't seem to find the article I read, but AMD were mentioning how it provides a lot of flexibility to tailor chips (or configurations) to use cases. Now that they seemingly have communication between CCUs on the Halo chip, with DeepSeek taking advantage of parallelization (something apparently CUDA can't do), and with neural rendering on the horizon, I'm interested to see what they have planned for their next architecture.
It provides a lot of flexibility on CPUs that don't need a super tight latency between cores. Not so much on GPUs. That's why their CPU business is booming while RDNA 3 as a whole generation was just a bit meh despite all the money they pumped into making chiplets work.
WastelandWe all have an interest in healthy competition. I want AMD to stop stepping on its own dick. What I don't understand is why you'd defend them when they make avoidable errors. "B-But Nvidia is bad too" doesn't move me at this point. I have dozens of posts shitting on Nvidia. I'm a Linux cultist, after all. But AMD is not some battered wife or tiny baby in need of coddling. It is a multi-billion-dollar high-tech company with some of the world's most brilliant engineers, who by the way are probably more annoyed than I am with corporate/marketing's mistakes.
I'm not defending the company. I'm defending their products which you yourself admitted were good. That's all I care about - products. Dr Su herself could come on stage during the next show and say that RDNA 4 is shit and no one should buy it, but I don't care. If it's good, I'll buy it. Marketing is for idiots.

Same goes for Nvidia (despite all the crap I've given them lately) - if they produce something solid, they'll get my vote (again). The problem is that they don't. They keep pushing the same architecture again and again for a higher price, hidden behind more smoke and mirrors with every gen. AMD is at least trying (even if they fail sometimes), but Nvidia clearly doesn't give a crap about gaming.

Oh, and I'm a (recent) Linux cultist as well thanks to Bazzite. :)
WastelandAMD uses different chips for enterprise vs consumer. Nvidia does not. That is the point. AMD will be merging their architectures into the upcoming UDNA. Hopefully that will be successful.
That's a fair point.
Posted on Reply
#235
TSiAhmat
[Took this from another post and migrated it here, because it was a bit off topic in the other thread]

The price difference between the cheapest 7900 XTX and the 4080 Super/5080 is pretty big right now in my region (VAT included):

The cheapest 4080 Super is 1132 € (and the price is steadily increasing).

The cheapest 5080 will be 1169 €.

Currently, the (almost) cheapest 7900 XTX is 899 €.

There is now a roughly 230 € price difference between the cheapest 4080 Super/5080 and the 7900 XTX (where three months ago it was a 100 € difference).

Seems like Nvidia really doesn't care about the 7900 XTX, or they want to signal to AMD: "Hey, you don't want to increase P/P this gen, right? Just price your next card linearly (P/P) against your old stack (7000 series)."

Either way, this gen looks rather bad, pretty sad. Hopefully I am wrong. *pls*

Edit: Also, the 899 price makes absolutely no sense if the 7900 XTX costs the same right now, unless they value FSR 4 and lower power draw highly.
Posted on Reply
#236
AnotherReader
AusWolfIt provides a lot of flexibility on CPUs that don't need a super tight latency between cores. Not so much on GPUs. That's why their CPU business is booming while RDNA 3 as a whole generation was just a bit meh despite all the money they pumped into making chiplets work.


I'm not defending the company. I'm defending their products which you yourself admitted were good. That's all I care about - products. Dr Su herself could come on stage during the next show and say that RDNA 4 is shit and no one should buy it, but I don't care. If it's good, I'll buy it. Marketing is for idiots.

Same goes for Nvidia (despite all the crap I've given them lately) - if they produce something solid, they'll get my vote (again). The problem is that they don't. They keep pushing the same architecture again and again for a higher price, hidden behind more smoke and mirrors with every gen. AMD is at least trying (even if they fail sometimes), but Nvidia clearly doesn't give a crap about gaming.

Oh, and I'm a (recent) Linux cultist as well thanks to Bazzite. :)


That's a fair point.
Chiplets are good for yields and reducing costs, but they also allow scaling to higher performance than a single die: see the MI300X. As for latency, GPUs actually don't have better core to core latency than CPUs.


Contrast GPUs' core to core latency with CPUs: latencies range from sub 20 ns to 80 ns for Zen 4. On the same CCD, latencies are much lower than GPUs could ever dream of.
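For context on how those CPU core-to-core figures are usually obtained, here's a minimal ping-pong sketch between two pinned threads; it's Linux-only, uses hypothetical core IDs 0 and 1, and is nowhere near as careful as a proper benchmark. Two threads bounce an atomic flag, and half the average round-trip time approximates the one-way latency:

```c
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdatomic.h>
#include <stdio.h>
#include <time.h>

#define ITERS 1000000

static atomic_int flag = 0; /* 1 = ping from main, 2 = pong from responder */

static void pin_to_core(int core)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

static void *responder(void *arg)
{
    pin_to_core(*(int *)arg);
    for (int i = 0; i < ITERS; i++) {
        while (atomic_load_explicit(&flag, memory_order_acquire) != 1)
            ; /* spin until the ping arrives */
        atomic_store_explicit(&flag, 2, memory_order_release); /* pong */
    }
    return NULL;
}

int main(void)
{
    int other_core = 1; /* hypothetical core ID; pick cores on the CCDs you want to compare */
    pthread_t t;
    pthread_create(&t, NULL, responder, &other_core);
    pin_to_core(0);

    struct timespec start, end;
    clock_gettime(CLOCK_MONOTONIC, &start);
    for (int i = 0; i < ITERS; i++) {
        atomic_store_explicit(&flag, 1, memory_order_release); /* ping */
        while (atomic_load_explicit(&flag, memory_order_acquire) != 2)
            ; /* spin until the pong comes back */
    }
    clock_gettime(CLOCK_MONOTONIC, &end);
    pthread_join(t, NULL);

    double ns = (end.tv_sec - start.tv_sec) * 1e9 + (end.tv_nsec - start.tv_nsec);
    printf("approx. one-way core-to-core latency: %.1f ns\n", ns / ITERS / 2.0);
    return 0;
}
```

Running it once with both threads on the same CCD and once across CCDs is what produces the kind of sub-20 ns vs. ~80 ns spread described above (compile with gcc -O2 -pthread).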
Posted on Reply
#237
AusWolf
AnotherReaderChiplets are good for yields and reducing costs, but they also allow scaling to higher performance than a single die: see the MI300X. As for latency, GPUs actually don't have better core to core latency than CPUs.


Contrast GPUs' core to core latency with CPUs: latencies range from sub 20 ns to 80 ns for Zen 4. On the same CCD, latencies are much lower than GPUs could ever dream of.
I see. But what about splitting the GPU into chiplets? As far as I recall, AMD experimented with that on RDNA 3, and that introduced unwanted latency. That's why they ended up cutting only the cache/memory controllers off of the main die in the end.
Posted on Reply
#238
AnotherReader
AusWolfI see. But what about splitting the GPU into chiplets? As far as I recall, AMD experimented with that on RDNA 3, and that introduced unwanted latency. That's why they ended up cutting only the cache/memory controllers off of the main die in the end.
That apparent increase was an error due to AMD's aggressive power saving on the GCD-to-MCD link. There's a latency difference, but it isn't as dramatic as initial tests suggested.
We see a similar pattern with vector accesses, where measured Infinity Cache latency dropped from 199 ns to 150.3 ns with the card in a higher power state.
For the monolithic 7600, power saving didn't impact it as badly.
Vector accesses show similar behavior, with just a 9.5% Infinity Cache latency difference depending on whether the Infinity Fabric was in power saving state.
Posted on Reply
#239
Bronan
Hecate91When AMD does have a design win, it sells well, but Nvidia still outsold them while AMD continued to lose market share. I don't know how much consistency would help Radeon, as they would have to offer something to attract customers that Nvidia doesn't have, since Nvidia has been successful in selling software at a significant premium. Before RDNA, AMD did have consistency, and it didn't seem to help them enough.
However, the CPU market isn't the same as the GPU market. AMD would still have to overcome the software features which are proprietary to Nvidia, and compete on gaming when most of those features run better on Nvidia hardware. AMD would have to launch faster cards than Nvidia for several generations, and have better RT and upscaling, in order to win over the market the way Ryzen has. AMD would also have to get away from the old stigma of bad drivers and the complaints of their cards being too expensive; AMD always has to undercut to the point of having low margins, and there can't be consistency when there is likely barely enough R&D budget to develop the next product.
Nvidia also gets away with a lot of anti-consumer things like GPP, handing game devs piles of money to develop games with features only available to Nvidia users, marketing lies like saying the 5070 is faster than a 4090, or the MSRPs being completely fake, as the MSRP only applies to the FE cards, which are usually purposely supply-limited.
AMD has always been seen as the underdog, even though their products have been pretty good in recent years.
But they also need more capacity to build the AI GPU clusters (supercomputers) that companies use, because they do well on that front as far as I know.
The info from within AMD has dried up for me, as the friend who worked there has retired; he knew every secret and never ever told any of it.
That of course did not stop me from trying to get the latest and greatest, but he always waited till the secret had already leaked :laugh:
Posted on Reply