Monday, January 27th 2025

Bulgarian Retailer Showcases PowerColor Radeon RX 9070 XT Red Devil S.E. Packaging
Gplay.BG's YouTube channel uploaded a fascinating video feature over the past weekend—providing another look at PowerColor's Radeon RX 9070 XT Red Devil special edition retail package. The Bulgarian retailer's CEO—Ivan Hinov (aka DonBrutar)—appeared to have a sealed box in hand. Gplay's presentation implies that they have joined the ranks of other European shops with RDNA 4-based cards in stock, although VideoCardz reckons that special/limited edition Red Devil bundles (of recent generations) are normally distributed to media outlets. Hinov repeatedly referred to one of VideoCardz's recent news articles—regarding a speculated AMD Radeon RX 9070 GPU series launch window. Industry insiders reckon that AMD had initially formed a release strategy focusing on late January, possibly the 23rd. According to an official Team Red statement, the new cards will now launch around March.
Gplay's video provides some extra insight on this topic—Hinov confirms (in a roundabout way) that his company received information about a January release window, prior to Team Red's announcement of a postponement. The VideoCardz insider network discovered possible launch MSRPs of: "around $899 for the RX 9070 XT, and $749 for the non-XT." Interestingly, Gplay's chief commented on these rumors during his comparison segment: "[The] delay of the Radeon RX 9070 (non-XT) and Radeon RX 9070 XT has created uncertainty. These cards were expected to launch at prices significantly higher than the Radeon RX 7800 XT and close to the Radeon RX 7900 XT, which makes little sense. For example, the Radeon RX 9070 XT was rumored to cost 500 BGN (~$269 USD) more than the RX 7900 XT while offering only marginally better performance. This pricing strategy was a clear mistake." Fast forward to the nine-minute mark of GPlayTV's video:
GPlayTV's description states (via machine translation): "DonBrutar is ready to unravel the mysteries surrounding the current and new series of AMD video cards."
Sources:
GplayTV YouTube Channel, VideoCardz
18 Comments on Bulgarian Retailer Showcases PowerColor Radeon RX 9070 XT Red Devil S.E. Packaging
Not saying Gplay is lying, but perhaps he's been fed wrong information.
Edit: For clarification, I still think AMD got caught with their pants down by Nvidia but I don't think AMD was seriously considering pricing the cards like that.
I'd say it's more likely that they were considering something like a $650 MSRP and now have to lower it to about $550 to remain somewhat price competitive.
Could be a 5Head play in an attempt to obfuscate the actual originally planned price.
I say that because, by all other accounts, that's the rumored (base) pricing. Not to say it's true, but it makes sense.
The only way the implied pricing works is if nVIDIA had priced the RTX 50 series like the original RTX 40 series (in terms of the 80/70 Ti/70).
That was never going to happen imho, given it's the same node and the original 4080-and-below pricing was not received well.
Perhaps AMD thought the stack would be priced based on the value of 24GB of GDDR7 for 5080, which is possible, but according to my math even at $1000 that still likely leaves >70% typical margin.
With 16GB nVIDIA is making an absolute killing on 5080, but also may let them price the cards more 'competitively' while still making a killing. Also, not destroy every partner they still have left.
FWIW, nVIDIA's typical margin is >70% while AMD's is >40%. Both CEOs have been commended for this fact.
IMHO, the market doesn't like AMD having over a 50% margin (60% at most), nor (as they correctly surmised, judging by interviews at CES) does it want a $1000 card from them. ~$800 yes, but only if it's good value.
Whereas for nVIDIA the sky is the limit, especially given their willingness to use less RAM (relative to compute capability); AMD will not, and will sell their cards based on compute OR current RT.
Whichever is higher.
It's why I feel like AMD should be aiming to beat the cut-down 80 every generation (with an OC to roughly a stock 80), not just value; with a weird stack where we know both cards will overall be better than the 70.
I personally think they would have been better off with a higher-clocked 7168sp card at <$550 to beat 5070 in value/OCing and a 24Gbps card priced between 5070/5070Ti, but I'm not their marketing team.
Instead we get a card that probably literally replaces the 7800xt/gre for cheaper, and a </~ 7900xt for cheaper (less ram, but it's fine given performance).
This puts them in the pickle of a 3.2+/24gbps card being worth less relatively; say $600 instead of $650...but they may still price in toward the moon. Then nobody will buy it instead of the cheaper cards.
I think had they released something higher-clocked right away, the market would have loved it, as that's really the max performance applicable to 16GB, and it would have made the 5070 Ti/5080 look ridiculous...
.... same as 12GB 5070 will vs whatever.
Instead we'll probably end up with cheaper/better value alternatives to 5070/5070ti and who knows if they'll ever actually release the 5070ti/5080 competitor.
I can't get over how weird it is if 9070xt is indeed using 20gbps ram, although I will grant the >20gbps ootb OC model hints they may be using something better...
....given greater-than-stock memory OC is almost never allowed.
It literally makes no sense (versus overclocking a 9070 or 7800xt) if 20gbps. It appears like something created by marketing to look good versus 5070 stock and a stock 5070ti (when overclocked).
9070 better value ~5070 when OC.
That said, if that's what you're after (more than the limitation of the 5070's 12GB and/or cheaper), or without the fluff/anemic RAM of the 5080 (whichever your view), I don't think these cards will be a bad deal.
It's cool that AMD is likely going to market a very good price, and to some people that may be more important, but to me I would've liked them to put up a fight (even if slightly more expensive).
Appears pretty obvious they need to clear stock of N31...but that's not our problem. I also think the 7800xt was a very good card that's tough to follow up on 4/5nm, but that's also not our problem.
I don't know why AMD does this (I mean, cheaper is good...but they want margin and people want better-performing cards ootb...and the stock clocks don't mesh with this). There is a better way.
I could imagine a 11520sp/256-bit/24GB card on 3nm with a ~/<$500 BOM (so if price ever drops to $700 there's still a 40% margin) competing with a full ($1000) OR beating a cut-down 6080 (~$750-800?).
If nVIDIA targets slightly better performance (12288sp+?), AMD would be a better value. If nVIDIA targets similar performance, AMD would be a MUCH better value.
If nVIDIA tries to make the 6080 better than really needed for the tier and 6070ti worse (<11520sp/24GB, as is the case with current Ti models), AMD would be the better alternative.
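On the margin arithmetic in the BOM example above: "$500 BOM, $700 price, 40% margin" reads as markup over cost; gross margin at the same price is lower. A quick sketch of the distinction (illustrative numbers only):

```python
def markup(price, cost):
    """Profit as a fraction of cost (what '$500 BOM -> $700 price = 40%' implies)."""
    return (price - cost) / cost

def gross_margin(price, cost):
    """Profit as a fraction of selling price (how margins are usually quoted)."""
    return (price - cost) / price

# $500 BOM sold at $700: 40% markup over cost, but only ~28.6% gross margin.
print(f"markup: {markup(700, 500):.1%}")
print(f"gross margin: {gross_margin(700, 500):.1%}")
```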
(I also think this design makes sense for a PS6, but with denser libraries, because PlayStations run around threshold voltage/lowest-yield clocks to save power/die space and keep cost down.)
To me, for AMD, this is the way.
The part that makes me laugh is when people realize the PS6 will probably be just over the maximum potential of N48 and perhaps often use more than 16GB of ram on avg (a limitation of 5080).
Current PS5 pro uses a ~15000/3000mhz split for GPU/CPU memory. If PS6 uses a 27000/5000mhz split (32GB @ 32gbps), the PS6 could have a ~4ghz(+) cpu ('c' cores?) and up to 60TF on the GPU.
That would equate to 11264sp (or 5632sp if you're 'that guy'/Cerny) at 3nm threshold voltage/clocks (2664-2671mhz), ~60TF (for power/yields), while the extra bandwidth to the CPU would be similar to V-Cache.
27gbps is faster than anything N48 could muster. 60TF is likely faster than the core could handle (that'd be close to ~3700mhz), even with a ton of voltage. But, ofc, similar compute/RT to a 5080.
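The console math above can be sanity-checked with the usual rules of thumb: FP32 TFLOPS = shaders × ops per clock × clock, and bandwidth = bus width × effective data rate. A minimal sketch (the 2 ops/clock assumption and the 27/5 Gbps split are taken from the comment, not from any official spec):

```python
def fp32_tflops(shaders, clock_ghz, ops_per_clock=2):
    # FP32 throughput: shaders * ops per clock * clock (GHz) -> TFLOPS
    return shaders * ops_per_clock * clock_ghz / 1000

def bandwidth_gbs(bus_bits, data_rate_gbps):
    # Memory bandwidth in GB/s: bus width (bits) / 8 * effective rate (Gbps)
    return bus_bits / 8 * data_rate_gbps

print(fp32_tflops(11264, 2.664))  # ~60 TF at the quoted threshold clock
print(bandwidth_gbs(256, 27))     # GPU slice of the 32 Gbps pool: 864 GB/s
print(bandwidth_gbs(256, 5))      # CPU slice: 160 GB/s
```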
I think this is the design they'll go for, because if Cerny is the type of guy I think he is, he'll know 11264 (potentially 88 of 90 CUs) is an efficiency wet dream. I think we both think like that (as absolute nerds).
Also, 32gbps will likely be cheap as I don't think many 3nm cards will use it, while yields/pricing should be favorable given all companies can and will make it (on older process lines).
Hell, you see 28gbps OC to 34gbps (and gated). That's likely bc next-gen will likely use 36gbps and higher on the PC market.
I also think an 18 (GPU)/14GB split makes the most sense as an allocation (given current game usage in those areas), perhaps with an expanded subsystem for the OS (4GB?),
but it's always possible < 32GB could be available to devs.
That's why I think this generation is a cash-grab...all-around. Hold onto your 4090's though.
The potential of a 192-bit 18GB high-clocked design will probably mesh well versus the PS6 (a low-clocked 256-bit design),
whereas a high-clocked 256-bit/24GB design would match a 4090 for 'acceptable' enthusiast pricing, lasting a long time.
It's possible they may avoid these designs on 3nm, taking the over/under so people will keep buying graphics cards in future generations...
but I would hope at least one company shoots for the goal and good efficiency/pricing.
N48 and GB203 (16GB) don't match either of these criteria, which is why I give them a pass.
I do like the potential pricing/value on the low-end (6070 vanilla) and maxing out the potential of 16GB on the high-end...if good price.
I just think the days of 16GB being enough for anything over vanilla 1440p (w/o RT/FG/upscaling to 4k etc.) are numbered. Obviously not everyone cares about that, but I think enough do/should.
The math for the 9070 is similar:
1800 BGN -> ~906 EUR; without VAT, 725 EUR; before tariff, 580 EUR
that is with exact exchange rates for the currencies, and assumes the stated BGN prices are the final prices a buyer will pay (with shop margin included)
edit:
and the video ends with the conclusion that the 7000 series is the best price/performance and you can buy it now, so line up and give us your money so we can sell off the old stuff before the new one arrives :rolleyes:
P.S. I meant normal shops, not individual scalpers
Here the question is how AMD's 9070 MSRP can be deduced from the prices stated in the video. I don't think he would state a price in the video that isn't the final retail price, so his stated 2000/1800 BGN should include retailer margins. The 675/580 EUR I calculated also includes that retailer margin.
I'm not sure whether there is a tariff on importing electronics in bulk; I'm sure there is none for individual pieces. But if I'm wrong about the tariff and AMD's MSRP was 799 EUR (or 779?) for the 9070 XT and ~699 EUR for the 9070 (0% tariff), then AMD's decision to delay the 9070 launch until they figure out how to match nVIDIA's prices makes much more sense.
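A minimal sketch of that kind of back-calculation, assuming the fixed BGN/EUR peg (1.95583), Bulgaria's standard 20% VAT, and a purely hypothetical 15% retailer margin; the resulting figures differ a little from the ones quoted in the comments because of the rates assumed here:

```python
BGN_PER_EUR = 1.95583   # BGN is pegged to the euro at this rate
VAT = 0.20              # Bulgaria's standard VAT rate
RETAILER_MARGIN = 0.15  # hypothetical; not stated in the video

def implied_msrp_eur(shelf_price_bgn, vat=VAT, margin=RETAILER_MARGIN):
    eur = shelf_price_bgn / BGN_PER_EUR  # convert at the peg
    eur /= 1 + vat                       # strip VAT
    eur /= 1 + margin                    # strip the assumed retailer markup
    return round(eur)

print(implied_msrp_eur(1800))  # 9070 shelf price from the video
print(implied_msrp_eur(2000))  # 9070 XT shelf price from the video
```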