Wednesday, January 15th 2025

AMD Radeon RX 9070 XT & RX 9070 Custom Models In Stock at European Stores

AMD's board partners flaunted their new Radeon RX 9070 XT and RX 9070 custom models at last week's CES trade event, but no one expected retail units to pop up any time soon after the Las Vegas showcase concluded. Earlier today, a brave soul uploaded compelling new evidence on Team Red's subreddit—they claim they were surprised to see the "early" delivery of Radeon RX 9070 XT and RX 9070 graphics card stock. Uploaded photos seem to show several boxed Sapphire Pulse models sitting in an Israeli computer store's stockroom. This leak has semi-ruined Sapphire's staggered, Pulse-oriented marketing campaign—only yesterday, a teaser image emerged via an official social media post.

Industry watcher, momomo_us, has gathered proof of GIGABYTE Radeon RX 9070 XT GAMING OC 16G and Radeon RX 9070 GAMING OC 16G model stock reaching Danish shores. According to VideoCardz, Føniks Computer's online store had at least four units available for purchase and immediate shipping (same business day). Entries for the two models have also appeared on Geizhals—this German price comparison engine lists January 24 as a market launch date. This information could be subject to change—AMD is likely still working on finalizing release window parameters. After all, recent pre-launch leaks have contained incomplete data and errors. It should be noted that NVIDIA's GeForce RTX 50 series is set to hit international markets on January 30—is Team Red planning to pre-empt this rollout?
Sources: Foniks Computer DK, momomo_us, VideoCardz, Geizhals DE, AMD Reddit

226 Comments on AMD Radeon RX 9070 XT & RX 9070 Custom Models In Stock at European Stores

#176
Hecate91
AusWolf: Open world games with dynamic daytime and weather effects have existed long before RT was invented.
It's baffling to me how people think dynamic lighting in games didn't exist until RT was added, and IMO a game doesn't need RT to look good.
I can't tell much of a difference from those screenshots, besides the RT image being a bit shinier with some extra lighting, and PT having a bit more lighting, though probably not worth the performance hit. Most AAA game studios are using RT as a lazy approach to adding lighting.
AusWolf: Does that matter?
I think it only matters if you care about the feature, and most people don't. If the stats say they do, I'd guess most people just leave the default settings on; some of the latest titles force RT on by default.
Posted on Reply
#177
AusWolf
JustBenching: And they are all flat in the open world non scripted scenes.
What do you mean flat? Have you played TES4: Oblivion, for example? My jaw dropped looking at its weather effects back in 2006 (I think it was 2006).
JustBenching: That's like asking me if HDR matters. Of course.
I didn't ask "does it look nice". I asked "does it matter" as in "is it such a big difference that it makes or breaks a game". ;)
Posted on Reply
#178
Zach_01
AusWolf: I didn't ask "does it look nice". I asked "does it matter" as in "is it such a big difference that it makes or breaks a game". ;)
IMHO
RT: maybe when lighting, shadows and reflections are obviously better.
PT: not yet, far from calling a game broken without it.
Posted on Reply
#179
JustBenching
AusWolf: What do you mean flat? Have you played TES4: Oblivion, for example? My jaw dropped looking at its weather effects back in 2006 (I think it was 2006).
By flat I mean there isn't much differentiation between objects in a scene (non-scripted scene). Because there can't be; nobody is crazy enough to fine-tune lighting in every possible scene in an open-world game. I gave you an example above: the bottles and Panam's face look flat - it's the exact same from top to bottom without RT.
AusWolf: I didn't ask "does it look nice". I asked "does it matter" as in "is it such a big difference that it makes or breaks a game". ;)
Well, better is better. If there were no performance penalty, for example, we wouldn't even be arguing here, right? Everybody would be using it. We can argue the same about textures: does ultra versus high make such a difference? No, it makes even less of a difference. Yet people are going apeshit about the 4070's VRAM.
Posted on Reply
#180
Zach_01
JustBenching: If there were no performance penalty, for example, we wouldn't even be arguing here, right? Everybody would be using it.
Of course not. The argument exists only because of the perf penalty, and it's legit. Again, for me, that's about PT.
Posted on Reply
#181
JustBenching
Zach_01: Of course not. The argument exists only because of the perf penalty, and it's legit. Again, for me, that's about PT.
Well, not really; if you read the rest of the comments, people are saying RT off looks better. Which is wild.
Posted on Reply
#182
Zach_01
JustBenching: Well, not really; if you read the rest of the comments, people are saying RT off looks better. Which is wild.
Each to his own.

But let's see what the 9070s can do first, as the topic is about 9070 stock in the EU...
I'm betting some AIB variants will be north of 400W. Let's see how horribly efficient RDNA4 can be at those levels.
I keep my hopes low, as history has taught us, but you never know... It's a brand-new architecture.
Posted on Reply
#183
AusWolf
JustBenching: By flat I mean there isn't much differentiation between objects in a scene (non-scripted scene). Because there can't be; nobody is crazy enough to fine-tune lighting in every possible scene in an open-world game. I gave you an example above: the bottles and Panam's face look flat - it's the exact same from top to bottom without RT.
Call me blind, but I just had another look at your Cyberpunk screenshots, and I really don't know what you mean.
JustBenching: Well, better is better. If there were no performance penalty, for example, we wouldn't even be arguing here, right? Everybody would be using it.
But there is a performance penalty, and a massive one, that's why we're discussing it. ;)
JustBenching: We can argue the same about textures: does ultra versus high make such a difference? No, it makes even less of a difference. Yet people are going apeshit about the 4070's VRAM.
Well, I'm actually in the "textures to the max forever" camp (they matter a lot more than lights do, imo), but ultra to high is probably not a difference worth arguing about.
JustBenching: Well, not really; if you read the rest of the comments, people are saying RT off looks better. Which is wild.
Personally, I'm saying that RT looks different. Whether that means something better, or just different, is up to the game and how it uses it.
Posted on Reply
#184
remekra
So, to conclude, we should just stick with baked lighting and probes and not progress at all.
I can understand that RT in some games, and in almost all games at the beginning, was just "shiny puddles" and nothing more. But to claim that path tracing looks worse than raster in that image is just wild, or manipulation, or lying to yourself.

Truth is, AMD was wrong not to focus more on RT from the beginning, with RDNA1 or 2. It doesn't matter that Turing was bad at RT; NVIDIA still had the tech out and could iterate on it. AMD should have done the same with RDNA1, and then, with RDNA2 (which is in the consoles), really improved it and kept up with NVIDIA. We would have a lot better implementations of RT now thanks to it, since devs would use the tech on consoles as well. People would not have the argument that AMD is bad at RT, AMD could sell more GPUs, and maybe RDNA4 would actually have a 9080/9090 instead of just a 9070.

AMD knows this; that's why they focused on AI and RT in RDNA4. Sony knows this too, which is why the PS5 Pro got RDNA4's RT.
Posted on Reply
#185
Lauri
At least it won't be a paper launch.
Posted on Reply
#186
AusWolf
remekra: So, to conclude, we should just stick with baked lighting and probes and not progress at all.
I can understand that RT in some games, and in almost all games at the beginning, was just "shiny puddles" and nothing more. But to claim that path tracing looks worse than raster in that image is just wild, or manipulation, or lying to yourself.
It's definitely not worse - but not necessarily better in every case. It's highly dependent on the game and its implementation of the tech.
remekra: Truth is, AMD was wrong not to focus more on RT from the beginning, with RDNA1 or 2. It doesn't matter that Turing was bad at RT; NVIDIA still had the tech out and could iterate on it. AMD should have done the same with RDNA1, and then, with RDNA2 (which is in the consoles), really improved it and kept up with NVIDIA. We would have a lot better implementations of RT now thanks to it, since devs would use the tech on consoles as well. People would not have the argument that AMD is bad at RT, AMD could sell more GPUs, and maybe RDNA4 would actually have a 9080/9090 instead of just a 9070.

AMD knows this; that's why they focused on AI and RT in RDNA4. Sony knows this too, which is why the PS5 Pro got RDNA4's RT.
That I agree with.
Posted on Reply
#187
DudeBeFishing
JustBenching: I think that's not accurate. See below: RT off, RT on, PT on. Lots of differences between the 3.

I prefer RT off because I can actually see the scene. RT and PT need to have less of an effect; dark areas shouldn't turn pitch black. Even then, ambient occlusion needs to be off. I haven't seen a game that does it right: it looks like dog hair piled in the corners, with dark outlines on every object. Weaker PT effects with ambient occlusion off would look perfect.
Posted on Reply
#188
wheresmycar
JustBenching: I think that's not accurate. See below: RT off, RT on, PT on. Lots of differences between the 3.

PT looks good. The image itself doesn't do it justice.
Posted on Reply
#189
umeng2002
Path tracing is groundbreaking in games. Images online don't capture the improvement well enough. That said, if you don't use RT features, upscaling and frame generation shouldn't be needed.
Posted on Reply
#190
King_Gedoorah
Intervention: 9,299 kr (Danish kroner), that's $1,283. We get screwed here in Scandinavia :laugh:
You mean we cry in Scandinavian...?
Posted on Reply
#191
dartuil
These cards have to be in the 400-700€ range or I'm out.
Posted on Reply
#192
Zach_01
dartuil: These cards have to be in the 400-700€ range or I'm out.
I think 700€ would be too much and wouldn't stand a chance of gaining market share.
600€ including taxes would be sensible.
AIB 5070s will likely start at 750-800€, considering that the FE will cost ~660€ and you can't get those outside of the US.
Posted on Reply
#193
x4it3n
For the sake of competition, let's hope it will sell for $500 or less and be very similar to a 4080 SUPER performance-wise!
Posted on Reply
#194
Zach_01
x4it3n: For the sake of competition, let's hope it will sell for $500 or less and be very similar to a 4080 SUPER performance-wise!
That's instantly 600€ in most parts of Europe, including taxes/import fees.
Posted on Reply
#195
DaemonForce
And I know it wouldn't stand a chance in the US above $600.
Why would anyone buy a 9070 series at $600+ if the previous gen line it replaces (7800XT/7900GRE) falls sub-500?
Ignore the fact the GRE has been out of stock for a month and the cheaper 7800XT stock is already full of cheap refurbs.
Or how about something more enthusiast and insane like the 7900XT falling sub-650? It'll fall some more once this is out.
I focus on the 7800/7900 XT units because they are good but they're also the out that we're given if the launch gets any weirder.

I'm not jumping team green to pick up their lowest tier overpriced AI scambait either.
The 5070 is a catastrophic loser for anyone already on a card that's semi-modern, worse if it's a 4070.
The idea of paying $500-700+ for 12GB once that junk drops and gets scalped to hell is enough to piss anyone off.
I already know the behavior is not going to be desirable for my hardware ecosystem the moment I leave Windows anyway.
People need to stop buying expensive headaches.
Posted on Reply
#196
JustBenching
DaemonForce: And I know it wouldn't stand a chance in the US above $600.
Why would anyone buy a 9070 series at $600+ if the previous gen line it replaces (7800XT/7900GRE) falls sub-500?
Power draw (probably, assuming the 9070 is better at that), FSR4, and the increased RT performance.
Posted on Reply
#197
Zach_01
DaemonForce: And I know it wouldn't stand a chance in the US above $600.
Why would anyone buy a 9070 series at $600+ if the previous gen line it replaces (7800XT/7900GRE) falls sub-500?
Ignore the fact the GRE has been out of stock for a month and the cheaper 7800XT stock is already full of cheap refurbs.
Or how about something more enthusiast and insane like the 7900XT falling sub-650? It'll fall some more once this is out.
I focus on the 7800/7900 XT units because they are good but they're also the out that we're given if the launch gets any weirder.

I'm not jumping team green to pick up their lowest tier overpriced AI scambait either.
The 5070 is a catastrophic loser for anyone already on a card that's semi-modern, worse if it's a 4070.
The idea of paying $500-700+ for 12GB once that junk drops and gets scalped to hell is enough to piss anyone off.
I already know the behavior is not going to be desirable for my hardware ecosystem the moment I leave Windows anyway.
People need to stop buying expensive headaches.
Here, prices are moving very slowly. 7900XTs start from 710€, with some variants still at 800~900€.
XTXs go from 950€ up to 1,100€.
GREs are some leftovers with absurd prices for whatever reason... 650~800€.
7800XTs start from 500€ and go up to... 800€ (those must be hunting for suckers...).

A 9070XT at ~600€ with +25~30% raster over the 7800XT, but around double the RT performance plus FSR4, could move something, especially for users on RX 5000/6000 cards with 8/12GB VRAM.
JustBenching: Power draw (probably, assuming the 9070 is better at that), FSR4, and the increased RT performance.
Probably yes to FSR4/RT, but for power I'm not very confident.
Both the 7800XT and 7900GRE are around 260W, and the 9070XT could end up at 300+W for reference and low-power AIB cards, with some (OCed) models even at 400+W, with gaming boost (bursting) up to 3.5+GHz.
I'm saying 3.5GHz with some confidence because XTXs already burst up to 3.2-3.3GHz. I have mine capped at 3.0GHz.
Posted on Reply
#198
3valatzy
DaemonForce: I'm not jumping team green to pick up their lowest tier overpriced AI scambait either.
The 5070 is a catastrophic loser for anyone already on a card that's semi-modern, worse if it's a 4070.
The idea of paying $500-700+ for 12GB once that junk drops and gets scalped to hell is enough to piss anyone off.
It is a crime, given how cheap the memory modules in fact are.
1 GB costs only $2.30; 8 GB costs about $18. :banghead: :kookoo:
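As a rough illustration, the sketch below simply extends the ~$2.30/GB figure quoted above to other capacities; these are back-of-the-envelope estimates, not confirmed bill-of-materials prices.

```python
# Back-of-the-envelope VRAM cost using the ~$2.30 per GB figure quoted above.
# Illustrative estimates only; not confirmed bill-of-materials prices.

PRICE_PER_GB_USD = 2.30

for capacity_gb in (8, 12, 16):
    cost = capacity_gb * PRICE_PER_GB_USD
    print(f"{capacity_gb} GB -> ~${cost:.2f}")

# 8 GB  -> ~$18.40 (matches the ~$18 cited)
# 12 GB -> ~$27.60
# 16 GB -> ~$36.80, i.e. only about $9 more than a 12 GB configuration
```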

Posted on Reply
#199
AusWolf
JustBenching: Power draw (probably, assuming the 9070 is better at that), FSR4, and the increased RT performance.
And the previous gen being replaced and unavailable.
Posted on Reply
#200
Quicks
I am leaning more towards the 5070 Ti 16GB being the winner here: probably 50W less power and at least 10% faster than the 9070XT.

So... going to wait and see how it plays out.

AMD has a good opportunity to make it happen, but we all know AMD has this skill to F it up!

My last 4 upgrades have been AMD; I think it's time to switch to the green side...
Posted on Reply