
RX 9070 availability

Joined
Mar 23, 2016
Messages
4,895 (1.50/day)
Processor Core i7-13700
Motherboard MSI Z790 Gaming Plus WiFi
Cooling Cooler Master RGB Tower cooler
Memory Crucial Pro 5600, Corsair Dominator RGB 6000
Video Card(s) XFX Speedster SWFT309 AMD Radeon RX 6700 XT CORE Gaming
Storage 970 EVO NVMe M.2 500GB, WD850N 2TB
Display(s) Samsung 28” 4K monitor
Case Corsair iCUE 4000D RGB AIRFLOW
Audio Device(s) EVGA NU Audio, Edifier Bookshelf Speakers R1280
Power Supply TT TOUGHPOWER GF A3 Gold 1050W
Mouse Logitech G502 Hero
Keyboard Logitech G G413 Silver
Software Windows 11 Professional v24H2
:/

Did you take a screenshot of the card in your cart when it was at that price? I was wondering if they might honour it, like EVGA used to in the old country according to that pcgamer article I linked somewhere...

Surely we can dream, but better to dream better dreams and build better systems.
This was back on the first day the cards went up for sale. The video card tracker still showed the price from that day, and out of curiosity I clicked to look.
 
Joined
Jul 26, 2024
Messages
545 (2.40/day)
Still €910 for the cheapest Steel Legend (2x 8-pin) and €940 for the Taichi (12-pin, the one I want). I wonder how long it will take for the 9070 XT to drop under €800. I initially wanted the Nitro, but it's €1,100.
 
Joined
May 13, 2008
Messages
1,055 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Aren't AMD supposedly providing retailers with rebates to hit MSRP?!
So they say! I really feel like so much of this is because AIBs wanted to seize the opportunity to make margins however they can.

This is not a typical launch, because so much of it hinges on AMD finally catching up on RT/FSR, and on customers wanting that (and being accustomed to paying a large premium for it).
AIBs want a piece of that pie from those who are still willing to pay a premium for those features, but they'll eventually take the business of those who won't.

I don't think AMD's target market has changed.
 
Joined
Jul 13, 2016
Messages
3,541 (1.12/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
IMHO bending the cable is not the issue here, but depending on the plastic quality, on a 9070 XT with above 300 W power draw you might see that plastic becoming brittle. That power adapter will sit in constant heat, probably above 50 °C, for hundreds of hours.

That's for sure another potential problem. Normally I'd say that they'd use higher quality plastic that's able to withstand the constant heat / thermal cycles but this standard has proven to throw common sense out the window in many regards.

"30 mating cycles" we can exclude that. Because how many times you'll take out your GPU out of the case if your are not working for GN :D

Including the first install, if you clean your system monthly that's about 2 and 1/2 years. That's not accounting for taking things out for upgrades or reseating either. Most motherboards have the GPU over 1-2 M.2 slots.
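
To put rough numbers on that (a quick sketch, assuming the commonly cited 30-cycle rating and exactly one mating cycle per monthly cleaning):

```python
# Back-of-the-envelope lifetime for a 30-cycle-rated connector,
# assuming one mating cycle per monthly cleaning (illustrative numbers only).
RATED_CYCLES = 30        # commonly cited rating for 12VHPWR-style connectors
FIRST_INSTALL = 1        # the initial installation already uses one cycle
CLEANINGS_PER_YEAR = 12  # unplugging the GPU once a month

remaining = RATED_CYCLES - FIRST_INSTALL
years = remaining / CLEANINGS_PER_YEAR
print(f"{remaining} cycles left after install -> ~{years:.1f} years of monthly cleaning")
# 29 cycles left after install -> ~2.4 years of monthly cleaning
```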

That is not an unreasonable scenario, in fact it's rather favorable and assumes nothing else (like a different issue with the PC) might require you to re-mate the connector.

The connector also has issues with pin current changing on reseat (among others).

I've only seen Buildzoid's video on the 5090, and from your description (thank you) it seems the Nitro is similar on current balancing, and maybe also short on shunt resistors. The shunt resistor setup on the 5090 was definitely a sad joke.
However, don't forget that the power draw difference between the two cards is substantial, and the shitty power-delivery design might last way longer on the Nitro+.

This is true; it's much less likely to be an issue on the Nitro. My only concern is that we have yet to see the long-term durability of this connector, even at around the 300-380 W mark.

I think the next couple of years will really bear that out.

EDIT: I just looked at Buildzoid's video about the Sapphire PCB, and we have a very different situation. The Sapphire circuit has 2 fuses and 2x 12 V power rails for its 340 W TDP, compared with Nvidia's one fuse and one 12 V power rail for 575 W. That, my friend, is IMHO a huge difference. I might be wrong, because I left out amps per wire and the ATX version of the PSU. So we have two more very important factors to consider:
how many amps will go through a single wire, which depends on the power stages and the ATX revision.
I believe der8auer used an ATX 2.4 PSU for his 5090 test; we don't know what an ATX 3.0 or 3.1 unit will deliver through the 12V connector, and I don't know how many burned 5090s had a 3.0 or 3.1 PSU behind them. I'm not assuming that newer ATX revisions guarantee a more reliable 12 V design, but maybe that's the case.

Some 5090s have 2 fuses. They aren't on separate circuits, though, in either Sapphire's or the AIB 5090 designs, so the fuse count really isn't the important part: the cards are incapable of load balancing. The 12 V input is treated as one blob across all pins.
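
To put rough numbers on why the rail split and load balancing matter more than the fuse count, here's a quick sketch using the wattages from this exchange; the six-wire count and the even-split assumption are simplifications, not measurements of any real card:

```python
# Naive per-wire current on a 12V connector with six 12 V wires (illustrative only).
def per_wire_amps(watts: float, volts: float = 12.0, wires: int = 6) -> float:
    """Current per 12 V wire if the load splits perfectly evenly."""
    return watts / volts / wires

print(f"575 W (5090-class): {per_wire_amps(575):.1f} A per wire, ideal split")
print(f"340 W (Nitro-class): {per_wire_amps(340):.1f} A per wire, ideal split")
# Without load balancing the split is not guaranteed; if one wire ended up
# carrying half of a 575 W load it would see roughly:
print(f"Worst-case example: {575 / 12 / 2:.1f} A on a single wire")
```

The gap between the ideal ~8 A and a badly imbalanced 20+ A per wire is the whole argument for monitoring the rails, regardless of how many fuses are on the board.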

As I remember, one of the advertised perks of ATX 3.0 was that the GPU would talk to the PSU about its power needs. However, this leaves a very bitter taste, as many users don't want to change their PSU just for the sake of ATX 3.0 if their old ATX 2.0 or 2.4 unit still delivers solid voltages to the system. Why would they?
With more condensed PCBs we'll see even more 12V connector implementations. Sapphire doesn't really have an excuse here for the 12V connector; that being said, 2x 8-pin or 3x 8-pin takes more space on the PCB than one 12V connector.

I don't agree with 12V connector implementations on any GPU, but then we have to learn and discuss how to deal with them. Or we have to get enough signatures to ban that 12V connector from designs.
Let's make a poll.

Buildzoid also talked about this (and Hardware Busters has a video explaining how the PSU determines how much power a GPU can pull); it's not nearly as smart as you think. In fact, it's not smart at all.

The GPU isn't reporting any kind of detailed power consumption information to the PSU. All it's doing is detecting whether the sense pins are open or grounded:

There's no communication between the PSU and the GPU. It's determined by the cable.
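
For reference, this is the entire "negotiation": the cable grounds or leaves open two sideband pins (SENSE0/SENSE1), and the card reads its initial power budget from that. Here's a small sketch of the sideband table as commonly published for 12VHPWR / ATX 3.0 (treat it as an illustration of the mechanism rather than a spec quote):

```python
# 12VHPWR sideband decoding: the card only checks whether SENSE0/SENSE1
# are grounded or open; the cable hard-wires that, so there is no dialogue.
SENSE_TABLE = {
    # (SENSE0, SENSE1): initial permitted power in watts
    ("gnd", "gnd"):   600,
    ("gnd", "open"):  450,
    ("open", "gnd"):  300,
    ("open", "open"): 150,
}

def initial_power_limit(sense0: str, sense1: str) -> int:
    """Power the card assumes it may draw at power-up for a given cable."""
    return SENSE_TABLE[(sense0, sense1)]

print(initial_power_limit("gnd", "gnd"))    # 600 W cable
print(initial_power_limit("open", "open"))  # 150 W fallback (e.g. unwired sideband)
```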
 
Joined
Mar 2, 2011
Messages
108 (0.02/day)
Just hear me out. This goes for everyone who cares about saving a little money and/or can wait a short amount of time.

Wait until the end of June, when companies have to hit their sales targets for the quarter. If that doesn't work, there is always Prime Day to buy a card. They are launching cards right now for a reason, I think.

Likely because just after that, once they have captured peak margin from everyone thinking they got a great deal, they will likely launch a high-end card, and that will relegate these cards to those lower prices as the norm.

If I am wrong, you can ridicule me to the ends of the earth. You don't need to buy from the river company, as on that date pretty much everyone competes.

When I am right, you can thank me for giving away all my secrets. :p

--------------------------------------

As for power draw, that was a long-standing issue. I'm glad they fixed it. I can understand your inclination towards a 9070, but I truly believe it is not a card that will make people happy long-term.
I would honestly just wait a short amount of time; the 9070 XT price WILL decrease to the 9070's price or lower, and you will have a much better card with similar idle power consumption.
You can always undervolt or decrease the power limit, and then the juice is there for the times you want/need it. IMHO the 9070 is just too limited in its capability.
I understand people think RT is an option, a toggle. And right now it is. But it will not always be this way; if you're coming from a 1080 Ti, it will absolutely happen within the lifetime of this purchase.
The 9070 XT should/will allow a 1080p or 1080p-upscaled experience to be good in those games. A 9070 will not; it is purposely limited to stay just shy of that level.
I understand the concern to save power, and perhaps you should check out this video (I would shoot for 20k in that bench as the target to hit).
It might be something to consider. Of course, you do you, and I respect your choice; I'm just trying to help.
I just want people to have the best they can afford, and I think the 9070 XT is that card for many people, and anyone considering the 9070 will fairly soon be able to get the better one for its current price or lower.

Until then,

!RemindmePrimeDay=thisthread.

Thanks for the input, I know you're trying to help.
Amazon UK sent me two ASRock high-end motherboards with bent socket pins and fingerprints, one after the other, sold as NEW. I don't trust them to have good hardware; it's probably just low-grade stock and stuff that comes back from test benches and overclocking.
Maybe the US store is better, but thanks for the tip.

Regarding the 9070 XT, I was already thinking about that, but thanks anyway. On lowering the power: I'm not sure that lowering power just through Adrenalin won't give stutters. The guy in the video only benchmarked Superposition for 1-2 minutes; IMO he really needs to loop the Cinematic preset for at least half an hour at 8K Optimized to test the stability of a 70% power clamp, then take 5 games and play another 30 minutes in each for the same reason. I played with the power sliders for Nvidia GPUs in MSI Afterburner, but it was rarely smooth. I'm the guy who likes consistency in FPS, not high FPS counts.

Lowering power, for me, will only be done in Adrenalin. ASUS GPU Tweak, Sapphire TriXX and the like are just nice gateways for exploits and a constant source of MITRE vulnerability entries, not to mention that to minimize their impact you need to delete some DLL files. I've worked more with the latter. My advice to you is to limit yourself to Adrenalin and watch for any overlays misbehaving.

I was thinking of the 9070 because I believe it's plenty for my 1440p, and I'm not planning for 4K at all. VRAM bandwidth is 624 GB/s, and both cards have 64 MB of Infinity Cache, which as far as I know is faster than Nvidia's L2 cache, though I'm not completely convinced; I might be wrong. While the VRAM will hold up very well over time, the GPU will show signs of struggle earlier; I'm aware of that.

It's true I was planning for the end of May, but I can hold out till June. I purchased my 1080 Ti second-hand on eBay for a high price in September 2021. Since then I've spent some more on 4x 92 mm fans (for deshrouding) and thermal putty. It cools very well at 2K now, but it wasn't cheap; I think it's around a £500 card including the deshrouding. Stock fans are always too thin to push a high volume of air without high dB; yes, there are some exceptions, but I see them very rarely.

In some games I see VRAM at about GPU temperature minus 10 °C (the norm is GPU minus 6-7 °C) on the 1080 Ti with this kind of RPM, plus a rear exhaust case fan at 1300 RPM, 120 mm, no grill.
The two fans (both in pull) on the backplate further cool the VRAM and GPU, but also deal with the hot air trapped on the side, as my Lian Li O11 Dynamic XL's stupid design leaves only 3 cm between the glass panel and the GPU. High-end case my ***
Think about it: the 9000 cards are wider than my 1080 Ti :nutkick: The Asus TUF 9070 is wider, 14 cm vs 12.5 cm, so I guess I'll have to modify my case again, another cost; I should really ask Lian Li for a partial refund at this point. der8auer got his name on my case, I'll ask if he can help me with that :roll:

[Attachment: Fans GPU RPM.jpg]


I always focus on cooling the VRAM first rather than the GPU, but with the designs I've already seen for the 9070 and 9070 XT and their hot-running VRAM, my methods might change a bit.
I see two flaws:
1. improper contact between the already-too-thin VRAM cold plate and the heatsink
2. high PCB density causing heat through proximity

Comparison of the 4070 Super and the 9070, both Asus and both 220 W TDP.

[Attachment: Asus TUF 9070 and 4070 Super.jpg]


You can see the higher-density PCB of the 9070 (not even the XT), and also one reason they are pushing 12V connectors: they take up less room on the PCB.

We're gonna have to deshroud and find solutions to cool the VRAM better, because as we already know, they usually do the bare minimum for VRAM cooling.
I'm considering the 9070 XT for two things: a higher-quality chip, and the memory junction sensor, which might be missing on some 9070s.

I'm not gonna ridicule you if I don't like the 9070 XT; I'm responsible for my decisions, and I'll just return it.

Hope you find the info useful

Including the first install, if you clean your system monthly that's about 2 and 1/2 years. That's not accounting for taking things out for upgrades or reseating either. Most motherboards have the GPU over 1-2 M.2 slots.

That is not an unreasonable scenario, in fact it's rather favorable and assumes nothing else (like a different issue with the PC) might require you to re-mate the connector.

The connector also has issues with pin current changing on reseat (among others).
I'm not pulling my card out for cleaning even once a year. Cleaning is done with the card in place: hold the fans with tape or chopsticks and blow the dust out with a powerful blower, assisted by the hoover hose to pull some of the dust away. If you see fluff between the rad fins, it means your filters are not good enough. A good blower does a lot.
 
Joined
Jan 14, 2019
Messages
14,890 (6.63/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Was Overclockers the only retailer to not increase their prices during the frenzy?

Competition seemed to work in the opposite direction.
No they weren't. I got my XT from Scan for the exact same £570. Overclockers were the only one that made news about it.
 
Joined
May 13, 2008
Messages
1,055 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Regarding the 9070 XT ... I'm not sure that lowering power just through Adrenalin won't give stutters. [...] I'm the guy who likes consistency in FPS, not high FPS counts.

I understand stress-testing for stability, but looking at how the Pulse performs, I imagine they're all *mostly* capable of ~3200 MHz raster and ~3050 MHz RT with a decent undervolt (85-90%?).
AFAICT (unless I'm missing something), this should be pretty much perfect for a lot of scenarios at only ~270 W load or so. I'm sure people will be stress-testing them over the next couple of days here, on Reddit, etc.; I'll keep an eye out. I do think that *approximate* area will be perfect for a lot of people, though. Even looking at Ratchet & Clank at 4K, the Pulse (the weakest card W1z tested) should be able to keep 60 at max power.

This is why I think it's kind of a jack of all trades. People could run it at the ~270 W level (I don't know what it will be exactly, but close to there) and keep most everything smooth, and that little bit more performance (for a lot more power) is there if you really need it to get over the hump in some scenarios. The 9070 can't hit that lower threshold, and will certainly struggle with the other.
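
For what it's worth, that ~270 W figure lines up with simple power-limit math, assuming the 85-90% refers to the power slider and using the 304 W reference TBP quoted later in the thread (the actual board power of a given AIB card will differ):

```python
# Hypothetical power-limit arithmetic for a 9070 XT with a 304 W reference TBP.
REFERENCE_TBP_W = 304
for power_limit in (0.90, 0.85):
    print(f"{power_limit:.0%} power limit -> ~{REFERENCE_TBP_W * power_limit:.0f} W")
# 90% -> ~274 W, 85% -> ~258 W: right around the ~270 W ballpark discussed above
```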

I don't think there's any problem with adjusting the power limit or voltage in Adrenalin, but that's just IMHO. That's all I've used on AMD cards for a while.

You're talking to the "W1zard, for the love of God use minimums not averages" guy, so you don't have to tell me about caring more about consistency/stability. Believe me. We are 100% in agreement.

As for 1440p, like I say, I'm a cautious guy. I look at stuff like Wukong and Spider-Man 2 (which would have similar perf), because I know we're going there (RT at default settings, like Wukong) at some point, even for games like Spider-Man. And the 9070 XT will keep 60 (at what I'm describing above, I think). The 9070 isn't capable, and it's very obvious they limited the card for this exact reason (and nVIDIA did the same thing with the 5070).
You can lower settings, but what happens when the next step up becomes the recommended spec? Then the 9070 is in tough shape.

You can pretty much upscale 960p -> 1440p at similar (slightly better) performance to 1080p native, and that's my point about preparing for that future.
The 9070 is going to struggle even there to keep 60, overclocked to the balls, in scenarios like that, whereas a 9070 XT should do it undervolted to the ankles. That's kinda what I'm saying. That's the split, right at that spot, and it will become increasingly important. Most people don't understand that.
I really don't expect them to, as this crap is just barely starting to get standardized (as those two games are; they perform pretty much the same and are limited by RT). I just don't want people caught off guard.

I appreciate all the information on your sitch (and observations; I too would probably find a better way to cool the ram). I always think that stuff is interesting.

As I say, I'm not selling anything, buy what you want, but I just really don't think people are looking at this thing from a 'big picture' point of view.
If you want the power savings, undervolt; it'll still be good enough for dang near everything. If you need the perf (in fringe cases), OC. But the 9070 can't OC enough for many increasingly common instances.

When you factor in the eventual pricing (which, again, I think the inevitable price drop of the XT to the price of the vanilla is literally the purpose of the vanilla's current price: to entice people with that 'new' bargain), the 9070 looks real, real bad. I'm not saying it's a bad card (if it's cheaper), but the split, even though it *looks* small, truly isn't. Most people just don't notice those instances yet.

No they weren't. I got my XT from Scan for the exact same £570. Overclockers were the only one that made news about it.
Grats! I might need you to be my guinea pig on a couple of things if you ever have time and are willing. :p

I'm kinda curious about my theoretical experiment (building on that undervolt vid) and whether that's 'enough' for a lot of the games you play; I think it might be helpful/useful to some people.

But don't let me distract you from actually having fun. :)
 
Joined
Jan 14, 2019
Messages
14,890 (6.63/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
I understand stress-testing for stability, but looking at how the Pulse performs, I imagine they're all *mostly* capable of ~3200 MHz raster and ~3050 MHz RT with a decent undervolt (85-90%?). [...] When you factor in the eventual pricing, the 9070 looks real, real bad. I'm not saying it's a bad card (if it's cheaper), but the split, even though it *looks* small, truly isn't.
I don't think the difference between the XT and non-XT is that great, to be honest. Considering that you'd usually need a good 30-50% uplift to notice with the naked eye without an FPS counter...
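
A quick illustration of why a ~10% gap is hard to see without a counter (the 60 fps baseline is just an example):

```python
# Frame-time difference for various uplifts over a 60 fps baseline (illustrative).
BASE_FPS = 60
for uplift in (0.10, 0.30, 0.50):
    fps = BASE_FPS * (1 + uplift)
    saved_ms = 1000 / BASE_FPS - 1000 / fps
    print(f"+{uplift:.0%}: {fps:.0f} fps, each frame arrives {saved_ms:.1f} ms sooner")
# +10%: 66 fps / ~1.5 ms; +30%: 78 fps / ~3.8 ms; +50%: 90 fps / ~5.6 ms
```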

Grats! I might need you to be my guinea pig on a couple of things if you ever have time and are willing. :p
Sure. :) I'll just have to find a Linux distro that has the latest Mesa driver first (Bazzite only has 24.3 which doesn't fully support RDNA 4). :(
 
Joined
May 13, 2008
Messages
1,055 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
I don't think the difference between the XT and non-XT is that great, to be honest. Considering that you'd usually need a good 30-50% uplift to notice with the naked eye without an FPS counter...


Sure. :) I'll just have to find a Linux distro that has the latest Mesa driver first (Bazzite only has 24.3 which doesn't fully support RDNA 4). :(
Don't worry about it, brother. Have fun. And good luck.

(I know people have VRR monitors, but I use 60 as a buffer, just like high settings. Each of those is a variable that becomes a concession, and people need to have standards. They don't have to be mine, but mine are very much the experience I feel most people want to have. I try to limit the concessions I accept when evaluating something, as they can then be used later to increase longevity even more. I don't think people should *start* from there if they don't have to. If I condoned that, I would be condoning developers using them as crutches, or giving AMD/nVIDIA an excuse to sell crappier GPUs.)
 
Joined
Feb 7, 2008
Messages
712 (0.11/day)
Location
Barcelona, Catalonia, Spain
System Name Woody
Processor AMD Ryzen 7 9700X
Motherboard Gigabyte B650M Aorus Elite AX ICE
Cooling MSI MAG A13 CoreLiquid 360
Memory 32GB (2x16) Corsair Vengeance 6000MHz (CL30)
Video Card(s) Sapphire Pulse RX 9070
Storage WD_BLACK SN850x (2x1TB) + Sandisk Ultra 2TB
Display(s) LG 34WN80C (3440x1440 @ 70 Hz)
Case Lian Li A3 (Black-Wood)
Audio Device(s) Logitech Pro X & Scarlett 2i4 w/M-AUDIO BX5-D2
Power Supply Corsair RM750 (ver. 2019)
Mouse Logitech MX Master 3
Keyboard Keychron Q1 Pro (Akko Cream Blue Pro V3 switches)
Software Windows 10 Pro x64
I understand stress-testing for stability, but looking at how the Pulse performs, I imagine they're all *mostly* capable of ~3200 MHz raster and ~3050 MHz RT with a decent undervolt (85-90%?).
Well, on my 9070 non-XT Pulse at pure stock settings so far, across several synthetic 3DMark tests (Time Spy / Solar Bay / Steel Nomad Light / Steel Nomad / Port Royal / Speed Way) and the SOTR / Forza Horizon 5 benchmarks, the max GPU clock topped out at 2831 MHz with a max power draw of around 245 W.

And this is the most basic non-XT version at stock, without any fine-tuning of voltage/clocks/power draw yet. Coming from a Sapphire Pulse 7800 XT, also the most basic version, where undervolting helped quite a lot with gaining performance and upping clocks, I would expect to see roughly the same gains on both versions. And most importantly because of this ...
I don't think the difference between the XT and non-XT is that great, to be honest. Considering that you'd usually need a good 30-50% uplift to notice with the naked eye without an FPS counter...
... so to continue: I know synthetic benchmarks don't mean a lot, but we both ran the same Heaven / Superposition tests, even on different OSes with different drivers, and the margin was magically around 10%, the same difference as the price, as GN also stated in his video, more or less.

Maybe I got your whole post wrong, but why would a good 9070 not be able to squeeze out that extra 5-10% with decent tuning, yet struggle so much vs an undervolted 9070 XT? I really want to understand; I'm not being ironic.

Edit: I'm well aware of the cut-down hardware units in some areas, but purely performance-wise, do you think the 9070 is still DOA? Because for me it's a great product. Maybe the price could be a little lower, but somehow I don't feel ripped off having paid MSRP for it, given the current madness.
 
Joined
Mar 2, 2011
Messages
108 (0.02/day)
Maybe I got your whole post wrong, but why would a good 9070 not be able to squeeze out that extra 5-10% with decent tuning, yet struggle so much vs an undervolted 9070 XT? I really want to understand; I'm not being ironic. [...]
Let me try to explain.
You seem to be under the impression that all chips undervolt well and reach certain clocks?

It might simply be a lower-quality chip; it's a lottery when you buy a GPU. If you and I go to a store and buy the same Asus TUF 9070 OC, right? Two cards from the same store, on the same day, so we assume the same batch and revision; those two cards will still differ in overclocking and in undervolting. Chips do not come out of the foundries equal. That's all.

My explanation is maybe a poor summary of what is called the silicon lottery.

If you are not happy, just return it. No regret, no shame; later on, get another one, which may do better or even worse than the one you have.
 
Joined
Oct 15, 2011
Messages
2,605 (0.53/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sparkle Titan Arc A770 16 GB
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
Prices are very high, at around RX 7900 XTX level. There are no cards available in stock. Any orders made will be fulfilled from April onwards. The situation is exactly the same as it is with Nvidia GPUs.
I fear it could turn catastrophic, like the Ethereum-PoW-bull-run-era!
 
Joined
Nov 15, 2024
Messages
275 (2.39/day)
No they weren't. I got my XT from Scan for the exact same £570. Overclockers were the only one that made news about it.
I thought I read somewhere that Scan went offline and when they came back the prices had gone up?! I did notice that Ebuyer and Scan didn't seem to have the same problems as Overclockers either regarding instability. Was interesting watching the stock level drop in real time on Ebuyer, if you consider that interesting.

I fear it could turn catastrophic, like the Ethereum-PoW-bull-run-era!
It will certainly be interesting to see quality control over time (especially during this time), as Rusty Caterpillar mentioned.
 
Joined
Jan 14, 2019
Messages
14,890 (6.63/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Maybe I got your whole post wrong, but why would a good 9070 not be able to squeeze out that extra 5-10% with decent tuning, yet struggle so much vs an undervolted 9070 XT? [...]
The question isn't whether you can make up the difference between the non-XT and XT with some OC. The question is, are you gonna notice the difference? If it's within 10%, then I don't think so. :)

By the way, I managed to start my games under Nobara thanks to its Mesa 25.1 beta implementation. The card is still recognised as AMD Device 7550, but oh well. :D
I'm sure there's lots of improvement coming with later, non-beta Mesa versions. :)
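
For anyone else wanting to check which Mesa/RADV build their distro is actually handing the card, something along these lines works (a rough sketch assuming the glxinfo and vulkaninfo tools are installed; exact output strings vary by distro and driver version):

```python
# Quick Linux check of the Mesa / RADV build in use.
# Needs the mesa-utils (glxinfo) and vulkan-tools (vulkaninfo) packages.
import subprocess

def run(cmd: list[str]) -> str:
    return subprocess.run(cmd, capture_output=True, text=True).stdout

# OpenGL side: the renderer/version strings include the Mesa version.
for line in run(["glxinfo", "-B"]).splitlines():
    if "OpenGL renderer" in line or "OpenGL version" in line:
        print(line.strip())

# Vulkan side: RADV reports the device name (or a raw ID such as
# "AMD Device 7550" when the Mesa build predates the card) and driver info.
for line in run(["vulkaninfo", "--summary"]).splitlines():
    if "deviceName" in line or "driverInfo" in line:
        print(line.strip())
```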

I thought I read somewhere that Scan went offline and when they came back the prices had gone up?! I did notice that Ebuyer and Scan didn't seem to have the same problems as Overclockers either regarding instability. Was interesting watching the stock level drop in real time on Ebuyer, if you consider that interesting.
Overclockers got bombarded with requests and went offline right at 2 PM. That's why I switched to Scan, which stayed online for a few more minutes, long enough for me to grab my card. :)
Scan usually sends two emails for your order: one for the order confirmation, and one for the payment confirmation. The latter came a few hours later this time, when their site came back up.

Scan doesn't usually list a price for products that are unavailable with no ETA. There's also no pre-orders for the 9070 XT currently, just a "notify me when available" button to sign up for an email alert.
 
Joined
May 14, 2024
Messages
15 (0.05/day)
Processor Ryzen 7 5700X3D
Motherboard ASUS TUF A520M-PLUS II
Cooling Thermalright Assassin King ARGB 120mm
Memory ADATA XPG Spectrix D35G 32GB DDR4 3600MHz CL18 (2 x 16GB)
Video Card(s) Sapphire Pulse Radeon RX 9070
Storage 2TB (500GB SK Hynix P41 Platinum - 500GB Samsung EVO 870 - 500GB Intel 660p - 500GB ADATA SU630)
Display(s) Dell G2724D - 1440p/165Hz
Case Kolink Observatory HF Mesh ARGB
Mouse Cooler Master MM731
Keyboard Akko 5075B Plus | V3 Cream Blue Pro Switches
Software Windows 11 Pro
I snagged a Sapphire Pulse 9070 at MSRP from OCUK on launch day, late afternoon.

The two MSRP 9070s (Reaper and Pulse) went from £525 to £539 after the first few batches sold (I believe they were putting them up in batches of ~100 or so). Not too bad really, but the XT variants of the same Reaper/Pulse models went from their MSRP of £569 to a whopping £629.

Mine was delivered today and gotta say, I'm impressed so far. Very nice upgrade from my 6750 XT. I was trying to find a 7900 GRE under £550 for the past few months and never saw one that cheap so I'm glad I waited.

I recall someone from OCUK saying they're expecting more stock to arrive next week.

If the prices stay the same after all the "MSRP stock" has sold, then I would say the 9070 starting at £539 is better value than the 9070 XT starting at £629.

£90 extra for an ~11% performance increase. Not to mention the efficiency: the 9070 gets within spitting distance of the XT whilst using only 220 W vs 304 W, which is definitely something to take into consideration with UK power prices.
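
Putting those numbers side by side (prices and the ~11% figure are the ones from this post; performance per pound is obviously just a rough proxy for value):

```python
# Rough value comparison using the UK street prices quoted above.
cards = {"RX 9070": (539, 1.00), "RX 9070 XT": (629, 1.11)}  # (price in GBP, relative perf)
for name, (price, perf) in cards.items():
    print(f"{name}: £{price}, {perf / price * 1000:.2f} perf per £1000")
print(f"XT premium: {629 / 539 - 1:.0%} more money for ~11% more performance")
```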
 