
RX 9070 availability

Joined
Mar 23, 2016
Messages
4,895 (1.50/day)
Processor Core i7-13700
Motherboard MSI Z790 Gaming Plus WiFi
Cooling Cooler Master RGB Tower cooler
Memory Crucial Pro 5600, Corsair Dominator RGB 6000
Video Card(s) XFX Speedster SWFT309 AMD Radeon RX 6700 XT CORE Gaming
Storage 970 EVO NVMe M.2 500GB, WD850N 2TB
Display(s) Samsung 28” 4K monitor
Case Corsair iCUE 4000D RGB AIRFLOW
Audio Device(s) EVGA NU Audio, Edifier Bookshelf Speakers R1280
Power Supply TT TOUGHPOWER GF A3 Gold 1050W
Mouse Logitech G502 Hero
Keyboard Logitech G G413 Silver
Software Windows 11 Professional v24H2
:/

Did you take a screenshot of the card in your cart when it was at that price? I was wondering if they might honour it, like EVGA used to in the old country according to that pcgamer article I linked somewhere...

Surely we can dream better dreams and build better systems
This was back on the first day the cards went up for sale. The video card tracker still showed the price from that day, and out of curiosity I clicked to look.
 
Joined
Jul 26, 2024
Messages
546 (2.39/day)
Still €910 for the cheapest Steel Legend (2x 8-pin) and €940 for the Taichi (12-pin, the one I want). I wonder how long it will take for the 9070 XT to drop under €800. I initially wanted the Nitro, but it's €1,100.
 
Joined
May 13, 2008
Messages
1,058 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Aren't AMD supposedly providing retailers with rebates to hit MSRP?!
So they say! I really feel like so much of this is because AIBs wanted to seize this opportunity to make margins when/however they can.

This is not a typical launch, because so much of it is about AMD finally catching up on RT/FSR, and customers wanting that (and being accustomed to paying a large premium for it).
AIBs want a piece of that pie from those that are still willing to pay a premium for those features, but will eventually take those that will not.

I don't think AMD's target market changed.
 
Joined
Jul 13, 2016
Messages
3,541 (1.12/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
IMHO bending the cable is not the issue here, but depending on the plastic quality, on a 9070 XT with above 300 Watt power draw you might see that plastic becoming brittle. That power adapter will soak in constant heat, probably above 50C, for hundreds of hours.

That's for sure another potential problem. Normally I'd say that they'd use higher quality plastic that's able to withstand the constant heat / thermal cycles but this standard has proven to throw common sense out the window in many regards.

"30 mating cycles" we can exclude that. Because how many times you'll take out your GPU out of the case if your are not working for GN :D

Including the first install, if you clean your system monthly that's about 2 and 1/2 years. That's not accounting for taking things out for upgrades or reseating either. Most motherboards have the GPU over 1-2 M.2 slots.
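If you want the napkin math behind that "2 and 1/2 years" figure, here's a trivial sketch that just restates the monthly-cleaning scenario above, nothing more:

# Rough sketch: how long a 30-mating-cycle rating lasts with one reseat per
# month for cleaning, counting the first install against the budget.
RATED_CYCLES = 30
first_install = 1
reseats_per_year = 12  # assumed: monthly cleaning with the card pulled

years = (RATED_CYCLES - first_install) / reseats_per_year
print(f"~{years:.1f} years")  # ~2.4 years, i.e. roughly two and a half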

That is not an unreasonable scenario, in fact it's rather favorable and assumes nothing else (like a different issue with the PC) might require you to re-mate the connector.

The connector also has issues with pin current changing on reseat (among others).

I've only seen Buildzoid's video regarding the 5090, and from your description (thank you) it seems it's the same on the Nitro for current balancing, and maybe also not enough shunt resistors. The shunt resistors on the 5090 were definitely a sad joke.
However, don't forget that the power draw difference between the 2 cards is substantial, and the shitty power design might last way longer on the Nitro+.

This is true, it's much less likely to be an issue on the Nitro. My only concern is that we have yet to see the long term durability of this connector even at around the 300 - 380w mark.

I think the next couple of years will really bear that out.

EDIT: I just looked at Buildzoid's video about the Sapphire PCB; we have a very different situation. The Sapphire circuit has 2 fuses and 2x 12V power rails for that 340 W TDP, compared with Nvidia's one fuse and one 12V power rail for 575 W. That, my friend, IMHO is a huge difference. I might be wrong because I left out amps per wire and the ATX version of the PSU. So we have 2 factors to consider which are very important:
How many amps will go over a single wire, and that depends on the power design and ATX revisions.
I believe der8auer had an ATX 2.4 PSU for his 5090 test; we don't know what an ATX 3.0 or ATX 3.1 unit will deliver through the 12V connector. I don't know how many burned 5090s had a 3.0 or 3.1 PSU behind them. I'm not assuming that newer ATX revisions will ensure a more reliable 12V design, but maybe that's the case.

Some 5090s have 2 fuses. They aren't on separate circuits though, in both Sapphire's and the AIB 5090 designs, so the fuse count is really not important; they are incapable of load balancing. It's treated as one blob of 12V across all pins.

As I remember, one of the perks of ATX 3.0 was that the GPU would talk with the PSU about its power needs. However, this leaves a very bitter taste, as many users don't want to change their PSU just for the sake of ATX 3.0 if that old 2.0 or 2.4 PSU still delivers solid voltages to the system. Why would they?
With more condensed PCBs we'll see even more 12V connector implementations. Sapphire, however, doesn't have an excuse here for the 12V connector. That being said, 2x 8-pin or 3x 8-pin takes more space on the PCB than a 12V connector.

I don't personally agree with 12V connector implementations on any GPU, but then we have to learn and discuss how we can deal with it. Or we have to get enough signatures to ban that 12V connector from the design.
Let's make a poll.

Buildzoid also talked about this (and Hardware Busters has a video explaining how the PSU determines how much power a GPU can pull); it's not nearly as smart as you think. In fact it's not smart at all.

The GPU isn't reporting any kind of detailed power consumption information to the PSU. All it's doing is detecting whether a sense pin is open or ground:


There's no communication done between the PSU and the GPU. It's determined by the cable.
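To illustrate what that looks like (this is only my sketch; the SENSE0/SENSE1-to-wattage table is from memory of the ATX 3.x / PCIe CEM spec, so double-check it against the actual spec before relying on it):

# Illustration only: the 12VHPWR/12V-2x6 sideband isn't a protocol, just two
# sense pins the cable either ties to ground or leaves open. The GPU reads
# them at power-up and caps its draw accordingly; nothing is negotiated.
SENSE_TABLE = {
    # (SENSE0, SENSE1) -> max sustained cable power in watts (from memory)
    ("gnd", "gnd"): 600,
    ("gnd", "open"): 450,
    ("open", "gnd"): 300,
    ("open", "open"): 150,
}

def cable_limit(sense0: str, sense1: str) -> int:
    """Power limit the cable 'advertises' to the GPU."""
    return SENSE_TABLE[(sense0, sense1)]

# A native 600 W cable grounds both pins; adapters for weaker PSUs leave one
# or both open, and the card is supposed to respect the lower cap.
print(cable_limit("gnd", "gnd"))    # 600
print(cable_limit("open", "open"))  # 150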
 
Joined
Mar 2, 2011
Messages
110 (0.02/day)
Just hear me out. This goes for everyone that cares about saving a little money and/or can wait a small amount of time.

Wait until the end of June, when companies have to hit their sales target for the quarter. If that doesn't work, Prime Day is a very good day to buy a card. They are launching cards RN for a reason, I think.

Likely because just after that, when they have consumed peak margin and everyone thinking they got a great deal, they will likely launch a high-end card, and that will relegate these to those lower prices as the norm.

If I am wrong, you can ridicule me to the ends of the earth. You don't need to buy from the river company, as on that date pretty much everyone competes.

When I am right, you can thank me for giving away all my secrets. :p

--------------------------------------

As for power draw, that was a long-standing issue. I'm glad they fixed it. I can understand your inclination towards a 9070, but I truly believe it is not a card that will make people happy long-term.
I would honestly just wait a small amount of time, the 9070 xt price WILL decrease to the 9070 price or lower, and you will have a much better card with similar power consumption for idle usage.
You can always undervolt/decrease the power limit, and then the juice is there if there are times you want/need it. IMHO the 9070 is just too limited in its capability.
I understand people think RT is an option; a toggle. And right now it is. But it will not always be this way; if you're coming from a 1080Ti it will absolutely be in the lifetime of this purchase.
9070 XT should/will allow a 1080p or 1080p upscaled experience to be good in those games. A 9070 will not; it is purposely limited to stay just shy of that level.
I understand the concern to save power, and perhaps you should check out this video (I would shoot for 20k in that bench as the target to hit).
It might be something to consider. Ofc, you do you, and I respect your choice, just trying to help.
I just want people to have the best they can afford, and I think the 9070 xt is that card for many people, and anyone considering the 9070 will fairly soon be able to get the better one for its current price or lower.

Until then,

!RemindmePrimeDay=thisthread.

Thanks for the input, I know you're trying to help.
Amazon UK sent me 2x ASRock high-end motherboards with bent socket pins and fingerprints, one after the other, sold as NEW. I don't trust them to have good hardware, probably just low grades and stuff that comes from test benches and overclocking.
Maybe the US one is better, but thanks for the tip.

Regarding the 9070 XT, I was thinking about that anyway, but thanks. About lowering the power: I'm not sure that lowering power just using Adrenalin won't give stutters. That guy in the video only benchmarked in Superposition for 1-2 minutes; IMO he really needs a cinematic loop for at least half an hour at 8K optimized to test the stability of a 70% power clamp, then take 5 games and play another 30 min in each for the same reason. I played with power sliders for Nvidia GPUs in MSI Afterburner, but it was rarely smooth. I'm the guy that likes consistency in FPS, not high FPS counts.

Lowering power for me will only be done in Adrenalin. ASUS GPU Tweak, Sapphire Trixx and the like are just nice gates for exploits and constant vulnerabilities (see MITRE), not to mention that to minimize their impact you need to delete some DLL files. I worked more with the latter. My advice to you is to stick to Adrenalin and watch for any overlays misbehaving.

I was thinking of the 9070 because I believe it is plenty for my 1440p, and I'm not planning at all for 4K. VRAM bandwidth is 624 GB/s, and both cards have 64 MB of Infinity Cache, which as far as I know is faster than Nvidia's L2 cache, but I'm not completely convinced; I might be wrong. While the VRAM will hold up very well, in time the GPU will show earlier signs of struggle, I'm aware of that.

It's true I was planning for the end of May, but I can hold out till June. I purchased my 1080 Ti second hand on eBay for a high price in September 2021. Since then I paid some more for 4x 92 mm fans (for deshrouding) and thermal putty. It cools very well at 1440p now but wasn't cheap; I think it's around £500 for the card plus deshrouding. Stock fans are always too thin, pushing a high volume of air usually at high dB; yes, there are some exceptions, but I see them very rarely.

In some games I get VRAM at around GPU temp minus 10C (the norm is GPU temp minus 6-7C) on the 1080 Ti with this kind of RPM, rear exhaust case fan 1300 RPM / 120mm / no grill.
The 2 fans (both in pull) on the backplate further cool the VRAM and GPU, but they also deal with the hot air trapped on the side, as my Lian Li O11 Dynamic XL's stupid design has only 3cm between the glass panel and the GPU. High-end case my ***
Think about it: the 9000 cards are wider than my 1080 Ti :nutkick: The Asus TUF 9070 is wider, 14cm vs 12.5cm, so I guess I'll modify my case again, another cost. I should really ask Lian Li for a partial refund at this point. der8auer got his name on my case, I'll ask if he can help me with it :roll:

Fans GPU RPM.jpg


I always focus on cooling the VRAM first and not the GPU, but with the designs I've already seen for the 9070 and 9070 XT and their hot-running VRAM, my methods might change a bit.
I see 2 flaws:
1. improper linkage of the already too-thin cold plate for the VRAM with the heatsink
2. high density of the PCB causing heat by proximity

Comparison of the 4070 Super and the 9070, both Asus and both 220 W TDP.

Asus TUF 9070 and 4070 Super.jpg


You can see the higher-density PCB of the 9070, not even the XT, and also one reason they are pushing 12V connectors: they take less room on the PCB.

We're gonna have to deshroud and find solutions to cool the VRAM better, because as we already knew, they usually do the bare minimum for VRAM cooling.
I'm considering the 9070 XT for 2 things: a higher quality chip and a memory junction sensor, which might be missing on some 9070s.

I'm not gonna ridicule you if I don't like the 9070 XT; I'm responsible for my decisions, I'll just return it.

Hope you find the info useful

Including the first install, if you clean your system monthly that's about 2 and 1/2 years. That's not accounting for taking things out for upgrades or reseating either. Most motherboards have the GPU over 1-2 M.2 slots.

That is not an unreasonable scenario, in fact it's rather favorable and assumes nothing else (like a different issue with the PC) might require you to re-mate the connector.

The connector also has issues with pin current changing on reseat (among others).
I don't pull my card out for cleaning even once a year. Cleaning is done with the card in place: hold the fans with tape or chopsticks and blow the dust out with a powerful blower, assisted by the hoover hose to pull some of the dust away. If you see fluff in between rad fins, it means your filters are not good enough. A good blower does a lot.
 
Joined
Jan 14, 2019
Messages
14,925 (6.64/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Was Overclockers the only retailer to not increase their prices during the frenzy?

Competition seemed to work in the opposite direction.
No they weren't. I got my XT from Scan for the exact same £570. Overclockers were the only one that made news about it.
 
Joined
May 13, 2008
Messages
1,058 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX

I understand stress-testing stability, but looking at how the pulse performs, I imagine they're all *mostly* capable of ~3200mhz raster and ~3050mhz RT with a decent undervolt (85-90%?)
AFAICT (unless I'm missing something), this should be pretty perfect for a lot of scenarios at only ~270w load or so. I'm sure people will be stress-testing them over the next couple days here, reddit, etc. I'll keep an eye out. I do think that *approximate* area will be perfect for a lot of people though. Even looking at Ratchet & Clank at 4k, the Pulse (the weakest card W1z tested) should be able to keep 60 at max power.

This is why I think it's kind of a jack of all trades. People could run it at the ~270w level (I don't know what it will be exactly, but close to there) and be able to keep most everything smooth, and that little bit more performance (for a lot more power) is there if you really need it to get over the hump in some scenarios. The 9070 can't hit that lower threshold, and will certainly struggle in the other.

I don't think there's a problem with adjusting PL/V in Adrenalin at all, but that's just imho. That's all I've used on AMD cards for a while.

You're talking to the "W1zard, for the love of God use minimums not averages" guy, so you don't have to tell me about caring more about consistency/stability. Believe me. We are 100% in agreement.

As for 1440p, like I say, I'm a cautious guy. I look at stuff like Wukong and Spider-Man 2 (which would have similar perf), because I know we're going there (at default, like Wukong) at some point, even for games like Spider-Man. And the 9070 xt will keep 60 (at what I'm describing above, I think). The 9070 isn't capable, and it's very obvious they limited the card for this exact reason (and nVIDIA did the same thing with the 5070).
You can lower settings, but what happens when the next step up becomes the recommended? Then the 9070 is in tough shape.

You can pretty much upscale 960p->1440p at similar (slightly better) performance than 1080p native, and that's my point about preparing for that future.
9070 is going to struggle even there to keep 60 overclocked to the balls in scenarios like that.

Where a 9070xt should do it undervolted to the ankles. That's kinda what I'm saying. That's the split. Right at that spot, and it will be increasingly more important. Most people don't understand that.
I really don't expect them to, as this crap is just barely starting to get standardized (as those two games are; they perform pretty much the same and limited by RT). I just don't want people caught off-guard.
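On the 960p -> 1440p upscaling point above, the pixel math is simple enough (the scale factor is the usual 'Quality' preset, 1440/1.5 = 960; the upscaler-overhead figure is my own ballpark, not a measured number):

# Rough sketch: internal render cost of 1440p output with a 960p render
# resolution vs plain 1080p native. Upscaler pass overhead is an assumption.
native_1080p = 1920 * 1080   # 2,073,600 px
internal_960p = 1707 * 960   # ~1,638,720 px (2560x1440 divided by 1.5)

ratio = internal_960p / native_1080p
print(f"960p internal renders ~{ratio:.0%} of the pixels of 1080p native")  # ~79%
# Add a few percent for the upscaler pass itself (assumed) and you land
# around 1080p-native cost, which is why the two perform about the same.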

I appreciate all the information on your sitch (and observations; I too would probably find a better way to cool the ram). I always think that stuff is interesting.

As I say, I'm not selling anything, buy what you want, but I just really don't think people are looking at this thing from a 'big picture' pov.
If you want the power savings; undervolt; it'll still be good-enough for dang near everything. If you need the perf (in fringe cases), OC. But the 9070 can't OC enough for many increasingly-common instances.

When you factor in the eventual pricing (which, again, I think the inevitable price-drop of XT to the price of vanilla is literally the purpose of the vanilla's current price; to entice people for that 'new' bargain), the 9070 looks real, real bad. I'm not saying it's a bad card (if it's cheaper), but the split...even though it *looks* small, truly isn't. Most people just don't notice those instances yet.

No they weren't. I got my XT from Scan for the exact same £570. Overclockers were the only one that made news about it.
Grats! I might need you to be my guinea pig on a couple things if you ever have time and are willing. :p

I'm kinda curious about my theoretical experiment (building upon that undervolt vid), and if that's 'enough' for a lot of games you play; think it might be helpful/useful to some people.

But don't let me distract you from actually having fun. :)
 
Joined
Jan 14, 2019
Messages
14,925 (6.64/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
I don't think the difference between the XT and non-XT is that great, to be honest. Considering that you'd usually need a good 30-50% uplift to notice with the naked eye without an FPS counter...

Grats! I might need you to be my guinea pig on a couple things if you ever have time and are willing. :p
Sure. :) I'll just have to find a Linux distro that has the latest Mesa driver first (Bazzite only has 24.3 which doesn't fully support RDNA 4). :(
 
Joined
May 13, 2008
Messages
1,058 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
I don't think the difference between the XT and non-XT is that great, to be honest. Considering that you'd usually need a good 30-50% uplift to notice with the naked eye without an FPS counter...


Sure. :) I'll just have to find a Linux distro that has the latest Mesa driver first (Bazzite only has 24.3 which doesn't fully support RDNA 4). :(
Don't worry about it, brother. Have fun. And good luck.

(I know people have VRR monitors, but I use 60 as a buffer, just like high settings. Each of those is a variable that becomes a concession, and people need to have standards. They don't have to be mine, but mine are very much the experience I feel most people want to have. I try to limit accepting concessions when evaluating something, as then they can be used later to increase longevity even more. I don't think people should *start* from there if they don't have to. If I condoned that, I would be condoning developers using them as crutches, or giving AMD/nVIDIA an excuse to sell crappier GPUs.)
 
Joined
Feb 7, 2008
Messages
720 (0.12/day)
Location
Barcelona, Catalonia, Spain
System Name Woody
Processor AMD Ryzen 7 9700X
Motherboard Gigabyte B650M Aorus Elite AX ICE
Cooling MSI MAG A13 CoreLiquid 360
Memory 32GB (2x16) Corsair Vengeance 6000MHz (CL30)
Video Card(s) Sapphire Pulse RX 9070
Storage WD_BLACK SN850x (2x1TB) + Sandisk Ultra 2TB
Display(s) LG 34WN80C (3440x1440 @ 70 Hz)
Case Lian Li A3 (Black-Wood)
Audio Device(s) Logitech Pro X & Scarlett 2i4 w/M-AUDIO BX5-D2
Power Supply Corsair RM750 (ver. 2019)
Mouse Logitech MX Master 3
Keyboard Keychron Q1 Pro (Akko Cream Blue Pro V3 switches)
Software Windows 10 Pro x64
I understand stress-testing stability, but looking at how the pulse performs, I imagine they're all *mostly* capable of ~3200mhz raster and ~3050mhz RT with a decent undervolt (85-90%?)
Well, on my 9070 non-XT Pulse at pure stock settings so far, across several synthetic 3DMark tests (Time Spy / Solar Bay / Steel Nomad Light / Steel Nomad / Port Royal / Speed Way) and the SOTR / Forza Horizon 5 benchmarks, the max GPU clock topped out at 2831 MHz with a max power draw of around 245 W.

And this is the most basic non-XT version at stock, without any fine tuning on voltage/clocks/power draw accomplished yet. Coming from a Sapphire Pulse 7800 XT, also the most basic one, where undervolting helped quite a lot in obtaining performance and upping clocks, I would expect to see kind of the same gains on both versions. And most importantly because of this ...
I don't think the difference between the XT and non-XT is that great, to be honest. Considering that you'd usually need a good 30-50% uplift to notice with the naked eye without an FPS counter...
... so to continue, I know synthetic benchmarks don't mean a lot, but we both executed the same Superposition test, even on different OSes with different drivers, and the margin was magically around 10%, the same diff as its price, as GN also stated in his video, or kind of.

Maybe I got your whole post wrong, but why would a good 9070 not be able to squeeze out that 5-10% more with decent tuning and yet struggle so much vs an undervolted 9070 XT? I really want to understand, I'm not being ironic.

edit: I'm well aware of the cut-down in actual hardware units for some parts, but if we talk performance-wise, do you think the 9070 still comes DOA? Cause for me it's a great product. Maybe the price could be a little lower, but somehow I don't feel ripped off having paid its MSRP given the current madness.
 
Joined
Mar 2, 2011
Messages
110 (0.02/day)
Let me try to explain.
You are under the impression that all chips undervolt well and reach certain clocks?

It might be a lower quality chip; it's just a lottery when you buy a GPU. If you and I go to a store and buy the same Asus TUF 9070 OC, right? Two cards from the same store, on the same day, so we assume it's the same batch and revision; those 2 cards will still be different in overclocking and in undervolting. Chips don't come out of the foundries equal. That's all.

My explanation is maybe a bad summary of what is called the Silicon Lottery.

If you are not happy, just return it. No regret, no shame; later on, get another one, which maybe will do better or even worse than the one you have.
 
Joined
Oct 15, 2011
Messages
2,605 (0.53/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sparkle Titan Arc A770 16 GB
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
Prices are very high, at around RX 7900 XTX level. There are no cards available in stock. Any orders made will be fulfilled from April onwards. The situation is exactly the same as it is with Nvidia GPUs.
I fear it could turn catastrophic, like the Ethereum-PoW-bull-run-era!
 
Joined
Nov 15, 2024
Messages
275 (2.37/day)
No they weren't. I got my XT from Scan for the exact same £570. Overclockers were the only one that made news about it.
I thought I read somewhere that Scan went offline and when they came back the prices had gone up?! I did notice that Ebuyer and Scan didn't seem to have the same problems as Overclockers either regarding instability. Was interesting watching the stock level drop in real time on Ebuyer, if you consider that interesting.

I fear it could turn catastrophic, like the Ethereum-PoW-bull-run-era!
It will certainly be interesting to see quality control over time (especially during this time), as Rusty Caterpillar mentioned.
 
Joined
Jan 14, 2019
Messages
14,925 (6.64/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
... so to continue, I know synthetic benchmarks don't mean a lot, but we both executed the same Superposition test, even on different OSes with different drivers, and the margin was magically around 10%, the same diff as its price, as GN also stated in his video, or kind of.

Maybe I got your whole post wrong, but why would a good 9070 not be able to squeeze out that 5-10% more with decent tuning and yet struggle so much vs an undervolted 9070 XT? I really want to understand, I'm not being ironic.

edit: I'm well aware of the cut-down in actual hardware units for some parts, but if we talk performance-wise, do you think the 9070 still comes DOA? Cause for me it's a great product. Maybe the price could be a little lower, but somehow I don't feel ripped off having paid its MSRP given the current madness.
The question isn't whether you can make up the difference between the non-XT and XT with some OC. The question is, are you gonna notice the difference? If it's within 10%, then I don't think so. :)

By the way, I managed to start my games under Nobara thanks to its Mesa 25.1 beta implementation. The card is still recognised as AMD Device 7550, but oh well. :D
I'm sure there's lots of improvement coming with later, non-beta Mesa versions. :)

I thought I read somewhere that Scan went offline and when they came back the prices had gone up?! I did notice that Ebuyer and Scan didn't seem to have the same problems as Overclockers either regarding instability. Was interesting watching the stock level drop in real time on Ebuyer, if you consider that interesting.
Overclockers got bombarded with requests and went offline straight at 2 PM. That's why I switched to Scan, which was online for a few minutes, which was enough for me to grab my card. :)
Scan usually sends two emails for your order: one for the order confirmation, and one for the payment confirmation. The latter came a few hours later this time, when their site came back up.

Scan doesn't usually list a price for products that are unavailable with no ETA. There's also no pre-orders for the 9070 XT currently, just a "notify me when available" button to sign up for an email alert.
 
Joined
May 14, 2024
Messages
16 (0.05/day)
Processor Ryzen 7 5700X3D
Motherboard ASUS TUF A520M-PLUS II
Cooling Thermalright Assassin King ARGB 120mm
Memory ADATA XPG Spectrix D35G 32GB DDR4 3600MHz CL18 (2 x 16GB)
Video Card(s) Sapphire Pulse Radeon RX 9070
Storage 2TB (500GB SK Hynix P41 Platinum - 500GB Samsung EVO 870 - 500GB Intel 660p - 500GB ADATA SU630)
Display(s) Dell G2724D - 1440p/165Hz
Case Kolink Observatory HF Mesh ARGB
Mouse Cooler Master MM731
Keyboard Akko 5075B Plus | V3 Cream Blue Pro Switches
Software Windows 11 Pro
I snagged a Sapphire Pulse 9070 at MSRP from OCUK on launch day, late afternoon.

The two MSRP 9070s (Reaper and Pulse) went from £525 to £539 after the first few batches sold. (I believe they were putting them up in batches of ~100 or so.) Not too bad really, but the XT variants of the same Reaper/Pulse models went from their MSRP of £569 to a whopping £629.

Mine was delivered today and gotta say, I'm impressed so far. Very nice upgrade from my 6750 XT. I was trying to find a 7900 GRE under £550 for the past few months and never saw one that cheap so I'm glad I waited.

I recall someone from OCUK saying they're expecting more stock to arrive next week.

If the prices stay the same now after selling all the "MSRP stock", then I would say the 9070 starting at £539 is better value than the 9070 XT starting at £629.

£90 extra for ~11% performance increase. Not to mention the efficiency, since it gets within spitting distance of the XT whilst using only 220W vs 304W which is definitely something to take into consideration for UK power prices.
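For anyone curious what that 220W vs 304W gap actually costs, here's a rough back-of-the-envelope; the gaming hours and unit price are my own assumptions, not anything official:

# Rough yearly running-cost difference between a 9070 (~220 W) and a
# 9070 XT (~304 W) under gaming load. Hours/week and tariff are assumptions.
watts_9070, watts_xt = 220, 304
hours_per_week = 20      # assumed gaming time
price_per_kwh = 0.25     # assumed UK tariff, £/kWh

kwh_extra = (watts_xt - watts_9070) / 1000 * hours_per_week * 52
print(f"~{kwh_extra:.0f} kWh extra per year, ~£{kwh_extra * price_per_kwh:.0f}/year")
# ~87 kWh/year, roughly £22/year under these assumptions.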
 
Joined
Feb 7, 2008
Messages
720 (0.12/day)
Location
Barcelona, Catalonia, Spain
System Name Woody
Processor AMD Ryzen 7 9700X
Motherboard Gigabyte B650M Aorus Elite AX ICE
Cooling MSI MAG A13 CoreLiquid 360
Memory 32GB (2x16) Corsair Vengeance 6000MHz (CL30)
Video Card(s) Sapphire Pulse RX 9070
Storage WD_BLACK SN850x (2x1TB) + Sandisk Ultra 2TB
Display(s) LG 34WN80C (3440x1440 @ 70 Hz)
Case Lian Li A3 (Black-Wood)
Audio Device(s) Logitech Pro X & Scarlett 2i4 w/M-AUDIO BX5-D2
Power Supply Corsair RM750 (ver. 2019)
Mouse Logitech MX Master 3
Keyboard Keychron Q1 Pro (Akko Cream Blue Pro V3 switches)
Software Windows 10 Pro x64
Let me try to explain.
You are under the impression that all chips undervolt well and reach certain clocks?

It might be a lower quality chip; it's just a lottery when you buy a GPU. If you and I go to a store and buy the same Asus TUF 9070 OC, right? Two cards from the same store, on the same day, so we assume it's the same batch and revision; those 2 cards will still be different in overclocking and in undervolting. Chips don't come out of the foundries equal. That's all.

My explanation is maybe a bad summary of what is called the Silicon Lottery.

If you are not happy, just return it. No regret, no shame; later on, get another one, which maybe will do better or even worse than the one you have.
I'm well aware of the silicon lottery, been in the game long enough to know how it works but thanks anyway. And at no point did I say I was unhappy, quite the opposite actually, at least when it comes to the 7800XT. Time will tell with this 9070.

It's also easy to find plenty of reports from users who saw gains by undervolting these latest AMD generations. Maybe it's because they come slightly overvolted out of the box to ensure what you mentioned, since no two chips are the same, stability has to be guaranteed. But in many cases, users reported that undervolting helped reduce the power limit, which in turn allowed clocks to boost higher, actually improving performance.
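That mechanism is easy to see with the textbook dynamic-power approximation (P roughly proportional to V² x f). This is a simplified model, not AMD's actual boost algorithm, and the voltage/clock numbers are made-up illustrative values:

# Simplified illustration of why undervolting can raise boost clocks on a
# power-limited card: dynamic power scales roughly with V^2 * f, so at the
# same power budget a lower voltage leaves headroom for more frequency.
def clock_at_same_power(stock_mhz: float, stock_v: float, new_v: float) -> float:
    # P ~ V^2 * f  =>  f_new = f_stock * (V_stock / V_new)^2 at equal power
    return stock_mhz * (stock_v / new_v) ** 2

# Made-up example numbers, not measured 9070 figures:
print(f"~{clock_at_same_power(2800, 1.00, 0.92):.0f} MHz at the same power budget")  # ~3308 MHz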

Maybe I didn’t explain myself clearly, but I wasn’t talking about different overclocking potential. I also specifically stated "why a good 9070", my actual question to alwayssts was exactly that: what could cause a well-tuned 9070 with 10% more performance to struggle, while a simple, non-tuned undervolted 9070 XT wouldn’t, because that’s what I understood from his post.

The question isn't whether you can make up the difference between the non-XT and XT with some OC. The question is, are you gonna notice the difference? If it's within 10%, then I don't think so. :)
That's a very good point as well.

By the way, I managed to start my games under Nobara thanks to its Mesa 25.1 beta implementation. The card is still recognised as AMD Device 7550, but oh well. :D
I'm sure there's lots of improvement coming with later, non-beta Mesa versions. :)
:rockout:

If the prices stay the same now after selling all the "MSRP stock" then I would say that the prices for the 9070 (starting) at £539 is better value than the 9070 XT (starting) at £629.
Quite an interesting topic, maybe this was AMD’s strategy all along, just throwing out an even more unrealistic MSRP for the first 9070 XT, then using a clear decoy marketing move to push everyone toward the XT, knowing that the price difference would only last a few hours. That way they sell out quickly and make people settle for the 9070 as a consolation prize, boosting their sales.

Even I tried to grab a 9070 XT as another FOMO victim, but I already knew the non-XT was a better deal for me. The performance I needed was there, it’s smaller and more power efficient.

That’s why I replied that this card is actually a solid product. Could it be cheaper? Sure, but trashing it just because it was only 50$ cheaper than the XT seems unfair, and as you said, now the value proposition is hard to beat if prices stay where they are.
 
Joined
Mar 16, 2021
Messages
42 (0.03/day)
Location
Saudi Arabia
System Name My PC
Processor Ryzen 5500
Motherboard TUF B450M-PRO GAMING
Cooling ID-COOLING SE-207-XT
Memory JUHOR 3000Mhz 32GB (16Gx2)
Video Card(s) RTX 2060 Super 135$
Storage Kingchuxing 500G | Reletech P400 DRAM 2TB + HDD
Display(s) LG 24 FHD IPS 75hz
Case XPG BATTLECRUISER
Audio Device(s) SAMSON SR850 | Edifier R1280DB
Power Supply XPG CORE Reactor 650W Gold
Mouse Razer Basilisk X HyperSpeed
Keyboard K96
Software Windows 10 Pro
Here in Saudi Arabia, they know nothing about AMD cards
 
Joined
May 14, 2024
Messages
16 (0.05/day)
Processor Ryzen 7 5700X3D
Motherboard ASUS TUF A520M-PLUS II
Cooling Thermalright Assassin King ARGB 120mm
Memory ADATA XPG Spectrix D35G 32GB DDR4 3600MHz CL18 (2 x 16GB)
Video Card(s) Sapphire Pulse Radeon RX 9070
Storage 2TB (500GB SK Hynix P41 Platinum - 500GB Samsung EVO 870 - 500GB Intel 660p - 500GB ADATA SU630)
Display(s) Dell G2724D - 1440p/165Hz
Case Kolink Observatory HF Mesh ARGB
Mouse Cooler Master MM731
Keyboard Akko 5075B Plus | V3 Cream Blue Pro Switches
Software Windows 11 Pro
Quite an interesting topic, maybe this was AMD’s strategy all along, just throwing out an even more unrealistic MSRP for the first 9070 XT, then using a clear decoy marketing move to push everyone toward the XT, knowing that the price difference would only last a few hours. That way they sell out quickly and make people settle for the 9070 as a consolation prize, boosting their sales.

Even I tried to grab a 9070 XT as another FOMO victim, but I already knew the non-XT was a better deal for me. The performance I needed was there, it’s smaller and more power efficient.

That’s why I replied that this card is actually a solid product. Could it be cheaper? Sure, but trashing it just because it was only 50$ cheaper than the XT seems unfair, and as you said, now the value proposition is hard to beat if prices stay where they are.

Yes indeed, I too was really contemplating the XT because of the "only $50 more" thing even though I knew it was more than I needed. I was looking for a 7900 GRE earlier in the month and would have happily bought one if the price came down below £550, which it never did (fortunately, because the 9070 was cheaper and a lot better). When I saw the prices were almost £100 extra, I wasn't even bothered any more; I knew the 9070 was the better option than the XT at an inflated price.

I don't think the 9070 is getting the praise it deserves tbh. In most games at 1440p and 4K the difference is minimal, ranging from 2-3fps to a maximum of like 12fps, and it does so while running a hell of a lot more efficiently. It still has the RT chops too: I compared MH: Wilds benchmarks with my friend earlier (his system is an i9-13900F & 4070, mine is a 5700X3D & 9070) and at native res, no upscaling or frame gen, ultra settings including maxed-out RT, I got a score of well over 26,000 and an "Excellent" whilst his score was not even 12,000 and only got a "Playable".

I for one don't think the card costs too much at £525, I think it's a perfectly good card for that money. It was only looking expensive because it was next to the XT, but now, if you take into account that the real MSRP of the 9070 XT is £629, I think the non-XT is a solid deal.

I've seen some reviews that call the 9070 a 1440p card but then call the 9070 XT a 4K card, like bruh...there's like 11% difference between them. 9070 is as much of a 4K card as the 9070 XT is, although I prefer to play at 1440p (with either of them).
 
Joined
Jul 31, 2024
Messages
1,016 (4.58/day)
Why? For example: I wanted to upgrade from my 1080 Ti to a 7800 XT, BUT while the 7800 XT draws less power in gaming than the 1080 Ti, in idle and video playback (40W vs 17-19W) it draws way more than the 1080 Ti with a single monitor. So whatever I'm saving in gaming in terms of power, I'm wasting in idle and video playback.

My previous MSI Radeon 6800 Z Trio needed some manual config files to lower the power consumption.

My PowerColor 7800 XT Hellhound, with one or two monitors, on W11 Pro 23H2/24H2 or a Linux kernel with GNU userspace, always had idle consumption below 12 watts. I do not use high refresh screens - one WQHD 75 Hz and one very old DVI -> HDMI screen at 60 Hz.

Now the reviews say the same thing about the 7800 XT and power consumption.

I do not know how these cards behave with 300 Hz WQHD or better screens.
 
Joined
Feb 7, 2008
Messages
720 (0.12/day)
Location
Barcelona, Catalonia, Spain
System Name Woody
Processor AMD Ryzen 7 9700X
Motherboard Gigabyte B650M Aorus Elite AX ICE
Cooling MSI MAG A13 CoreLiquid 360
Memory 32GB (2x16) Corsair Vengeance 6000MHz (CL30)
Video Card(s) Sapphire Pulse RX 9070
Storage WD_BLACK SN850x (2x1TB) + Sandisk Ultra 2TB
Display(s) LG 34WN80C (3440x1440 @ 70 Hz)
Case Lian Li A3 (Black-Wood)
Audio Device(s) Logitech Pro X & Scarlett 2i4 w/M-AUDIO BX5-D2
Power Supply Corsair RM750 (ver. 2019)
Mouse Logitech MX Master 3
Keyboard Keychron Q1 Pro (Akko Cream Blue Pro V3 switches)
Software Windows 10 Pro x64
Yes indeed, I too was really contemplating the XT because of the "only $50 more" thing even though I knew it was more than I needed. I was looking for a 7900 GRE earlier in the month and would have happily bought one if the price came down below £550, which it never did (fortunately, because the 9070 was cheaper and a lot better). When I saw the prices were almost £100 extra, I wasn't even bothered any more; I knew the 9070 was the better option than the XT at an inflated price.

I don't think the 9070 is getting the praise it deserves tbh. In most games at 1440p and 4K the difference is minimal, ranging from 2-3fps to a maximum of like 12fps, and it does so while running a hell of a lot more efficiently. It still has the RT chops too: I compared MH: Wilds benchmarks with my friend earlier (his system is an i9-13900F & 4070, mine is a 5700X3D & 9070) and at native res, no upscaling or frame gen, ultra settings including maxed-out RT, I got a score of well over 26,000 and an "Excellent" whilst his score was not even 12,000 and only got a "Playable".

I for one don't think the card costs too much at £525, I think it's a perfectly good card for that money. It was only looking expensive because it was next to the XT, but now, if you take into account that the real MSRP of the 9070 XT is £629, I think the non-XT is a solid deal.

I've seen some reviews that call the 9070 a 1440p card but then call the 9070 XT a 4K card, like bruh...there's like 11% difference between them. 9070 is as much of a 4K card as the 9070 XT is, although I prefer to play at 1440p (with either of them).
I was in a similar situation, got my 7800XT a couple of months ago for €550 and, due to different circumstances, I was still within the return period. Following some advice that I should’ve gone for around 7900 GRE performance to fully support and enjoy my resolution -3440x1440-, this 9070 ended up being an even better option. For me, the price difference was just €80 more (around 12%) for nearly a solid ~20-25% performance boost. It was a win-win. I think we can consider ourselves lucky that we got it at MSRP.
Our Sapphire Pulse RX 9070 simply rocks :toast:
 
Joined
Jan 14, 2019
Messages
14,925 (6.64/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Don't rely on the MH: Wilds benchmark too much. It's broken as hell, gives you inconsistent results even on the same system. Other than that, I agree.
 
Joined
Mar 2, 2011
Messages
110 (0.02/day)
My previous MSI Radeon 6800 Z Trio needed some manual config files to lower the power consumption.

My PowerColor 7800 XT Hellhound, with one or two monitors, on W11 Pro 23H2/24H2 or a Linux kernel with GNU userspace, always had idle consumption below 12 watts. I do not use high refresh screens - one WQHD 75 Hz and one very old DVI -> HDMI screen at 60 Hz.

Now the reviews say the same thing about the 7800 XT and power consumption.

I do not know how these cards behave with 300 Hz WQHD or better screens.
I believe it was AMD, and they try to blame it on screen polling intervals and Hz, because as you can see the 9000 series doesn't have the issues of the 7000 series.

Thanks for sharing.
 
Joined
May 13, 2008
Messages
1,058 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
my actual question to alwayssts was exactly that: what could cause a well-tuned 9070 with 10% more performance to struggle, while a simple, non-tuned undervolted 9070 XT wouldn’t, because that’s what I understood from his post.
First of all, look at W1zard's performance numbers for a 9070 and then look at an XT. Notice the gap. That gap is very important. While game perf could vary, it won't for RT workloads (max load / RT limitation).
Notice how at stock the 9070 is 2586mhz (for that model, but all are similar), whereas overclocked it is 2829mhz; this is your perf difference (plus ~4% from excess mem bw).
This can vary some, especially for raster, but the avg OC clock is likely based off of an RT workload, given the clock/perf differences. So less than/around 50fps RT mins, ~55 or so overclocked. Purposely not 60.
~60 at stock for the XT. More than that if you overclock; you can keep it at similar power to a 9070 OC, whereas the 9070 can't do it even at max. That is the literal defining line.
You will notice a 5070ti is relegated below this level at stock (which is very purposeful as well), but can be overclocked beyond it too. You will hopefully see the tactics at play, and the mentalist games occurring.
Hopefully you are starting to understand how the product game is played.
Because there are people that will *never* OC but still want to maintain 60fps, and the 5070ti will find more and more instances where it won't, even when overclocked, especially when up-scaled.
Same goes for the 9070, but there even overclocking doesn't help. On the 9070 xt you're already (essentially) there, with a good 8%-10% left in the tank, and/or you can tune to that level at less power.
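Quick sanity check on that gap, just re-doing the arithmetic from the clocks quoted above, nothing new here:

# Re-deriving the OC headroom from the 9070 clocks quoted above.
stock_mhz, oc_mhz = 2586, 2829
clock_gain = oc_mhz / stock_mhz - 1    # ~9.4% from clocks alone
mem_bw_gain = 0.04                     # ~4% from the extra memory bandwidth, as noted above
total = (1 + clock_gain) * (1 + mem_bw_gain) - 1
print(f"~{clock_gain:.1%} from clocks, ~{total:.1%} combined")  # ~9.4% / ~13.8%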


I have explained it several times, so apologies if repetitive. The reason why it matters because of the newer graphical standards and maintaining 60fps with high settings *right now*, but standard settings later.
Again, you can lower these *now*, that is true. But it is a standard. It will also likely become REQUIRED on many titles (like Indiana Jones, for example) at some point in (*arguably this card's life*) the future.
Again, look at how 9070 XT performs in IJ. It will maintain 4k60 pretty much exactly in 'Supreme" (rt). This will trickle down in different ways depending on situation. It's not at all about 4k, just one example.
This is replicated across the minimums of an immense amount of games, please don't make me link them all. Go look at reviews (that actually show lows and/or RT workloads).

Look at all the instances across reviews where a 9070 XT sits at ~59 fps minimums; there are many. This is because it is a standard. AMD is likely busting their butt to get that last FPS.
Again, I suggest tweaking a 9070 XT for the best power that maintains that extra ~1 fps. I *think* the clock would be ~3200 MHz (and it may need a *slight* memory OC). I don't know the *exact* power typically required.
It's confusing to explain, as raster/RT each require more bandwidth, more power, or both... and raster clocks about 5% higher on average. So think in terms of ~3200 MHz raster or ~3050 MHz RT (~50 TF) on the XT.
That number is both real and very scalable. It makes sense to sell ~50 TF, ~100 TF, and whatever the halo can manage within power limits next gen.
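
For anyone who wants to sanity-check those TF figures, here's the napkin math (a rough sketch only; it assumes RDNA 4's dual-issue FP32, i.e. 4 FLOPs per shader per clock, and the public shader counts of 3584 SP for the 9070 and 4096 SP for the 9070 XT):

Code:
# Rough peak-FP32 napkin math; the clock figures are the ones discussed above.
def fp32_tflops(shaders: int, clock_ghz: float, flops_per_clock: int = 4) -> float:
    """Peak FP32 throughput in TFLOPS (shaders x FLOPs/clock x GHz)."""
    return shaders * flops_per_clock * clock_ghz / 1000

print(f"9070    @ 2.829 GHz (OC):  {fp32_tflops(3584, 2.829):.1f} TF")  # ~40.6 TF
print(f"9070 XT @ 3.050 GHz (RT):  {fp32_tflops(4096, 3.050):.1f} TF")  # ~50.0 TF
print(f"9070 XT @ 3.200 GHz:       {fp32_tflops(4096, 3.200):.1f} TF")  # ~52.4 TF

That's roughly where the ~50 TF line sits, and why the 9070, even overclocked, lands around 40 TF and stays under it.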

The 9070 will almost certainly not get that last fps, especially not in RT, and that is *the* point. This is the split in the cards, it is VERY purposeful, and it will matter, especially as time moves forward.
Especially when you can't turn some of this stuff off, because it will be baked into the core design of the games.

The 9070 XT is a standard (more or less). Look at even 4K PT with 'performance' (1080p) up-scaling in Cyberpunk: it maintains a 30 fps minimum pretty much exactly. Why is this? I will now explain the future.

AMD is almost certainly using a 3-stack configuration next gen, as shown here. It likely scales across 1-stack, 2-stack, and 3-stack configurations: 128-bit, 256-bit, 384-bit. There may be units disabled on some SKUs, I don't know.

Now, a one-stack configuration will likely replace the 9070 XT. Let's say it is 6144 SP @ ~4200 MHz (stock), just to keep it round (and similar to some 9070 XT models' stock performance at 3150 MHz); the point is that it has similar performance capability.

Which the 9070 is not. You could say the 9070 is the cut-off between RT and raster, IMHO, although some will make an argument that it is a low-end RT card and a higher-end raster card.

Now, scale that up to 2 stacks. You now have a 60 fps 4K PT experience with 'performance' up-scaling. Get it?

That is a worst-case scenario, but there are MANY, MANY more that are not. Spider-Man 2 with RT at 1080p60 (or 1440p 'quality' up-scaling): 60 fps on an XT.
Not likely going to maintain it on a 9070, even with your best OC.
Star Wars Outlaws (which is a showcase of Snowdrop; read 'the next Division game', etc.) at 1440p (which uses up-scaling by default). Wukong. Etc., etc. This is the next standard. And it will scale.

The tiers depend on a lot of things, but I think they will be roughly like so:

9070 XT / low-end UDNA / low-end nVIDIA (6144 SP?): 1080p60 RT or 1440p 'quality' up-scaling (30 fps 'performance' [1080p] up-scaling 4K PT). You can have this now with a 9070 XT.
Mid-range (essentially replaces the 5080 / a higher-end N48, if it exists): this may be for 1080p->4K up-scaling, as that requires a bit more performance. 1440p60 RT. I think this will be ~PS6 and/or a 9216 SP / 18 GB part.
Performance (replaces the 4090, slightly faster; probably ~100 TF), likely double the low-end: 1080p120 RT, 1440p->4K up-scaling RT, 1080p60 'performance' up-scaling PT.
Halo: probably built for 1440p PT and 1440p->4K 'quality' up-scaling PT.

People need to start looking at cards like this if they want to stay current with the industry. Some don't, and I understand. These cards arrive *slightly* before many need to worry about it, and I get that too.
This is why I say it's *mostly* a next-gen concern. But not all people upgrade their cards EVERY generation, or even every three generations. For them this should be a concern. This is what I am saying.

The 9070 XT is the cheapest card that fits those criteria. The 4080 used to be, but it has been relegated below that level in many instances due to the increased demands of DLSS, etc. This will continue, with the 5070 Ti too.
Likely in both respects. AMD will likely stay pretty consistent, although I don't know the *exact* capabilities of the PS6. Whether that outdates the 9070 XT (or a higher-end version) in that regard, I don't know (yet).
The 4080/5070 Ti will likely get worse as the demands of nVIDIA's software increase. I know this concept blows some people's minds, but it's still true.

People need to stop looking at raster performance (for the most part), in my opinion. We have now more or less conquered that hill with the capabilities of the 9070 XT, IMHO.
You could likely keep 4K60 in Ratchet & Clank with an overclock on a decent 9070 XT model, and that is one of Wizard's most demanding benchmarks.
It will likely be ported to the PS6 with this as one of its options. Spider-Man 2 (and Wukong, etc.) are examples of *current* RT, and will likely be ported with ~1080p or 1440p->4K 'performance' up-scaling.
1080p60 (or 1440p up-scaling) is possible on the XT. 4K requires more oomph (and might be the performance difference to the PS6), but you can always use frame generation, which will likely keep the framerate okay, with manageable lag.

Whether the PS6 is something like the 9070 XT or rather the next tier up (described above as a 9216 SP part from nVIDIA / the 5080 / a '9080 XT'), I do not know. I think the latter. I think it will be, because then AMD can sell more cards.
If I had to guess, whatever AMD puts out with 32 GB will be *very* close to the power of the PS6, although there are obvious reasons to make it slightly slower (especially at stock). Sell more cards later.
It would also make sense, given the PS6 will likely take advantage of more than 16 GB of RAM, which the 9070 XT cannot. An 18 GB card, for instance, would fit that bill for compute/RT power; 32 GB perhaps with the same buffer.
Also consider that nVIDIA almost certainly knows the scaling standard (which can clearly be seen as greater than a 5080 for 1440p60 RT); it's likely they think the PS6 will perform better as well. Gotta sell it again.
With 2 GB more (and a higher *stock* clock speed), probably. First they need to sell a 24 GB 5080 with less compute than that eventual card with less RAM, so they can update DLSS later and relegate the 24 GB part (more RAM but less compute) below 60 fps. This is how you win with a disjointed stack... because to understand it you literally have to read my long, sprawling posts, which most won't.

Does this all make sense? I know I'm about a year (or slightly more) too early for most people to understand, as not everyone plays games with these implementations. But many will soon enough.

Don't worry, DF etc. will beat it into most people's heads before too long, I'm sure. The 9070 will not move along this curve, IMHO. It will have similar problems to the 5070; just pure grunt rather than compute/RAM.

I'd go screencap reviews and link benchmarks that show this consistently, but it is a pain... Most don't care or understand; most don't even understand 1% lows. I'd rather you look yourself, and you will find this to be true.

I honestly have no problem helping people learn, and I do speculate *sometimes*, but other times I'm not speculating. If you don't believe me, then GO LOOK IT UP.

Obviously people have different opinions/needs, but I'm trying to look out for people. No agenda. This is the same reason I explain 1% lows to people, so when games like HZD stutter on nVIDIA, you get it.

I don't sit and explain that it's because of buffer limitations from a heavy reliance on internal cache, which allows maximum use of compute but can't save it once things spill to external RAM (which is often not enough). It's also often for other reasons, too. That's the thing: there are lots of reasons, but the point remains that the actual problem is the LOWS and what happens to them over time.

I simply say "don't look at averages, look at lows". Don't listen to people saying one product is 10% faster when it is in fact 10% SLOWER due to dips that drag it down, which is not a good experience.
Don't pay attention to some people's awful comparative *average* bar graphs, whether displayed up and down or left to right.
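
If you want to see why lows and averages tell different stories, here's a quick illustrative sketch (not any reviewer's exact methodology, just one common way of computing a "1% low" from a frametime log):

Code:
# Illustrative only: average FPS vs. "1% low" FPS from a list of frametimes (ms).
def avg_and_1pct_low(frametimes_ms: list[float]) -> tuple[float, float]:
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    slowest_first = sorted(frametimes_ms, reverse=True)   # worst frames first
    n = max(1, len(frametimes_ms) // 100)                 # the slowest 1% of frames
    low_fps = 1000 * n / sum(slowest_first[:n])
    return avg_fps, low_fps

# A run that is mostly ~120 fps but hitches hard every so often:
frametimes = [8.3] * 990 + [50.0] * 10
avg, low = avg_and_1pct_low(frametimes)
print(f"avg: {avg:.0f} fps, 1% low: {low:.0f} fps")       # ~115 fps average, 20 fps lows

The card with the prettier average bar can still be the one that feels worse to actually play.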

The red team is somehow much more consistent, even when not capable of being faster at max, because they actually plan their architecture all the way out to RAM, of which they actually have enough (unlike nVIDIA).
And they keep 60 fps, or will easily overclock to do so. Especially nice as things gradually evolve to be more demanding over time. nVIDIA goes the opposite direction (sinks further under 60 fps), especially over time.

Do you know WHY the better-acclaimed company does that? Because many people don't get it. They see the average numbers but don't actually get it. That consistency matters, not averages.
But those people will buy a new card when their current one slows down beyond a level they'd like (which is often the level I recommend: high settings and keeping 60 fps).

If you don't believe me, GO LOOK AT THE NUMBERS. Right now, go look at HZD and look at AMD's/nVIDIA's minimums. Many other games too. This is not a rare occurrence. Neither is people blaming the game.
I don't blame those people for not understanding (I do blame those acclaiming averages/features despite these things), but you need to understand that many of these things are very purposeful and real. So are standards.
This is not an opinion meant to push a product or pick a side in a brand war.

People need to absorb that, both what I'm saying and what others pushing the opposite are saying. Some think games are programmed badly, when it's often the cards, planned to become outdated in ways you don't understand.
Faster cache (at higher core clocks) or more RAM. Higher compute, but not enough RAM. More intensive software. All variables AMD attempts not to use in an underhanded fashion, and nVIDIA very much does.

So support them... just don't buy a 9070. Buy an XT. Or higher (when it comes out), if it will do those things (1080p RT->4K, 1440p RT native consistently, 4K RT, etc.), as the 9070 XT perfectly fits many situations.
And it will likely stay that way, the variable being whatever needle the PS6 pushes past its capability (perhaps slightly more performance and/or VRAM usage). But it will still likely be very usable in that situation.
Likely by lowering a few settings and/or dealing with some dips. You won't need to buy another card, *probably*. That's what I'm saying. I can't guarantee that with a 9070 at all. To me, that is worth $50.
I think it's worth more than that, actually. I think the price will soon, and certainly eventually, reflect that; we'll have to see if the market agrees with me or with y'all. I know most reviewers agree with me (on that). :)

Or, you know, chill at 1440p non-RT with your 6800 XT (or similar/better) until you can't. I get that too, I really do! I really, really do. Maybe you don't even have a 6800 XT, so buy a 9070... when it's way cheaper!

Also, I am not trying to make you feel bad about your purchase, and I'm not saying there isn't a possibility of a next-gen card in that sweet spot for the PS6 which might also outdate the 9070 XT in this respect. It could.
But for right now, with everything we've seen over the last couple/few years and continue to see in AAA releases every day, the 9070 XT is (*almost, barring that ~1 fps*) perfect as a mid-range product.
Certainly for its price (which is incredibly important to note versus similar nVIDIA cards). Like I say, I don't care which card runs a resolution/setting badly but slightly less badly. I care that it can run a setting well. Hence: similar.

Obviously the 9070 has its uses, and there are many cases where what I am talking about will not be a limiting factor. Just be aware, however, that these issues exist.
 
Last edited:
Joined
Jan 14, 2019
Messages
14,925 (6.64/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Impressive wall of text, but you still haven't explained why a ~10% difference in FPS is so meaningful. You can always turn one setting down from ultra to high, and you've got that 10% back - not that you notice 10% anyway.
 
Joined
Dec 25, 2020
Messages
7,865 (5.12/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~

Weren't we supposed to be swimming in these cards, with plentiful stock worldwide?
 