My actual question to alwayssts was exactly that: what could cause a well-tuned 9070 with 10% more performance to struggle, while a simple, non-tuned, undervolted 9070 XT wouldn't? Because that's what I understood from his post.
First of all, look at Wizard's performance of a 9070 and then look at an XT. Notice the gap. That gap is very important. While game perf can vary, it won't for RT workloads (max load/RT limitation).
Notice how at stock the 9070 is 2586mhz (for that model, but all are similar), whereas overclocked it is 2829mhz. That is your perf difference (plus ~4% from excess mem bw).
This can vary some, especially for raster, but the avg OC clock is likely based on an RT workload, given the clock/perf differences. So less than/around 50fps RT mins, ~55 or so overclocked. Purposely not 60.
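To make that concrete, here's a quick back-of-envelope sketch in Python. The clocks and the ~4% memory-bandwidth figure are from above; the linear clock-to-perf scaling is my assumption, not measured data:

```python
# Rough sketch: assume perf in a GPU-bound RT workload scales
# ~linearly with core clock, plus a little extra from memory OC.

stock_clock = 2586   # MHz, 9070 stock (that model; others similar)
oc_clock    = 2829   # MHz, 9070 overclocked

clock_gain = oc_clock / stock_clock - 1       # ~9.4%
with_mem_oc = (1 + clock_gain) * 1.04 - 1     # ~13.8% incl. ~4% mem bw

stock_rt_mins = 50                            # fps, roughly (see above)
print(f"clock gain: {clock_gain:.1%}")                            # 9.4%
print(f"OC RT mins: {stock_rt_mins * (1 + clock_gain):.1f} fps")  # ~54.7
print(f"gain incl. mem bw: {with_mem_oc:.1%}")                    # ~13.8%
```

Either way you slice it, the overclocked 9070 lands in the mid-50s for RT mins, still short of 60.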
~60 at stock for the XT. More than that if you overclock; it can do so at similar power to a 9070 OC, whereas the 9070 can't do it even at max. That is the literal defining line.
You will notice a 5070ti is relegated below this level at stock (which is very purposeful as well), but can be overclocked beyond it too. You will hopefully see the tactics at play, and the mind games occurring.
Hopefully you are starting to understand how the product game is played.
Because there are people that will *never* OC, but still want to maintain 60fps, and the 5070ti will find more and more instances where it won't, even when overclocked, especially when up-scaled.
Same goes for the 9070, but there even overclocking doesn't help. On a 9070 xt you're already (essentially) there, with a good 8%-10% left in the tank, and/or you can tune to that level at less power.
I have explained it several times, so apologies if repetitive. The reason it matters is the newer graphical standards and maintaining 60fps with high settings *right now*, but standard settings later.
Again, you can lower these *now*, that is true. But it is a standard. It will also likely become REQUIRED on many titles (like Indiana Jones, for example) at some point in the future (*arguably within this card's life*).
Again, look at how the 9070 XT performs in IJ. It will maintain 4k60 pretty much exactly in 'Supreme' (RT). This will trickle down in different ways depending on the situation. It's not at all about 4k; that's just one example.
This is replicated across the minimums of an immense number of games; please don't make me link them all. Go look at reviews (that actually show lows and/or RT workloads).
Look at all the instances a 9070 xt is ~59fps mins across reviews, which is many. This is because it is a standard. It is likely AMD is busting their butt to get that last FPS.
Again, I suggest tweaking a 9070 xt for the best power to maintain that extra ~1fps. I *think* the clock would be ~3200mhz (and it may need a *slight* memory oc). I don't know the *exact* power often required.
It's confusing to explain to people, as raster/RT require more bandwidth, power, or both...and raster operates ~5% higher on average. So think in terms of ~3200mhz raster or ~3050mhz RT (50TF) on the XT.
That number is both true, and very scalable. It makes sense to sell ~50TF, ~100TF, and whatever the halo can manage within power limits next-gen.
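You can sanity-check that 50TF figure yourself. A minimal sketch, assuming the usual way RDNA4 FP32 rates are quoted (4096 shaders, dual-issue, 2 ops per clock from FMA); the dual-issue factor is my assumption about how the number is being counted:

```python
# FP32 TFLOPS = shaders * 2 (dual-issue) * 2 (FMA ops/clock) * clock (GHz)
shaders = 4096        # 9070 XT stream processors
clock_ghz = 3.05      # the ~3050 MHz RT clock from above

tflops = shaders * 2 * 2 * clock_ghz / 1000
print(f"{tflops:.1f} TF")   # -> 50.0 TF
```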
The 9070 will almost-certainly not get that last fps, especially not in RT, and that is *the* point. This is the split in the cards; it is VERY purposeful, and it will matter, especially as time moves forward.
There will come a point when you can't turn some of this stuff off, because it will be baked into the core design of the games.
The 9070 xt is a standard (more-or-less). Look at even 4k PT with 'performance' (1080p) up-scaling in Cyberpunk. It maintains a 30fps minimum pretty much exactly. Why is this? I will now explain the future.
AMD is almost-certainly using a 3-stack configuration next-gen, as shown here. It likely scales from 1 stack, to 2 stacks, to 3 stacks: 128-bit, 256-bit, 384-bit. There may be units disabled on some skus, idk.
Now, a one-stack configuration will likely replace the 9070 xt. Let's say it is 6144sp @ ~4200mhz (stock), just to be round (and similar to some 9070 xt stock perf at 3150mhz); the point is the similar perf ability.
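Here's a rough check that those made-up numbers really do land on 9070 XT-class compute. All the next-gen figures are my speculation from the paragraph above (not leaks or specs), and I'm counting the hypothetical 6144sp as the dual-issue-inclusive figure:

```python
# 9070 XT at ~3150 MHz (4096 SP, dual-issue FP32, FMA = 2 ops/clock):
xt_tf = 4096 * 2 * 2 * 3.15 / 1000        # ~51.6 TF

# Hypothetical 1-stack part: 6144 SP @ ~4200 MHz, where 6144 is
# assumed to already include dual-issue (so 2 ops/clock, not 4):
next_gen_tf = 6144 * 2 * 4.2 / 1000       # ~51.6 TF

print(f"9070 XT: {xt_tf:.1f} TF, 1-stack next-gen: {next_gen_tf:.1f} TF")
```

Same ballpark either way, which is the whole point: the 1-stack part slots in where the 9070 XT sits today.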
Which the 9070 is not. You could say the 9070 is the cut-off between RT/raster imho, although some will make an argument it is a low-end RT card and higher-end raster card.
Now, scale that up to 2 stacks. You now have a 60fps 4k PT experience with performance up-scaling. Get it?
That is a worst-case scenario, but there are MANY, MANY more that are not. Spider-man 2 RT 1080p60 (or 1440p 'quality' up-scaling): 60fps on an XT.
You're not likely to maintain that on a 9070, even with your best OC.
Star Wars Outlaws (which is a demonstration of Snowdrop; read 'the next Division game' etc) at 1440p (which uses up-scaling by default). Wukong. Etc etc. This is the next standard. And it will scale.
The tiers depend on a lot of things, but I think they will be roughly like so (rough math in the sketch after this list):
9070 xt/low-end UDNA/low-end nVIDIA (6144sp?): 1080p60 RT or 1440p 'quality' up-scaling (30fps 'performance' [1080p] up-scaling 4k PT). You can have this now with a 9070 xt.
Mid-range (essentially replaces the 5080/higher-end N48, if it exists): This may be for 1080p->4K up-scaling, as it requires a bit more performance. 1440p60 RT. I think this will be ~PS6 and/or a 9216sp/18GB part.
Performance (replaces the 4090, slightly faster; probably ~100TF), likely double the low-end: 1080p120 RT, 1440p->4k up-scaling RT, 1080p60 'performance' up-scaling PT.
Halo: Probably built for 1440p PT and 1440p 'quality' 4k up-scaling PT.
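The sketch promised above, under the same speculative assumptions as before (6144 dual-issue-inclusive SP per stack at ~4.2 GHz; the 9216sp mid-range part is my guess, not a known SKU):

```python
# Tier compute under my assumptions: sp * 2 ops/clock * clock (GHz)
clock_ghz = 4.2
tf = lambda sp: sp * 2 * clock_ghz / 1000

tiers = {
    "low (1 stack, 6144sp)":   tf(6144),    # ~51.6 TF, ~9070 XT class
    "mid (9216sp guess)":      tf(9216),    # ~77.4 TF
    "performance (2 stacks)":  tf(12288),   # ~103 TF, ~double low-end
    "halo (3 stacks)":         tf(18432),   # ~155 TF
}
for name, t in tiers.items():
    print(f"{name}: {t:.1f} TF")
```

Notice the performance tier comes out at roughly double the low-end, which is exactly the ~100TF figure above.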
People need to start looking at cards like this if they want to stay current with the industry. Some don't, and I understand. These cards are *slightly* before many need to worry about it, and I get that too.
This is why I say it's *mostly* a next-gen concern. But not all people upgrade their cards EVERY generation, or even every 3 generations.
For them this should be a concern. This is what I am saying.
The 9070 xt is the cheapest card that fits those criteria. The 4080 used to be, but again it has been relegated below that level in many instances due to increased demands from DLSS etc. This will continue, with the 5070 ti too.
Likely in both respects. AMD will likely stay pretty consistent, although I don't know the *exact* capabilities of the PS6. Whether that outdates the 9070 xt (or a higher-end version) in that regard, I don't know (yet).
The 4080/5070ti will likely get worse as the demands from nVIDIA's software increase. I know this concept blows some people's minds, but it's still true.
People need to stop looking at raster performance (for the most part) in my opinion. We have now more-or-less conquered that hill with the capabilities of the 9070 xt IMHO.
You could likely keep 4k60 in Ratchet and Clank (one of Wizard's most-demanding benchmarks) with an overclock on a decent 9070 xt model.
It will likely be ported to PS6, and this will be one of its options. Spider-man 2 (and Wukong etc) are examples of *current* RT, and will likely be ported with ~1080p or 1440p->4k 'performance' up-scaling.
1080p60 (or 1440p up-scaling) is possible on the XT. 4k requires more oomph (and might be the perf diff to PS6), but you can always use frame generation, which will likely keep the framerate okay, with lag manageable.
Whether the PS6 is something like the 9070 xt, or rather the next tier up (described above as a 9216sp part; nVIDIA/5080/'9080xt'), I do not know. I think the latter. I think it will be, because then AMD can sell more cards.
If I had to guess, whatever AMD puts out with 32GB will be *very* close to the power of the PS6, although there are obvious reasons to make it slightly slower (especially at stock). Sell more cards later.
It also would make sense, given the PS6 will likely take advantage of more than 16GB of ram, which the 9070 xt cannot. An 18GB card, for instance, would fit that bill for compute/rt power. 32GB, perhaps the same buffer.
Also consider that nVIDIA almost certainly knows the scaling standard (which can be clearly seen as greater than a 5080 for 1440p60 RT); it's likely they think the PS6 will perform better as well. Gotta sell it again.
With 2 more GB (and a higher *stock* clockspeed), probably. First they need to sell a 24GB 5080 with not as much compute as that eventual card with less ram, so they can update DLSS later and relegate the 24GB part (more ram but less compute) below 60fps. This is how you win with a disjointed stack...because to understand it you literally have to read my long sprawling posts, which most won't.
Does this all make sense? I know I'm about a year (or slightly more) too early for most people to understand, as not everyone plays games with these implementations. But many will soon enough.
Don't worry, DF etc will beat it into most people's heads before too long, I'm sure. The 9070 will not move on this curve imho. It will have similar problems to the 5070; just pure grunt rather than compute/ram.
I'd go screencap reviews/link benches that show this consistently, but it is a pain...Most don't care/understand; most don't even understand 1% lows. I'd rather you look yourself, and you will find this to be true.
I honestly have no problem helping people learn, and I do speculate *sometimes*, but other times I'm not. If you don't believe me, then GO LOOK IT UP.
Obviously people have different opinions/needs, but I'm trying to look out for people. No agenda. This is the same reason I explain 1% lows to people, so when games like HZD stutter on nVIDIA, you get it.
I don't sit and explain that it's because of buffer limitations due to heavy reliance on internal cache, which allows maximum use of compute but can't save it when things spill to external ram (which is often not enough). It's also often for other reasons, too. That's the thing: there are lots of reasons, but the point remains that the real problem is the LOWs, and what happens to them over time.
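If the cache-spill idea helps, here's a toy illustration with made-up numbers (not real die specs): effective bandwidth collapses toward VRAM bandwidth as a scene's working set overflows the on-die cache, and that's when lows crater.

```python
# Hypothetical figures, purely for illustration:
cache_bw_gbs = 2000   # on-die cache bandwidth
vram_bw_gbs  = 700    # external VRAM bandwidth

def effective_bw(hit_rate):
    # Blended bandwidth: hits served from cache, misses from VRAM
    return hit_rate * cache_bw_gbs + (1 - hit_rate) * vram_bw_gbs

for hit in (0.9, 0.6, 0.3):   # hit rate drops as scenes get heavier
    print(f"hit rate {hit:.0%}: ~{effective_bw(hit):.0f} GB/s")
```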
I simply say "don't look at averages, look at lows". Don't listen to people saying one product is 10% faster when it is in fact 10% SLOWER due to dips that make it slow down, which is not a good experience.
Don't pay attention to some people's awful comparative *average* bar graphs, whether displayed up and down or left to right.
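A toy example of why averages mislead, with invented frame times (the "1% low" definition here, averaging the slowest 1% of frames, is one common convention):

```python
import statistics

def fps_stats(frametimes_ms):
    fps = [1000 / ft for ft in frametimes_ms]
    avg = statistics.mean(fps)
    # 1% low: average fps over the slowest 1% of frames
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    low_1pct = statistics.mean(1000 / ft for ft in worst[:n])
    return avg, low_1pct

steady = [16.0] * 1000                  # card A: locked ~62 fps
spiky  = [13.0] * 990 + [60.0] * 10     # card B: faster avg, but stutters

for name, ft in (("steady", steady), ("spiky", spiky)):
    avg, low = fps_stats(ft)
    print(f"{name}: avg {avg:.0f} fps, 1% low {low:.0f} fps")
# steady: avg 62, 1% low 62  |  spiky: avg ~76, 1% low ~17
```

The bar graph says card B wins; the actual experience says otherwise.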
The red team is somehow much more consistent, even when not capable of being faster at max, because they actually plan their architecture going out to RAM, of which they actually have enough (unlike nVIDIA).
And they keep 60fps, or will easily overclock to do so. Especially nice as things slowly evolve to be more demanding over time. nVIDIA goes the opposite direction (sinks further under 60fps), especially over time.
Do you know WHY the better-acclaimed company does that? Because many people don't get it. They see the averages but don't actually get it. That consistency matters, not avgs.
But those people will buy a new card when their current one slows down beyond a level they'd like (which is often the level I recommend: high settings and keeping 60fps).
If you don't believe me, GO LOOK AT THE NUMBERS. Right now, go look at HZD and look at AMD's/nVIDIA's minimums. Many other games too. This is not a rare occurrence. Neither is people blaming the game.
I don't blame those people for not understanding (I do blame them for acclaiming averages/features despite those things), but you need to understand many of these things are very purposeful and real. So are standards.
It's not an opinion to push a product or bias in a brand war.
People need to absorb that, both what I'm saying and others pushing the opposite. Some think games are programmed badly, when often it's the cards, planned to become outdated in ways you don't understand.
Faster cache (at a higher core clock) or more ram. Higher compute, but not ram. More intensive software. All variables AMD attempts not to use in an underhanded fashion, and nVIDIA very much does.
So support them...just don't buy a 9070. Buy an XT. Or higher (when it comes out), if it will do those things (1080p RT->4k, 1440p RT native consistently, 4k RT, etc.), as the 9070 xt perfectly fits many situations.
And it will likely stay that way, the variable being whatever needle the PS6 pushes over its capability (perhaps slightly more performance and/or VRAM usage). But it will still likely be very usable in that situation.
Likely by lowering a few settings and/or dealing with some dips. You won't need to buy another card, *probably*. That's what I'm saying. I can't guarantee that with a 9070 at all. To me, that is worth $50.
I think it's more than that, actually. I think the price will soon (and certainly eventually) reflect that; we'll have to see if the market agrees with me or y'all. I know most reviewers agree with me (on that).
Or, you know, chill at 1440p non-RT with your 6800xt (or similar/better) until you can't. I get that too, I really do! I really, really, do. Maybe you don't even have a 6800xt, so buy a 9070...when it's way cheaper!
Also, I am not trying to make you feel bad about your purchase, and I'm not saying there isn't a possibility of a next-gen card in that sweet spot for the PS6 which might also outdate the 9070 xt in this respect. It could.
But for right now, given everything we've seen over the last couple/few years, and continue to see in AAA releases every day, the 9070 xt is (*almost, barring that ~1fps*) perfect (as a mid-range product).
Certainly for its price (which is incredibly important to note versus similar nVIDIA cards). Like I say, I don't care which card runs a setting it handles badly slightly better. I care that it can run a setting well. Hence: similar.
Obviously the 9070 has uses, and there are many cases where what I am talking about will not be a limiting factor. Just be aware, however, that these issues exist.