What I see is that the 24GB of vram on the 7900xtx never saves you when RT/PT is on.
You're far more likely to get better fps from a 12GB 4070 Ti than from the 7900 XTX.
And we already have games with no option to disable RT…
No, it saves you in 4k60 raster, which is where (and why) even the 4080 gets smooshed.
What are we looking at? The 4070 Ti, an $800 card, is faster than the 7900 XT, a $900 card, and closes in on the XTX, a $1000 card.
It can't hold 60fps with 960p upscaled to 1440p because it clearly runs out of RAM. I mean, literally look at the VRAM usage and how far under 60 it falls, and do the appropriate math.
It will not get better in the future. You want at least 45TF/16GB. Weird how the 4070 Ti and 5070 are purposely below that (except it's not weird at all, because nVIDIA!)
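For anyone who wants the back-of-envelope math behind that 45TF line, here's a minimal sketch using the standard shaders x 2 FLOPs x boost clock estimate; the shader counts and clocks are the commonly listed reference specs, so treat the outputs as approximate.

```python
# Rough FP32 throughput: shader units x 2 FLOPs per clock (FMA) x boost clock (GHz) -> TFLOPS.
# Specs are approximate reference numbers, not measured boost behavior.
cards = {
    "4070 Ti (12GB)": (7680, 2.61),
    "5070 (12GB)":    (6144, 2.51),
    "4090 (24GB)":    (16384, 2.52),
}

for name, (shaders, boost_ghz) in cards.items():
    tflops = shaders * 2 * boost_ghz / 1000
    print(f"{name}: ~{tflops:.0f} TF")
# 4070 Ti ~40 TF, 5070 ~31 TF, 4090 ~83 TF
```

Both 12GB cards sit under the 45TF/16GB bar, which is the whole point.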
This is all by design. If you live in denial about the ram, you live in denial about the ram.
I'm just reporting the facts of the situation so people don't fall into the same trap with 4k upscaling and the 5080 in the future. Because it happens. That is literally why they did it. And why they'll sell a 24GB Super.
Probably. Which, unless they clock it a ton higher, will make absolutely no sense, because it won't have enough compute for the 24GB to matter. They will sell a (probably 11264-12288sp) 6080 on 3nm that does.
Because 16GB, too, will run out in new titles at 1440p->4k with all this stuff enabled; usage will likely be over 16GB, perhaps under 18GB. Look at the 5080, then scale it to 60fps and add proportionate RAM.
That is what a well-matched card would be, which the 5080 is not. Again, this is the job of the 192-bit 3nm cards (and people need to be aware of that), and why a 16GB 5080 is a ridiculous investment.
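To spell out the "scale to 60 and add proportionate RAM" step, here's a minimal sketch of that estimate; the fps and VRAM inputs below are placeholders you would swap for whatever a benchmark chart actually shows for a 5080 at 1440p->4k upscaled RT, not measurements.

```python
# Placeholder inputs: replace with real chart numbers for a 5080 at 1440p->4k upscaled RT.
observed_fps = 50.0       # placeholder, not a measurement
observed_vram_gb = 14.0   # placeholder, not a measurement
target_fps = 60.0

scale = target_fps / observed_fps           # extra compute a well-matched card would need
needed_vram_gb = observed_vram_gb * scale   # scaling the footprint roughly in proportion

print(f"compute scale: {scale:.2f}x, ballpark VRAM: ~{needed_vram_gb:.1f} GB")
# With these placeholders: 1.20x and ~16.8 GB, i.e. over 16GB but under 18GB.
```

Obviously garbage in, garbage out; the point is the method, not my placeholder numbers.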
Don't buy a 4070ti/5070 for RT at 1440p (even upscaled). That is the point. Any future competition will likely also be playable at 1080p RT (960p->1440p for the 9070xtx?) because of its 16GB of RAM.
Again, otoh, 6800xt for 1440p60, 7900xtx for 4k60 native non-rt. 4090 for all the things.
Obviously the upcoming 9070 series is the most sensible wrt perf/$ and longevity, but an OC'd 7800xt ain't bad for 1440p native. The 6800xt is okay. None of them are unplayable at 1440p.
7800/9070 for 1080p RT, 9070xtx for 1440p upscaled RT (from 960p). I'm sure there are instances where a 4070ti/5070 will be okay, but I'm betting that more often than not it won't be, because they're under 45TF/16GB.
I'm hoping most logical people can see what's happening and understand my point.
I'm not trying to insult people that made that choice. I'm suggesting people make common-sense choices.
It's obviously not a VRAM issue when the 24GB 3090 and 7900 XTX are getting the same performance. Not quite sure what point you're trying to make. This looks completely normal with the expected strength of the GPUs. 4070-Ti/4070 Super and the 3090 are all around the same strength in regular performance and ray tracing, and they get the same performance and 1% lows, with AMD being trash in RT as usual.
You don't buy a 7000 series for RT. But if you bought an XTX for 4k60 raster, you could still upscale from 960p/1080p -> 1440p/4k with RT (with an OC), is my point. You can't do either with the similarly-priced 4070ti.
They don't have regular 1440p->4k upscaling non-RT on the graph, but that is where 7900xt shines and 4070ti would falter. Same with 9070 (1440p native, 1440->4k upscaling sometimes non-rt) vs 5070.
Do what you want...I don't really want to have this argument until after 5070 and 9070 series launch.
It will prove my point on a greater scale about the common sense of AMD and the constant upgrade cycle required with nVIDIA (outside 4090).
In reality, people should again be waiting for 192-bit parts (comparable to a PS6) or 256-bit parts (comparable to a 4090). Those will be aimed at what people actually want.
I don't care which company you buy from. I'm simply saying I wouldn't be buying a card for >1440p non-rt RN. And that is the point of the 9070 series (and the 6800xt/7800xt for that matter, granted the 9070 adds 1080p RT).
If you want native 4k max non-rt, a 7900xtx will get you that. So will a 5080. But a 5080 will also run out of RAM upscaling and/or at native 4k relatively soon. A 7900xtx will not, though it may eventually lack the raster.
Again, this is why you wait for the 256-bit 3nm card that will replace the 5080/7900xtx and perform like a 4090, or buy the step down from it, which will still be faster than those but have 18GB of RAM.
It will all become more clear as more games and their engines start to show signs of being prepared to be ported to the PS6.
There is a reason why the 4090 is exactly 1440p->4k upscaled RT. The PS6 would likely do that from 1080p, which means a card that performs like a 5080 (/9070 XTX?) but in some cases may use more ram.
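The pixel math behind that is straightforward; this only compares internal render resolution before the upscaler and ignores fixed per-frame costs, so treat the ratio as a rough upper bound on the extra GPU work, not an exact fps predictor.

```python
# Internal render resolution before upscaling to a 4k output in both cases.
ps6_internal = 1920 * 1080   # 1080p -> 4k (the likely console path)
pc_internal  = 2560 * 1440   # 1440p -> 4k (the 4090 path)

print(f"{pc_internal / ps6_internal:.2f}x the pixels shaded per frame")  # ~1.78x
```

Which is roughly the gap you'd want between a PS6-class GPU and a card you plan to keep for that whole generation.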
I'm sure we'll have this conversation more over the next year or so leading into whatever the PS6 and 3nm specs turn out to be...even though you can already predict them more-or-less.
(11264?) ~11520-12288sp @ 3.7-4ghz 256-bit/24GB
8192-9216sp @ 192-bit/18GB
I will buy the perf/$ card at 11264sp(+), because it should scale 1440p->4k in almost all instances where a PS6 is doing 1080p->4k. Exactly like a (at worst overclocked) 4090. Exactly like how the 2080ti lasted through the PS5 generation.
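Running those predicted specs through the same FP32 formula as the earlier sketch shows why I keep saying "4090-class"; the clocks are my guesses (no clock is listed for the 192-bit part above, so I assume a similar range), so this is speculation, not a spec sheet.

```python
# Same rough formula: shaders x 2 x clock (GHz) -> TFLOPS. All speculative.
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

# Predicted 256-bit/24GB part, low and high end of the guess:
print(f"11264sp @ 3.7GHz: ~{tflops(11264, 3.7):.0f} TF")  # ~83 TF, i.e. roughly a stock 4090
print(f"12288sp @ 4.0GHz: ~{tflops(12288, 4.0):.0f} TF")  # ~98 TF

# Predicted 192-bit/18GB part, assuming similar clocks (my assumption):
print(f"8192sp @ 3.7GHz:  ~{tflops(8192, 3.7):.0f} TF")   # ~61 TF
print(f"9216sp @ 4.0GHz:  ~{tflops(9216, 4.0):.0f} TF")   # ~74 TF
```

Both tiers clear the 45TF line comfortably, and the 256-bit one lands right in 4090 territory, which is exactly the card I'd wait for.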
If you want to buy a 192-bit card then I wouldn't blame you, but it's really a toss-up between that and a PS6 (if you're just gaming). The cards will likely come first and (I would hope) be similarly priced or cheaper.
Again, nothing available rn makes sense for over 16GB/1440p (1080p RT) unless you bought a 4090. 1440p (1080p RT) DOES make sense;
nVIDIA is taking the over/under on it, because that's how they getcha.
And 9070 will hit it on the head.
That's exactly why AMD didn't make a faster card this gen. They couldn't beat a 4090, and that card will scale for a long time at precisely what you see here. AMD needs to match it; again, that is the point of 3nm.
They will likely have to beat the price of a 6080 with similar characteristics, say $1000 or maybe AMD pricing it at the Ti level, and then we can all be happy for a long time, whatever brand we personally choose.
That's the main take-away I want people to understand. Look at how this game performs on a 4090.
Look at Time Spy GT1 scores (and imagine a card at 120fps; essentially a 4090).
Everything is in flux, but the 4090 is the centerpiece of the next generation of gaming, more-or-less. 1440p->4k RT upscaling. 4k60 native max w/o RT.
The next step down would be 1080p->4k RT; a 5080 (9070xtx?) with more RAM. This would be your native 1440p card moving forward.
The 5080 and 9070 (arguably the 6800xt/7800xt/4080) are good limbo picks for 1440p. Some of those cards are/will be vastly cheaper than others. This is what people should hold themselves over with, IMHO.
If you don't believe me, then just wait and see. We'll see who's right.
I know I am, but that's not the point. The point is I want people to be happy and not foolishly waste their money. I also want nVIDIA to stop the shenanigans, and you all should too.