Just like some 5080s are tickling a 4090's undercarriage with tuning.
They really are not in practical terms, and again this is by design: partially due to RAM limitations, partially due to not having *quite* enough (guaranteed) compute potential.
nVIDIA disguises this limitation a little with a boatload of extra bandwidth (the 5080 only needs ~22Gbps to operate at stock; that extra bandwidth can translate into ~6% extra perf in many cases).
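To put rough numbers on that bandwidth headroom, here's a napkin-math sketch. The 256-bit bus and 30Gbps stock GDDR7 are my assumptions about the 5080's memory configuration; the ~22Gbps figure is from the argument above:

```python
# Effective memory bandwidth: GB/s = Gbps-per-pin * bus-width-in-bits / 8 bits-per-byte.
def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

stock  = bandwidth_gbs(30, 256)   # 5080 at stock GDDR7 speed (assumed 30Gbps/256-bit)
needed = bandwidth_gbs(22, 256)   # the ~22Gbps it "needs" per the argument above

print(f"stock:  {stock:.0f} GB/s")            # 960 GB/s
print(f"needed: {needed:.0f} GB/s")           # 704 GB/s
print(f"headroom: {stock / needed - 1:.0%}")  # ~36%
```

That ~36% of spare bandwidth is what gets soaked up as you push the core, which is where the "~6% extra perf" shows up.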
This is why you should laugh when (something like a proposed) 18GB card surpasses it in many ways: the buffer is no longer the primary bottleneck at higher speeds, and slightly higher stock compute (both important to keeping 60fps in many games) lets it match.
Like I say, you can believe it or not, but this is what nVIDIA does. Some people will eventually understand and some never will. That is their prerogative, but it's still true, and why people should understand.
The 5080 has a stock clock (especially important wrt RT) of 2640MHz. Yes, it can overclock to ~3150MHz or so, but 16GB becomes a limitation: where ~120fps might otherwise be approximately equal to stable ~1440p60 RT, it can't hold it.
This is why benchmarks matter, and why 100fps (and next 120fps) is important in that bench. It is also why ~20k is important in Unigine (like in that under-volting video)...they all roughly equate to the same standards.
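A quick sketch of why that clock delta matters for compute. This assumes the 5080's 10752 shaders and the stock/OC clocks above, using the usual FP32 estimate of 2 FLOPs (one FMA) per shader per cycle:

```python
# Rough FP32 throughput: TFLOPS = 2 ops (FMA) * shader count * clock in GHz / 1000.
def tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

stock = tflops(10752, 2.64)   # 5080 stock clock
oc    = tflops(10752, 3.15)   # typical max overclock
print(f"stock: {stock:.1f} TFLOPS, OC: {oc:.1f} TFLOPS (+{oc / stock - 1:.0%})")
```

That ~19% compute swing between stock and OC is exactly the gap the 16GB buffer ends up gating, per the point above.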
This is why the 9070 XT clocks where it does and has 16GB...and hits 100fps/20k...which is probably similar to stable 1080p60 RT. Many cards have the 5080 problem, including the 9070/5070/5070 Ti, for different reasons.
9070 XT knows what it is, and knows what it isn't. It is a well-matched card that will hit the spec it needs to, which is keeping 60fps in 1080pRT or 1440p 'quality' up-scaling (960p->1440p), perhaps with a lil' tweak.
On a 5080, 1080p->4K up-scaling will be better; going truly native will likely be the point of a '9080 XT' or something. But it's still alright with a 9070 XT if you use something like frame-gen; native framerate/lag is still okay.
That is why the 9070 XT is a good product. It hits all those targets and is inexpensive. The 5080 (besides 1080p->4K) isn't really going to give a much more stable experience; it's not a 1440p RT card...not *really*.
You can be amazed by the 'tremendous architectural advancements' and whatever else they use next gen to sell cheaper cards that surpass those that were formerly more expensive.
All the while, I will have already explained exactly what the former limitations were (fixable within silicon/RAM choices), but nVIDIA chose to keep them in order to sell the fix later as new product(s).
Sometimes several times, to showcase the different areas a product was originally limited in. Like I say, there are instances where >16GB will matter, and other instances that need more core clock than most 5080s can OC to.
Again, I have explained why the 9070 XT is *perfectly* oriented towards 1080p RT or 1440p 'quality' up-scaling. The 5080 is NOT a 1440p RT card, nor one step up as the 4090 currently is; hence the 5080 is not a great card.
It is faster, but not generally in any way that will give you a tangibly better experience. This is why the cheaper Radeon makes sense; for all intents and purposes it can do the same things well.
This pressure would largely be relieved (and equalized) on something like 9216sp @ ~3700-3780MHz/36Gbps and 18GB, which I can almost guarantee to you, however far in advance, would hit 120fps in Time Spy. That would be both a better product (in terms of keeping 60fps in many instances at 1440p RT) and, perhaps most importantly for you to understand, cheaper for nVIDIA to make. It would also manifest as better 1080p->4K up-scaling capability (than a 9070 XT/5070 Ti, but perhaps similar to a 5080, depending upon whether >16GB of RAM is needed to keep 60fps).
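Plugging that hypothetical part into the same napkin math. The part itself is speculative, the 192-bit bus is my assumption (18GB of GDDR7 implies 3GB modules on 192-bit), and the 5080 figures assume 10752 shaders at the stock clock above:

```python
# FP32 estimate: 2 ops (FMA) * shaders * clock in GHz / 1000 = TFLOPS.
def tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

# Bandwidth: Gbps-per-pin * bus width in bits / 8 = GB/s.
def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

hypo_compute = tflops(9216, 3.7)       # hypothetical 9216sp part at ~3.7GHz
hypo_bw      = bandwidth_gbs(36, 192)  # 36Gbps on an assumed 192-bit bus
stock_5080   = tflops(10752, 2.64)     # 5080 at stock, for comparison
print(f"hypothetical: {hypo_compute:.1f} TFLOPS / {hypo_bw:.0f} GB/s "
      f"vs 5080 stock: {stock_5080:.1f} TFLOPS")
```

Such a part would carry ~20% more stock compute than a stock 5080 on a narrower, cheaper bus, which is the "better product, cheaper to make" point.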
In that case, a '6070' would be a '4090'...except when you need a larger buffer (which is largely true in keeping 60fps at 4K, and will become more true at 1440p fairly soon, especially with up-scaling/RT/FG).
That is why I call 16GB cards (including the 9070 XT) 1080p cards. It is a 1080p RT card (that can upscale). It is the bottom of this up-coming era, where more RAM is needed for higher res and (eventually standard) RT.
Again, we can pretty much assume the next-gen stack will be 1080p, 1440p, and 4K RT...with a card like the 9070 XT at the bottom (~6144sp @ 4200MHz/40Gbps/16GB? Maybe configured differently), and then scaled up from there.
Where 1440p is a ~4090 or a high-clocked ~12288sp card that replaces it, the 5080 (through its replacement) is a 1080p->4K card. Then a halo for native 4K RT or 1440p (and 4K up-scaled) path tracing.
It all lines up. The only cards that currently fit are ~4090 (to a 12288sp part, perhaps clocked lower at first and then higher later) and 9070 xt to a low-end part (with higher clocks as AMD generally uses).
It's possible we see this exact same battle with AMD using 6144sp/16GB and nVIDIA 9216sp/18GB, doing this exact same thing for the exact relative prices but one tier down.
nVIDIA will look better in *some* instances, but things will change as RAM/compute requirements increase and things *actually* stabilize to the next-gen consoles; how that manifests, IDK.
That's why 1440p requirements are murky. Will 5080-level (with more RAM and/or compute) be enough? I don't know. That's why I say think of it as a 1080p->4K card, sometimes 1440p (as that product *kind of* is now).
Next tier up is 1440p RT, and 4K raster or quality up-scaling, as the 4090 is now (but it may not always be capable of keeping the latter at 60fps). This is why a faster part (an nVIDIA refresh, or maybe AMD right away) is needed.
This is why I've said that in order for the 5080 to not be outdated (because of the difference in compute potential between such parts, which they could then outdate through software demands if not new games), the 24GB 5080 needs to clock to ~3.23-3.24GHz. The process *will* allow this. nVIDIA almost certainly will not, and it is for these reasons: to sell the more-or-less same part multiple times, and outdate each one through a different limitation.
Follow? At the end of the day the 5080 16GB becomes just as much a 1080p card as the 9070 XT, and to me it already is, which I find humorous. It just has better up-scaling performance at 1080p->4K.
Again, AMD is already prepping for this by pricing lower and not selling the card as something it is not. It is a 1080p RT card with the ability to upscale to 1440p. The 5080 can upscale to 4K. Paying for that is your choice.
And it likely will continue to be. I have no doubt AMD will sell a better-matched card for actual next-gen 1440p that performs better than whatever nVIDIA does to replace the 5080. Because this is what they do.
Yeah...I know. Wall of text. Sometimes I just edit rather than making a new post. Bad habit of mine and I apologize...It just keeps everything together (although I know it becomes a lot in one post).