A 2080 Ti matches a 3070 and 4060 Ti in performance. The 6800 XT, 7800 XT, and 3080 are a tier above:
Remember when overclocking wasn't dead?
Is it really dead? Not unless you bought something 3000-series or later. Also, again, look at keeping 60fps; look at each game. You'll start to see what NVIDIA does.
Imagine a 2080 Ti around where that overclocking puts it (at 1440p it *can* struggle, true, but for 1080p it's fine). May I also remind you the 4060 Ti launched at $500 for the not-8GB edition...and this is/was $250. You'll see why it was (and can still be, for mostly 1080p and sometimes 1440p) a good value, especially with DLSS. It will hit those cut-offs with an overclock; the Radeons hit them at 1440p.
The same goes for the 6800 XT somewhat, and the 7800 XT more so in many instances. For the price difference you really are not giving up much, if anything, plus you get 4GB more VRAM. RT, yes, but I'll talk about that whole shebang down below.
You may laugh, you may cry. Or, you may fail to understand. It's your money. Info is just there if you're the type.
Perhaps that is the biggest difference between now and "the good ol' times": the used GPU market has exploded since 2020 (the RTX 30 series).
It's a chicken/egg scenario. Used cards are worth more because of low supply and rising MSRPs. New cards are priced higher because there's fluid competition in the used market and no reason to undercut those cards.
You would think so...but it *generally* is not the case IME. Used prices are still often a *much* better value than new cards, especially when people don't understand things like the 2080 Ti vs 4060 Ti comparison, etc.
AMD's prices have held pretty steady because...well...they've released the same card (more or less) going on three times now. Yeah, the 7800 XT overclocks better; yeah, the 9070 XT has FSR4 and better RT; but still.
This might finally be the end of the road for it, though (6800 XT/7800 XT/9070), at least in this segment. It'll likely drop to the lower tier pretty soon and next go-round. The same probably goes for the 9070 XT, though.
And everything below a ~5080 level of raster (and even that still needs >16GB to be a higher class of card). You'd think the 4080 is okay...but...activate DLSS for 4K RT and *poof*, sub-60. 960p->1440p is probably okay.
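For reference, the internal render resolutions behind those "960->1440p" and "1080p->4K" figures work out from the upscaler's scale factor. A minimal sketch, assuming the commonly cited DLSS/FSR quality-mode ratios (my assumption, not something stated above):

```python
# Upscaling: internal render height = output height * scale factor.
# These scale factors are the commonly cited DLSS/FSR defaults (assumed).
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def render_height(output_height, mode):
    """Internal render height for a given output height and upscaler mode."""
    return round(output_height * SCALES[mode])

print(render_height(1440, "Quality"))      # 960  -> the "960p->1440p" case
print(render_height(2160, "Performance"))  # 1080 -> the "1080p RT -> 4K" case
```

So "1080p RT performance" is really a proxy for 4K-with-Performance-upscaling, which is why it keeps coming up as the baseline below.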
I mean, think about it. The low end next gen is probably 128-bit/16GB with the same bandwidth as this gen's 256-bit/GDDR6, and 3700MHz+ clocks where this gen was as low as 2430MHz. That's a BIG jump. Then 192-bit/18GB.
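The "same bandwidth on half the bus" point is just arithmetic: bandwidth is bus width (in bytes) times per-pin data rate. A quick sketch, where the 20 Gbps GDDR6 and 40 Gbps GDDR7 rates are illustrative assumptions of mine, not figures from the text:

```python
def mem_bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width / 8 bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# This gen's 256-bit GDDR6 at an assumed 20 Gbps per pin:
print(mem_bandwidth_gbps(256, 20.0))  # 640.0 GB/s
# A hypothetical next-gen 128-bit GDDR7 card at an assumed 40 Gbps per pin:
print(mem_bandwidth_gbps(128, 40.0))  # 640.0 GB/s -- same bandwidth, half the bus
```

That's how a 128-bit card could land in the same bandwidth class as today's 256-bit parts, if GDDR7 per-pin rates roughly double.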
The shift next generation is going to be pretty monumental, and a lot of cards are going to get the shaft for how people want to use them. Again, JMO; people have different standards. I just try to keep mine.
I keep saying to watch 1080p RT and 1080p RT -> 4K upscaling, as that will probably be the baseline, but we can't even assume it will be decent on a 5070/9070. Maybe not even a 9070 XT (but more likely).
Again, I wouldn't *really* recommend anything below the highest-end N48 (a 9070 XT, probably, for 1440p; a higher-end card, if it exists, for 4K RT upscaling), but that's not what we're talking about.
We're talking the outdated $300 12GB 3060 vs similar-priced options that are actually 'okay'. Both of those are; the 3060 is not. None of them are going to be great once RT stabilizes (like how it's used in BMW).