I think it's rather interesting that you decided to choose random cards...and presumably treat that as the standard comparison.
Let me explain why your retort is...not particularly valuable as a counter.
1) I never spoke about power efficiency...because my gaming PC doesn't run 24/7. If we assume a 100 watt difference in power draw...because I'm going to give you that...and roughly 2 hours of gaming per day (call it 10 hours per week), then I use about 1 kWh of extra electricity per week, 52 weeks per year. In California electricity runs about $0.26 per kWh...or an annual cost of $13.52.
2) If I were to sell a 3080...which is roughly where the 4070ti competes at 4k (and not the card you decided to cite), it's going for around $600 right now on eBay...based on a rough average of low and high listings. 900 - 600 = 300. 300 / 13.52 ≈ 22 years. Yes, to make up for purchasing a new 4070ti and shedding a 3080, it would need only 22 years to pay itself back with an average 100 watt savings and 10 hours of gaming per week (see the quick sketch after this list).
3) Let's also ask about some stupidity from PCPartPicker. I can choose a low end 4070ti and a high end 3080, and the prices vary wildly. Of course, earlier this week there was a Newegg deal where a 3080 was at about the $700 mark:
Newegg 3080. Prices are highly variable...but I'd wager that with any real effort you could also find something like Best Buy putting these on even deeper clearance pricing:
Best Buy - FE card clearance
4) Finally, I'll wrap up with one last issue. Why did you choose 2560x1440? It's almost like you overlook the vast VRAM requirement differences between 1080p, 1440p, and 4k. Of course the 4070ti and 3090ti (along with the 3080) are basically the same at lower resolutions. Crank that up a little bit, and they're still interchangeable. For giggles, I could also drop down to SVGA resolutions and have all three within margin of error of each other. You've chosen the last resolution before these cards start showing limitations from their available VRAM. It's almost like the 3080, 3080 12G, 3080ti, and 3090ti are all basically indistinguishable from testing error when playing at 1080p...but they demonstrate their relative value when you go to 4k...because otherwise what value does a $1500 msrp card show over a $700 msrp card?
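For anyone who wants to check points 1 and 2, here's a minimal sketch of the payback math using the assumptions above (100 watt delta, 10 hours per week, $0.26/kWh, a $300 price gap); swap in your own numbers if your situation differs:

```python
# Payback math for replacing a used 3080 with a new 4070ti, using the
# post's assumptions (all of these inputs are estimates, not measurements).
WATT_DIFFERENCE = 100        # assumed power savings of the newer card, in watts
HOURS_PER_WEEK = 10          # ~2 hours of gaming per day
PRICE_PER_KWH = 0.26         # rough California residential rate, $/kWh
PRICE_GAP = 900 - 600        # new 4070ti price minus used 3080 resale value

kwh_per_week = WATT_DIFFERENCE * HOURS_PER_WEEK / 1000    # 1.0 kWh/week
annual_savings = kwh_per_week * 52 * PRICE_PER_KWH        # ~$13.52/year
payback_years = PRICE_GAP / annual_savings                # ~22 years

print(f"Extra energy saved: {kwh_per_week:.1f} kWh/week")
print(f"Annual savings: ${annual_savings:.2f}")
print(f"Payback period: {payback_years:.1f} years")
```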
If you don't want to read, let me TL;DR.
1) A 22 year payback on improved power efficiency for a gaming machine is...laughable as a reason to buy new hardware.
2) Picking an expensive card from one generation and a cheap one from another doesn't mean I can't buy a cheaper version of the one you presented as expensive. That's cherry picking a counterexample...and pretty backwards.
3) Speaking of cherry picking, Nvidia said the 4070ti had 3090 level performance...but you've chosen a lower resolution to help prop up a card with half the VRAM (see the quick pixel-count comparison below). Again, cherry picking.
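To put a rough number on the resolution point: 4k pushes 4x the pixels of 1080p and 2.25x the pixels of 1440p, which is exactly where a smaller VRAM pool starts to matter. A quick illustration (pixel counts only, as a crude proxy, not an actual VRAM model):

```python
# Pixel counts at common gaming resolutions. This is only a crude proxy for
# how framebuffer and texture pressure grow with resolution, not a VRAM model.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x the pixels of 1080p")
```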
Allow me to also suggest there are counters to my value argument. If you spend considerably more than 10 hours per week gaming at 4k, the 4070ti could pay itself back within its usable lifetime.
If you are looking to upgrade from a 1xxx or even 2xxx series Nvidia GPU to something new today, then it's more accurately a $200 difference in overall cost. If you keep the card about 5 years, that's $40 per year. You could theoretically justify that difference with the higher trade-in value of a card that is about 2 years newer...assuming you're playing the long game of depreciation and resale.
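Here's a quick way to stress-test those two counters; the hours, prices, and 5 year horizon are just the assumptions from the paragraphs above, so plug in your own:

```python
# Sensitivity check on the counterarguments: (1) heavier weekly use shortens
# the energy-cost payback, and (2) a smaller effective price gap (upgrading
# from a 1xxx/2xxx card) spread over ~5 years. All inputs are assumptions.
PRICE_PER_KWH = 0.26
WATT_DIFFERENCE = 100

def payback_years(price_gap: float, hours_per_week: float) -> float:
    """Years for the power savings to cover the price difference."""
    annual_savings = WATT_DIFFERENCE * hours_per_week / 1000 * 52 * PRICE_PER_KWH
    return price_gap / annual_savings

for hours in (10, 20, 40):
    print(f"{hours} h/week at a $300 gap: {payback_years(300, hours):.1f} years")

# Upgrader scenario: ~$200 real price difference, card kept about 5 years.
print(f"Upgrader cost: ${200 / 5:.0f} per year")
```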
Finally, the feature set. Nvidia tends to make new features not backwards compatible. Cool. DLSS 3.0 will be a selling point for this card in a few years, when it lacks the horsepower to natively render 4k with all of the new bells and whistles.
RT. I'm...not sold on it in any meaningful way. That said, the 4xxx series does it better than the 3xxx series. If that's an absolutely required feature, then you can find value in the new series.
Now, let me answer all of these retorts. $159 msrp. 1 GB of RAM. 2009 release. A midrange card, just like the 1070, 2070, and realistically the 3070, given that the 3060 was considered an entry level 1080p card. That's $222 adjusted for inflation, which puts it at the same price point as the 6 year old (2016) 1050ti. It's really silly that the midrange of 2009 is now competing on price with the entry level offerings of half a decade ago.
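For reference, that inflation figure is just a CPI ratio; here's a minimal sketch with approximate CPI-U values (my rough numbers for 2009 versus early 2023):

```python
# Rough CPI adjustment of the $159 (2009) midrange MSRP quoted above.
# The index values are approximate (assumed), so treat the output as ballpark.
MSRP_2009 = 159
CPI_2009 = 214.5     # approx. US CPI-U annual average for 2009
CPI_NOW = 300.0      # approx. US CPI-U for early 2023

adjusted = MSRP_2009 * CPI_NOW / CPI_2009
print(f"${MSRP_2009} in 2009 is roughly ${adjusted:.0f} in today's dollars")   # ~$222
```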
I get inflation. I get that there's a difference between 1 GB and 12 GB of RAM, despite the cost not really changing (I bought 2 GB sticks of DDR2 for the same price a decade ago as 16 GB today). I also get that things cost more overall, and that both AMD and Nvidia are pushing for more performance and driving the floor up with a plethora of new technologies. Now that I've said all of that, let me ask: how many people still game at 1080p? Almost a decade later, why were the top two GPUs in use in 2022 the 1650 and the 1060? It's because an $800 video card is idiotic when most people are aiming for 1080p and good-enough graphics...the next $200 GPU that can do that is literally going to print money...while the 4k halo products that cost an arm and a leg are poisoning the low end entry point to PC gaming.
It's Nvidia and AMD both understanding that the premium market may be smaller, but it's more profitable...even if it basically kills the broader market by pushing non-premium buyers toward products more than half a decade old, because those are the last offerings that provide good price to performance rather than peak performance at compromised pricing.
I say that as a person who primarily games on a 480 and 3930k system, despite owning a 5700x and 3080 system that's mostly used to crank out work. Sometimes it's not about RT everything, but about playing games that look good enough...and that doesn't necessitate a $1500 card.
Bringing this back to the review: $900 for a mid-high range card, a tier that cost $222 (inflation adjusted) about 14 years ago. We've most assuredly come a long way...and we don't seem to be in a better place for consumers.