but it's still more innovation in their hardware than Nvidia did since Turing, which is nothing.
Do I need to remind you that Turing, a six-year-old architecture, is STILL more feature-rich than whatever AMD has in stock? There is zero software that runs on AMD GPUs but not on Turing. However, a lot of prosumer stuff refuses to run (at least tinker-free) on AMD GPUs. Linux drivers might be better for AMD, but that's not a game changer. Sure, even a 2080 Ti is slower than the top-dog RDNA2 and RDNA3 products, but it'd be ridiculous if it weren't.
They made the first chiplet GPU
Sure, it's an innovation. I forgot to mention it. However, why did they abandon it right away? Seems silly.
All they're giving us is more software gimmicks on the same hardware that has a different codename for some reason.
They're giving us more and more quality in said gimmicks, so they no longer are gimmicks. DLAA (which is NOT upscaling) is a brilliant way to improve your visuals without buying a more powerful monitor (FSR and XeSS are very far behind there). DLSS (which IS upscaling) is a great tool to give your ageing GPU another couple of years. RT... I can't give NV credit here conceptually, because they didn't invent it, but at least their hardware does it better than anything else. Also, if you want to buy the best raster performer, it's an NVIDIA GPU. It has been for roughly two decades straight, with two brief periods of AMD GPUs being slightly faster. Sure, it's expensive, but it's the best. The best must be expensive.
I don't care about fake frames
Me neither.
What more is there to "finish"?
Let's start with the fact that you have more options on an NVIDIA GPU. You have better image quality, because even DLSS Quality beats anything FSR/XeSS can do, even without upscaling. You have better RT performance. Your GPU can also do non-gaming stuff. Your driver control panel provides more functionality (oh no, I can't measure FPS with NVCP!) and is still better designed, despite effectively being a W98-era dinosaur.
AMD GPUs tank in recent titles because they can't handle RT. NVIDIA GPUs have some wiggle room (unless it's one of those 8 GB nonsense cards, which I despise).
I also have no reason to see it as anything other than an attempt to establish a monopoly which is bad for every consumer, regardless of which side you prefer.
We got a monopoly precisely because AMD doesn't do anything like that. We would have a healthy market if there were features exclusive to Radeons. Killer features, that is. Like, imagine Radeon GPUs having some advanced feature that makes models and textures from older titles look much more up to date on the fly, without any meaningful performance impact. Or literally anything gamers would enjoy, like, I don't know, a TAA artifact mitigator that makes image quality and stability even better than DLAA.
They, however, just laze out and copy what NV does, but so much worse it's not even worth considering.
To me, it's a graphics card that plays my games like any other, just slightly cheaper.
I wasn't talking hardware, I was talking software. Hardware is fine from both parties (with an asterisk: AMD GPUs really lack RT performance). Software is relatively good from NV and beyond horrible from AMD.