Just a 6600 XT replacement. AMD dropped the XT from the final card name to give itself the right to compare it against the RX 6600 instead of the RX 6600 XT. But based on market pricing and specs (number of cores), this is just a 6600 XT replacement. So what we get is effectively a refresh, not a new generation. Granted, it's RDNA3 and not RDNA2, but do we really see any difference in performance? RDNA3 is a lost generation for AMD.
At least they're trying to be realistic and price the card in line with current market pricing. They could have priced the 7600 at $300+, and while we here would be laughing and throwing "DOA" all over the place, many people out there would be paying those extra dollars over a 66x0/XT because some seller told them this is a newer, better card than the RX 66x0/XT.
Anyway, this generation in TL;DR form:
Nvidia: better performance at higher prices; result: stagnation.
AMD: the same performance at current prices; result: stagnation.
How nice. Hope we get out of this coma soon.
----------
@W1zzard I will object even here about those 8GB conclusions. While you are the expert and I'm the noob who just "read/saw it on the internet", an article where you prove that 8GB is enough would be something worth having. While the frame rate in some games may be smooth and paint a certain picture, videos have shown that the time a VRAM-limited card spends constantly loading and unloading textures has a big impact on visual quality. So maybe FPS numbers alone don't tell the whole truth. Think of it this way: if programmers decide they can offer smooth gaming without VRAM optimizations, by just streaming textures in and out and betting on user ignorance about what to expect from a game visually, then you will keep saying "8GB is enough" even after a dozen or more examples of games delivering inferior visual quality in a VRAM-limited scenario. Run some tests now and see for yourself, if you haven't done so already, because in a year or two things could turn bad and people may start questioning your review conclusions.
Lastly, I got a little triggered about that backplate and "AMD's problem". I understand your points there, and AMD should have tested more cables, but I thought that no matter the conditions, not inserting a cable ALL the way in is a USER ERROR, not Nvidia's (sorry, AMD's) mistake. That's what Steve from Gamers Nexus tells us every time he has to do some damage control for Nvidia on this matter. (Obviously I don't have a problem with you here.)
And more seriously, is there really a fire hazard with the 8-pin cable? 12-pin cables catching fire, and excuses about "not all the way in", might not apply to the old, tested, proven 8-pin connector. Any tests/videos/articles that say otherwise?