A 175W mid-to-low-end part is not "efficient" by any stretch of the imagination. HEY GUYS, LOOK AT MY "NEW AND EFFICIENT" LED LIGHT BULB... IT ONLY CONSUMES 175W VS THAT OLD VINTAGE RETRO 60W LIGHT BULB!!! BUT... it's just a crappy lamp... and it SUCKS way MORE POWER?? Ah no, BUT look! IT IS BRIGHTER!
Absurd.
Regardless of performance, a single watt past 120W constitutes an energy hog. Why? On paper this is a 75W GPU like the 1050 Ti / 1650, PERIOD. AMD is pushing that 7nm node advantage to HELL, hence the stupidly high 175W TDP.
You have some very, very, very odd ideas about GPU segmentation and efficiency. I mean ... yes, this GPU consumes 175W. But that's a pretty heavily factory-OC'd version - the stock ones consume ~150W. So in terms of power draw it sits in the upper midrange, though there have been high-end cards not far above this. That's not too strange - below 200W, GPUs tend to be tightly stacked. Sure, it consumes a lot more power than the nominally higher-"ranked" RX 570, but ... at the time the RX 570 was relevant, AMD didn't have a single GPU capable of competing above $300 (and that was before GPU prices went bananas).
You're classifying GPUs without considering performance at all, which is utterly and completely absurd. I mean, what is efficiency? It's the ratio between input and output - the less input and the more output, the more efficient, whatever you're measuring. Fewer results for more work = less efficient. You're saying "this consumes a lot of power, so it's inefficient," yet ... it outperforms the RTX 3060, which consumes more power. Are you railing at RTX 3060 reviews in the same way, shouting about what inefficient garbage GPUs those are? Come on, man, get your act together.
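To make the ratio concrete, here's a minimal Python sketch of exactly what the lightbulb analogy gets wrong; the lumen and wattage figures are made up purely for illustration, not measurements of any real product:

```python
def lumens_per_watt(lumens: float, watts: float) -> float:
    """Efficiency is output divided by input - here, light out per watt drawn."""
    return lumens / watts

# Illustrative numbers only (assumed for the sake of the analogy):
old_bulb = lumens_per_watt(lumens=800, watts=60)     # ~13.3 lm/W
big_lamp = lumens_per_watt(lumens=4000, watts=175)   # ~22.9 lm/W

# The 175 W lamp draws ~3x the power, yet it is the MORE efficient device,
# because its output scales faster than its input. High absolute draw
# alone tells you nothing about efficiency.
print(f"old bulb: {old_bulb:.1f} lm/W, big lamp: {big_lamp:.1f} lm/W")
```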
You also say "on paper, this is a 75W GPU" - how, exactly? Seriously, which specs align with that? And does actual real-world performance reflect that statement in any way, shape or form? I mean, look at the chart @W1zzard posted above, ffs. This performs a full 25% better per watt than the GTX 1650 Super. So it consumes 2x the power, but it also performs 2.5x better. That? That's more efficient. And that's for an overclocked SKU with a significantly (25W, ~16%) higher power draw than stock.
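Here's that arithmetic spelled out, using only the relative figures quoted above (the 2x and 2.5x ratios come straight from the chart discussion; no absolute wattages are assumed):

```python
# Relative figures from the chart discussion above: ~2x the power draw
# of a GTX 1650 Super, ~2.5x the performance. Only the ratios matter here.
power_ratio = 2.0
perf_ratio = 2.5

perf_per_watt_gain = perf_ratio / power_ratio          # = 1.25
print(f"{(perf_per_watt_gain - 1) * 100:.0f}% better performance per watt")
# -> "25% better performance per watt", matching the figure cited above.
```

Doubling the denominator while more than doubling the numerator is, by definition, a net efficiency gain.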
If all you're going from is "this is a x600 tier card" - well, then you need to adjust your frame of reference. GPU numbering counts down from the highest-end card. The faster those get, and the broader the range of useful performance, the higher up the stack any given tier below the top sits. In the GTX 900 series we had 5 relevant SKUs (980 Ti, 980, 970, 960, 950). In the GTX 10 series there were 8 (1080 Ti, 1080, 1070 Ti, 1070, 1060 6GB, 1060 3GB, 1050 Ti, 1050). In the RTX 2000/GTX 16 series there are 7 + 6 (2080 Ti, 2080 Super, 2080, 2070 Super, 2070, 2060 Super, 2060, plus the GTX 1660 Ti, 1660 Super, 1660, 1650 Ti, 1650 Super, 1650). See how things are expanding?

On top of that, AMD's previous GPUs were segmented to make their highest-end chip at the time - a midrange GPU, not even upper midrange - look more "high end" by calling it an "80". Hence the RX 580 being soundly outperformed by the RX 5600 XT, despite the latter nominally sitting two tiers further down the stack.
So no, there is nothing about this GPU that puts it in the same tier as 75W GPUs. That's obviously not to say it isn't overpriced - it very much is. In a sane world, this would be a $300 GPU. But it's not a $150 75W GPU, and it never will be.
Oh, and another thing about efficiency: for rasterized graphics, close-to-stock 6600 XTs are the second most efficient GPUs in existence. I mean, if you don't want a GPU that performs at this level, that's perfectly fine. That doesn't make the ones that do any less efficient. Your lightbulb analogy makes no sense whatsoever.
Edit: derp