Used to run my 980 Ti with one cable, though now I'm using two since that was recommended.
Considering a 980 Ti is fine power-wise (only 250W TDP), it would be fine: NVIDIA has tested the 8-pin to go up to 175W safely, which is why some of their Quadros rated for 250W use only a single 8-pin (175W from the connector plus 75W from the PCIe slot covers the 250W). But running higher-TDP cards, 300W+, on a single cable is too much (unless it is the 12-pin, as that is rated for more).
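For reference, here's a minimal sketch of that budget math. The 75W slot and 150W/75W connector figures are the standard PCIe specs; the ~175W headroom and the 12-pin rating are taken from the claims above and common reporting, so treat those as assumptions rather than official numbers:

```python
# Rough GPU power-budget check. The PCIe slot supplies up to 75W; each
# 8-pin PCIe connector is spec'd at 150W (reportedly tested safe to ~175W);
# each 6-pin at 75W. The 12-pin figure is a commonly cited number, not an
# official spec I can vouch for -- treat it as an assumption.
SLOT_W = 75
CONNECTOR_W = {"6pin": 75, "8pin": 150, "12pin": 600}

def rated_budget(connectors):
    """Slot power plus the rated wattage of each supplemental connector."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

# A 250W card on a single 8-pin has only 225W of *rated* budget (75 + 150),
# which is why it leans on the ~175W tested headroom of the connector.
print(rated_budget(["8pin"]))          # 225
print(rated_budget(["8pin", "8pin"]))  # 375 -- comfortable for a 250W TDP
```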
Actually no. Perf/W and perf/$ are derived from the three things that matter: **performance**, **price**, and **power draw**.
Using derived measures, one can complain to no end if one really wants to find fault. Is the performance better than the previous generation's? Complain about perf/W. Is perf/W okay? Derive again and complain that the increase in perf/W is smaller than it was last generation. And so on, and so forth.
Whatever the discussion on internet forums says, the primary parameters of a video card are the three I bolded above. The caveat is that some people do not care about power draw (for various reasons), while some (lucky bastards) can ignore the price altogether.
Please read through this entire post
While I would like to agree with you, I can't, because there are some things you are incorrect about. Nguyen is correct in that most consumers buying GeForce graphics cards care about performance/dollar first; that's why lots of people saw value in the AMD 500 series even though its performance/watt was disgusting compared to the 10 series: it offered much better performance/dollar. Who SHOULD care about performance/watt over performance/dollar? Laptop, workstation, and server users, because all of those scenarios have specific conditions that put more priority on performance/watt, such as limited power budgets. Why should DESKTOP GAMERS care more about performance/dollar than performance/watt? Because 80% (might be really off on this one) of gamers don't use their graphics cards to make money, unlike content creators or miners. So performance/watt shouldn't matter much for the desktop gaming market (but if you absolutely must care, because maybe you are the 1% of exceptions, the 30 series has higher perf/watt than the 20 series).
TLDR, generally speaking:
- Desktop gamers should care more about performance/dollar than performance/watt.
- Laptop, workstation, and server users should care more about performance/watt than performance/dollar.
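To make the distinction concrete, here is a minimal sketch comparing two made-up cards (every number below is hypothetical, purely to show how the two derived metrics can rank the same pair of cards differently):

```python
# Two hypothetical cards: raw performance (avg FPS), price (USD), power (W).
cards = {
    "CardA": {"fps": 100, "price": 200, "power": 220},  # cheap but power-hungry
    "CardB": {"fps": 110, "price": 330, "power": 160},  # efficient but pricey
}

for name, c in cards.items():
    perf_per_dollar = c["fps"] / c["price"]
    perf_per_watt = c["fps"] / c["power"]
    print(f"{name}: {perf_per_dollar:.3f} FPS/$, {perf_per_watt:.3f} FPS/W")

# CardA: 0.500 FPS/$, 0.455 FPS/W  -> wins on performance/dollar
# CardB: 0.333 FPS/$, 0.688 FPS/W  -> wins on performance/watt
# Which card is "better" depends entirely on which metric your use case
# prioritizes -- the desktop gamer picks CardA, the server admin picks CardB.
```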
However, your next point is spot on. Yeah, people literally go to extreme ends to find some sort of fault with Intel, AMD, and NVIDIA. My theory on this (as it relates to gaming) is that people have a never-ending lust for more FPS.
I target 1080p60; that's all I really care about. But a lot of other people wanted higher and higher FPS, higher than their monitors could even display, and then they complained about tearing, and adaptive sync was made.
I still target 1080p60, and I like to use V-Sync. Then a lot of gamers complained that they needed 1440p144, and once they got it, it wasn't enough; they needed higher refresh rates still, and now they need 1440p240. It has a quality cost, but according to the whole Earth, more FPS equals better.
Oh, by the way, I still target 1080p60 with V-Sync, and I get to enjoy no weird quality loss. Then a lot of people complained about CPU core counts not being enough, and we went from only needing 4c4t to 8c16t (okay, 4c4t isn't really enough in 2020, but 4c8t is still holding up), so we went from requiring a $250 CPU to needing a $500 CPU or bust.
Oh, by the way, I still target 1080p60; I am using a 3770K and a GTX 980 (got both used). Yes, an 8-year-old CPU can still keep up at 1080p60. I know, it's surprising.
Dang, this whole time it looks like most of us were just complaining for more and more, and not actually enjoying the video games we built these computers for... Yeah, the PC community in a nutshell. Honestly, don't target something stupid like 8K120, and you won't have heartbreak every damn time you drop under 240 FPS. At 8K resolution. Geez.
Okay, don't get me wrong, there is some merit to these complaints, but most of this stuff is overhyped. It's gotten to the point where people literally defend buying a 240Hz monitor for Roblox. I am not even kidding about this. People get hyper-competitive in Roblox FPS games, using an FPS unlocker and getting, lo and behold, 300 f*cking FPS in a Lego game. Yeah, this community is weird as f*ck. Anyway, I'm going off on way too many tangents...
Yeah, **performance**, **price**, and **power draw** are important figures, but not in isolation. A card running at 150W compared to a card running at 350W means nothing if we don't know why it draws more power. You have to compare the bolded figures against one another to actually say which card is better or worse.
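As a toy illustration of that point (all numbers invented), a hungrier card can still be the more efficient one once you put its performance next to its power draw:

```python
# Wattage alone says nothing; relate it to what the card delivers.
cards = [("150W card", 90, 150), ("350W card", 250, 350)]  # (name, FPS, watts)

for name, fps, watts in cards:
    print(f"{name}: {fps / watts:.2f} FPS/W")

# 150W card: 0.60 FPS/W
# 350W card: 0.71 FPS/W  -> the card drawing more power is the efficient one here
```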