You're assuming no one ever OCs anything in their rig, that they all buy Founders Editions, never get a higher-end CPU, aren't using any USB 3.2 / USB-C devices, have no SATA drives or HDDs, and so on. Keep in mind these benchmarks are on stripped-down systems.
A single HDD, for example, can consume 20W. SSDs are better, but Tom's shows, for example, that using a WD Blue instead of a Samsung 850 can lop 30 minutes of battery life off a laptop. It's not zero.
Nope. Please re-read - though there were a few details I left out: a core tenet of this approach is checking
real-world power draws, in other words
the numbers for the GPU you're planning to buy. Not generic numbers, not FE numbers (unless you're buying the FE), not total system power numbers. If you plan to OC, obviously factor that in, but the
vast majority don't OC, and besides, the 20% total overhead is typically sufficient to account for that. Getting a higher-end CPU doesn't make that much of a difference - CPU power draw scales far less than GPU power draw, except for the past two generations of Intel chips, of course (though even those are
far from their peak draws while gaming). But again, the 20% headroom accounts for that.
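To put that rule in concrete terms, here's a trivial sketch - the 240W/200W figures below are placeholders, not measurements for any particular build, and the function name is just mine:

```python
def recommended_psu_watts(gpu_peak_w, rest_of_system_w, headroom=0.20):
    """Sum the estimated peak draws, then add a flat headroom margin on top."""
    return (gpu_peak_w + rest_of_system_w) * (1 + headroom)

# Purely illustrative numbers: a 240W-peak card plus ~200W for everything else
print(recommended_psu_watts(240, 200))  # 528.0
```

The GPU figure is the one that matters: it should be the measured peak for the exact card you're buying, not a generic or FE number.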
USB devices generally consume little power, and they're unlikely to be in heavy use while the PC itself is under heavy load, like when a game is running. The same goes for drives - and as I said, the average gaming PC today has a single SSD and possibly an HDD. HDD peak power draw happens only during spin-up, so the chance of that coinciding with gaming is ... tiny. In-use power for a 7200rpm 3.5" HDD is <10W. But more importantly: cumulative power numbers add a lot of invisible headroom. Gaming never stresses both the CPU and GPU to their maximum power draw, let alone the rest of the system. So if you have a peak 90W CPU and a peak 150W GPU, you're never going to see 240W from those two components while gaming. Games don't load the whole PC to 100%. So in real-world usage that additional 20% is already on top of built-in headroom.
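If it helps, here's that point as numbers. The in-game load fractions are just assumptions for the sake of illustration, not measurements:

```python
cpu_peak, gpu_peak = 90, 150
sum_of_peaks = cpu_peak + gpu_peak            # 240 W if you just add the labels together

# Assumed in-game behaviour (illustrative only): GPU near its limit, CPU nowhere close
realistic_gaming_draw = 0.95 * gpu_peak + 0.60 * cpu_peak

print(sum_of_peaks, round(realistic_gaming_draw))  # 240 196
```

So budgeting for the sum of peaks already overshoots what those two parts draw together in a game, before the 20% margin even enters the picture.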
For me, I have a USB hub plugged into a USB-C port that I use to charge my phone, keyboard, mouse, and so on. Separately I have a USB 3.1 external 3TB HDD. I also have both a SATA SSD and an M.2 (the tests only have one M.2), and a PCIe wireless/Bluetooth card. This stuff all adds up; I bet there's an extra 50W of draw in there, and if I plug multiple devices into that hub it could be more. I don't think this is unusual - plenty of folks have much more.
That is definitely above average, if not
uncommon. As I said, most PC builds these days have a single SSD, and
maybe an HDD. Two years ago HDDs were ubiquitous, but not today. Mice and keyboards consume maybe a few watts each - they need to be USB 2.0 compliant, which means 2.5W max, though typically much less unless they have excessively bright RGB. Desktop USB-C ports output a maximum of 15W (5V/3A) - that's all the specification allows for without an external PSU. And your HDD, again, might peak at 20W, but is more likely idling at 1-3W or running at 5-10W while the PC is being stressed.
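Just to put rough numbers on your list, using the ceilings above plus a few assumed typical draws for the SSDs and the Wi-Fi card (those three figures are my estimates, not measurements):

```python
extras_w = {
    "USB-C hub (port ceiling)":     15,  # 5V/3A spec limit for the whole port
    "external HDD, in use":         10,  # ~5-10W running; 20W only at spin-up
    "SATA SSD (assumed active)":     3,
    "second M.2 SSD (assumed)":      7,
    "PCIe Wi-Fi/BT card (assumed)":  3,
}
print(sum(extras_w.values()))  # 38 W, and that's with everything at its ceiling at once
```

Even taking the worst case for every item, the whole list stays under your 50W guess, and in practice most of those devices sit near idle while you're gaming.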
This is why the rec from Nvidia is to have 650W for a 3070 and 750W for a 3080. I'm sure Nvidia will rec 550W+ for a 3060 Ti.
The thing is, even with your additional numbers, you get
nowhere near 650W. Not even close. The 3070 is a 220W (240W peak) GPU. Add a ~150W CPU, ~25W for the motherboard and RAM, 20W for a couple of drives, 20W for a few fans and an AIO pump, and another 10W for peripherals, and you get 465W, or 558W with a 20% margin. And again,
that system will never, ever consume 465W. Never. That's not how PCs work. You never have every single component under full stress at the same time, even for a millisecond, let alone long enough to trip the PSU's OCP. And remember, that's with a 150W CPU, not a 95W or 65W one. There is, in other words, plenty of headroom built into these numbers already. For any 3070 other than the FE, swap the 240W in the calculation for that card's peak power draw. It really isn't hard.
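Here's that tally written out, using the same figures as above; the only entry you'd change for a non-FE card is the GPU line (the dict and labels are just mine for illustration):

```python
build_w = {
    "RTX 3070 FE (peak)": 240,
    "CPU (~150W class)":  150,
    "motherboard + RAM":   25,
    "drives":              20,
    "fans + AIO pump":     20,
    "peripherals":         10,
}

total = sum(build_w.values())   # 465 W if everything peaked simultaneously (it won't)
with_margin = total * 1.20      # 558 W with the 20% margin on top
print(total, with_margin)       # 465 558.0 - still comfortably under a 650W unit
```

Nvidia's 650W recommendation has to cover worst-case builds and bargain-bin PSUs; size against your actual parts and the margin is already there.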