Also, please note that the calculator you're quoting quite obviously contradicts your concerns about peripherals, PCIe networking, etc. It even assumes
much lower power numbers per fan, SSD, HDD, etc. than my formula does - in other words, my calculations have more built-in headroom than that table. The only thing I really disagree with in those numbers (aside from the margins added at the end, which start out sensible, then veer into plain silliness) is the ridiculous power draw allotted to the motherboard, particularly when that number doesn't also account for RAM. A high-end, feature-packed 2020 motherboard with 4 sticks of fast RAM is unlikely to consume more than 50W, even including VRM losses when powering a 250W CPU like an i9-10900K. You might see a combined 75W for mobo+RAM with a TB3/10GbE-equipped motherboard when those controllers are under heavy load - though that's highly unlikely to coincide with a heavy CPU+GPU load. Or do you tend to do long-term continuous >1Gbps data transfers while gaming? The same goes for the ODD - which <1% of PC builds in 2020 have at all - how often is that going to run full tilt at the same time as the CPU and GPU? Are you ripping Blu-rays while gaming? Do you see that as a common use case for a PC?
You don't seem to grasp the crucial point here, which means I have to repeat it yet again:
normal consumer workloads never ever stress every component to 100% at the same time. It doesn't happen. Period. Games
never stress the CPU
and GPU to 100%, which means that starting with real-world maximum power draw numbers for each of those
already includes a significant margin. I mean, just look at the spread in TPU's 10600K review whole-system power draw numbers: 162W under Cinebench, 191W under Prime95, and 383W while gaming. The test setup uses an
EVGA 2080 Ti FTW3, which alone consumes 304W average while gaming. That means
the rest of the system is consuming around 79W while gaming. CPU, motherboard, VRMs, SSDs, RAM, USB, PSU efficiency losses,
everything. Do you see how that leaves
a lot of headroom if you budget ~125W for the CPU alone, plus ~50-75W for the rest of the system? That's roughly 120W of margin (~200W budgeted vs. ~79W measured) just from base component numbers,
before my added safety margin. There are of course games that load the CPU harder than TW3, but 120W harder? Not even close. My calculation for that same setup would come out at ~490W (~125W CPU + 304W GPU + 35W motherboard/RAM + 25W storage and cooling, depending on the specific configuration) + 20% margin, or a 590W recommended PSU (550W would be fine, but cutting it a bit close, so below what I'd recommend). Yet the real-world gaming numbers for that exact setup are below 400W. And you somehow claim that my calculations are unsafe? Based on what, exactly?
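To make the arithmetic explicit, here's my formula as a quick Python sketch - the function name and parameter defaults are just how I'd jot it down, but the component numbers are the ones from above:

```python
def recommended_psu_w(cpu_w, gpu_w, mobo_ram_w=35, storage_cooling_w=25, margin=0.20):
    """Back-of-envelope PSU sizing: sum real-world maximum draws per
    component, then add a flat safety margin on top."""
    base = cpu_w + gpu_w + mobo_ram_w + storage_cooling_w
    return base * (1 + margin)

# i9-10900K (~125W gaming) + EVGA 2080 Ti FTW3 (304W average while gaming):
print(round(recommended_psu_w(125, 304)))  # 587 -> the ~590W recommendation
```

Note that the base sum (489W) is already well above the ~383W this setup actually pulls while gaming, before the 20% margin is even applied.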
This is, for the record, also a case where Nvidia's recommended PSU numbers align decently: they recommend a 650W PSU for the 2080 Ti (as does EVGA for that specific OC model), though you could perfectly safely game on this setup with a high-quality 500W unit - with more than 100W of headroom! - just don't run FurMark + P95 on it. Step down to a less power-hungry CPU and/or GPU and you're looking at a smaller recommended PSU - a Ryzen 5 3600 + 2070S would cut ~45W and ~75W from the base numbers, for example, bringing the recommended PSU down to ~450W including the 20% margin, with real-world power draw likely closer to 300W. Heck, I've seen enough people run undervolted 2080 Tis off 400W SFF PSUs to know that's entirely feasible as long as you're comfortable living on the bleeding edge. Yet for a setup like that you're more likely to find people saying "get a 750W PSU just to be safe" (or the classic "aim for 2x power draw, so get an 800W unit") than making reasonable recommendations based on actual data.
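Same back-of-envelope math for the downsized build - plain arithmetic with the numbers from above, nothing fancy:

```python
# 489W base for the 10900K + 2080 Ti build, minus the CPU and GPU savings
base = 489 - (45 + 75)     # ~369W base for a Ryzen 5 3600 + 2070S build
recommended = base * 1.2   # flat 20% safety margin on top
print(round(recommended))  # 443 -> round up to a ~450W unit
```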