Slightly off topic, but buying a PSU based on 50% load is a waste of money.
Technically yes. If you're only using 325W and the power supply can deliver 650W, then you are leaving half of the potential power delivery on the table.
However, the lifespan, efficiency, heat, and noise of a 325W PSU delivering 325W would all be bad, and I doubt it would survive much beyond the warranty period (if it even reached it at all) operating 24/7 at 100% load, at maximum (loud) fan speed, and at its maximum rated component temperature.
The other thing that prevents you from realistically using a 325W-rated PSU to power a PC with a recorded peak TDP of 325W is that the recorded peaks are actually averages. Even when a GPU review says the 3080 needs 350W at its peak, that's only an average of 350W over the sampling period of the driver - probably around 2 seconds, based on how fast I observe the Nvidia driver reacting in situations where it reports "power limit" as the reason for reducing clocks on my 2070S. When sampling at 0.1-second intervals, my 2070S actually spikes above 115% of its power limit several times a minute, and I'd suspect an even shorter sample time would show higher spikes still.
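To illustrate the averaging effect, here's a quick Python sketch (the numbers are invented for illustration, not readings from my card) of how a 2-second rolling average can report a steady ~350W while individual 0.1s samples spike well above it:

```python
# Illustrative only: invented numbers, not measurements from any real GPU.
import random

random.seed(0)

SAMPLE_S = 0.1    # assumed fast sampling interval
WINDOW_S = 2.0    # assumed driver averaging window
BASE_W = 340.0    # hypothetical steady-state board power
SPIKE_W = 470.0   # hypothetical brief transient

# 60 seconds of 0.1s samples: mostly steady, with occasional single-sample spikes.
samples = [SPIKE_W if random.random() < 0.03 else BASE_W
           for _ in range(int(60 / SAMPLE_S))]

win = int(WINDOW_S / SAMPLE_S)
rolling_2s = [sum(samples[i:i + win]) / win for i in range(len(samples) - win + 1)]

print(f"max 0.1s sample:  {max(samples):.0f} W")      # the spikes are visible
print(f"max 2.0s average: {max(rolling_2s):.0f} W")   # the spikes mostly vanish
```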
Rather than argue about hypotheticals that are difficult to prove because I don't have an easy way to sample power usage faster than every 0.1s, let's talk about the 3080 launch reviews. Most of them used i9 or Ryzen 9 test benches with very few components beyond a CPU, cooler, and GPU. That means the system power draw numbers are relatively easy to calculate - they should be the PPT of the CPU, plus the TDP of the chipset, plus the power rating of the CPU fan, plus the total board power of the GPU. In fact, the Kill A Watt power readings performed by a few reviewers matched that formula almost exactly, once adjusted for the PSU's efficiency at the measured draw level. That's not really surprising, because (provided nothing is OC'd and the BIOS PPT limits are at stock or set manually) all of the power-consuming parts have fixed, known power-draw caps.
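As a rough sketch of that arithmetic (the component caps and the efficiency figure below are illustrative assumptions - plug in your own parts' rated limits and your PSU's efficiency curve):

```python
# Illustrative component caps; substitute your own parts' rated limits.
cpu_ppt_w = 250      # CPU package power cap (PL2 / PPT)
gpu_tbp_w = 350      # GPU total board power
chipset_tdp_w = 8    # e.g. a Z490 chipset
cpu_cooling_w = 7    # fan/pump rating

dc_load_w = cpu_ppt_w + gpu_tbp_w + chipset_tdp_w + cpu_cooling_w  # 615 W

# Assumed efficiency at this load point; read it off your PSU's efficiency curve.
psu_efficiency = 0.92
wall_draw_w = dc_load_w / psu_efficiency

print(f"DC-side load: {dc_load_w} W")
print(f"At-the-wall:  {wall_draw_w:.0f} W")
```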
So why, then, did an i9+3080 system that was measured to draw only 635W from the wall at full synthetic load (250W PL2 + 350W GPU + 8W Z490 + 7W CPU cooling) fail to run on 750W and 850W Seasonic Platinum power supplies? Linus, Steve, and Jay all had to swap out their PSUs for higher-wattage units for their 3080 FE reviews. By your argument, the 750W PSU would only have been running at 82% load, and the 850W PSU at only 72%.
Answer: because the instantaneous peak power draw of a system is higher than its average peak power draw, even if only for fraction-of-a-second bursts. The result is bluescreens, crashes, and GPU driver restarts. I don't make the rules, that's just how it is; if you don't like it, go argue with the laws of physics. Maybe pick a fight with James Watt, Charles de Coulomb, or Otis Boykin instead of PSU manufacturers, Nvidia, or us forum-dwellers.
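For a back-of-envelope picture of how that plays out (the transient multipliers below are hypothetical - real spike magnitude depends on the card and the timescale you measure at):

```python
# Same illustrative component caps as in the sketch above.
dc_sustained_w = 250 + 350 + 8 + 7   # 615 W DC-side at full sustained load

for psu_w in (750, 850):
    print(f"{psu_w} W PSU sustained load: {dc_sustained_w / psu_w:.0%}")

# Now let the GPU alone spike briefly above its 350 W board power while the CPU
# stays at its cap; the multipliers here are hypothetical, for illustration only.
for spike_factor in (1.5, 2.0):
    dc_transient_w = 250 + 350 * spike_factor + 8 + 7
    print(f"GPU at {spike_factor:.1f}x TBP for a few ms -> {dc_transient_w:.0f} W DC")
# 790 W and 965 W: enough to trip protection on a 750 W or even an 850 W unit.
```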
There is a middle ground between 50% and 100%.
Well, based on the Ampere reviews, 65% might be pushing your luck, because even 72% turned out to be unstable for the three most popular YouTube reviewers (by subscriber count).
Also, if your CPU+GPU combined load is under 250W (pretty common, actually), then you're out of luck, because most reputable PSU vendors start at 500W for their lowest model. The most popular gaming spec on Steam appears to be a quad-core i5 and a GTX 1060, pulling around 200W at full load. That's only 40% of a 500W PSU.
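Quick arithmetic on that last point (rough figures, not measurements):

```python
# Rough estimate for a quad-core i5 + GTX 1060 build at full load.
typical_load_w = 200
smallest_common_psu_w = 500

print(f"Load on the smallest common PSU: "
      f"{typical_load_w / smallest_common_psu_w:.0%}")   # 40%, well under the "50% rule"
```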