Overkill and future proof are probably the most overused terms in relation to PCs. As every user's goals are different, neither word can be universally applied.
1. If the goal is to build the best machine for the intended usage at the lowest cost, then anything can be "overkill". Grandma who browses the web, gets emails with pics of her grandkids, stores recipes, and pays her bills via Quicken will not benefit in any way from an i7, a GFX card, a 3rd-party CPU cooler, or pretty much any upgraded component in the box.
2. Just about anyone in the corporate world also doesn't need anything more than a basic box. Video production, CAD, rendering, gaming, and other specialized apps obviously don't fit into this category.
3. 99% of those with an SSD will never see an iota of increased productivity as a result of adding one. Every argument to the contrary cites tasks that folks just don't perform on a daily basis ... the fact that you can open 100 tabs in Chrome, copy/paste 2 TB of files, or zip giant files measurably faster is not an issue if you almost never do any of those things ... and booting 0.9 seconds faster isn't going to change your life in any way. A PC is only as fast as its weakest link, and most often that is the person at the keyboard.
4. Installing a workstation CPU in a gaming box simply because "most expensive must be better" is not overkill, it's self-defeating, as a cheaper CPU will perform better.
5. Using a CPU with a high core count when nothing you do even warms up the 4th core.
6. An architect or engineer buying a Quadro card "cause they read on the internet" that it is for CAD. Actually, AutoCAD and the other programs commonly used by A/Es in no way benefit from workstation CPUs or workstation GFX cards. AutoCAD is not multi-threaded, nor are these other programs, so using workstation CPUs and GFX cards here reduces performance. Now, when you make the transition from 2D/3D drafting to rendering and animation, that is the place for workstation CPUs and GFX.
On the other hand ....
1. One of the most common uses of overkill is in relation to PSUs.
https://www.guru3d.com/articles_pages/msi_geforce_gtx_1070_ti_gaming_review,8.html
So, while an oversized PSU might be overkill from a "wattage standpoint", that doesn't mean it is overkill. A larger PSU will have a larger cooling system. So one that operates at 50% of its rated capacity:
a) Will produce less heat than a smaller one
b) Will use less electricity than a smaller one
c) Will run quieter than a smaller one.
d) Will help on borderline stability issues on CPUs / GPUs as electrical noise and voltage instability increase the closer you get to rated loads.
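The heat argument in (a) can be sketched with some rough arithmetic. The 90% and 87% efficiency figures below are the 80 Plus Gold thresholds at 50% and 100% load, and the 300 W system draw is a made-up example; a real unit's efficiency curve will differ somewhat:

```python
# Rough sketch: waste heat a PSU dumps into its own cooling system.
# Waste heat = output power * (1/efficiency - 1).
# Efficiency values below are the 80 Plus Gold thresholds at
# 50% and 100% load; real units vary.

def waste_heat(output_watts: float, efficiency: float) -> float:
    """Watts dissipated inside the PSU as heat."""
    return output_watts * (1.0 / efficiency - 1.0)

draw = 300.0  # hypothetical system draw under load, in watts

# 600 W unit: running at 50% load, near peak efficiency (~90%)
big = waste_heat(draw, 0.90)

# 350 W unit: running near 86% load, closer to full-load efficiency (~87%)
small = waste_heat(draw, 0.87)

print(f"600 W PSU at 50% load: ~{big:.0f} W of waste heat")   # ~33 W
print(f"350 W PSU near full load: ~{small:.0f} W of waste heat")  # ~45 W
```

A dozen watts less heat for the fan to move is part of why the bigger unit runs cooler and quieter.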
2. I don't think CF / SLI falls into the overkill classification. It is either appropriate or inappropriate.
- Two 560 Tis were faster and cheaper than the 580, with great scaling ... no brainer. More performance at less cost is never a bad decision. The few games that didn't support SLI / CF were never AAA-class titles and were most often obscure.
- Two 650 Tis were way cheaper and a bit faster than the 680. The above applies.
- Two 770s were faster and cheaper than the 780. The above applies.
- Two 970s were 40% faster and cheaper than the 980. Arguments were made that fewer games were supporting SLI, but all the big games were ... so the question was: what's better? Being 40% faster in 85% of games while being 12% slower in a minority where your fps was already up at 80? Or getting the 980 and seeing 35 fps in Tomb Raider while your friend who spent less on twin 970s gets 58 fps? Or 41 fps in FC3 with the 980 versus 69 with 970 SLI? Still a literal no brainer.
Nvidia changed things up with the 10xx series ... all of a sudden, average scaling went from 70% to just 18% at 1080p and 34% at 1440p. What happened? I think Nvidia saw that the biggest profit margins on their top-tier card(s) were being lost to competition ... from themselves. AMD had no horses in the race from the 1060 on up, so, it would seem, they nerfed SLI so that the 1070 SLI option was no longer attractive "by the numbers". Scaling at 2160p was kept reasonable so as to still allow 60+ fps at 4K.
So, overkill? Not in the past ... today, it only makes sense at 4K. Using it below 4K isn't overkill, it's just unwise; it's no longer cheaper to buy twin 1070s.
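The 970-vs-980 trade-off above can be framed as simple expected-value arithmetic. The 85% / 40% / 12% figures come from the comparison above; treating them as fixed shares across a game library is my simplification:

```python
# Weighted average performance of 970 SLI relative to a single 980,
# using the rough figures from the text: 40% faster in the ~85% of
# games that scale, ~12% slower in the rest.

def expected_speedup(share_scaling: float, gain: float, loss: float) -> float:
    """Weighted average speedup across scaling and non-scaling games."""
    return share_scaling * (1 + gain) + (1 - share_scaling) * (1 - loss)

ratio = expected_speedup(0.85, 0.40, 0.12)
print(f"970 SLI vs 980, weighted average: {ratio:.2f}x")  # -> 1.32x
```

Even after counting the games that scale poorly, the pair comes out roughly a third faster on average, which is why it was such an easy call at the time.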
3. It wasn't long ago that folks were saying 16GB of RAM or faster RAM was overkill ... and a false equivalency was used to make this case. The most common argument was that faster RAM had no effect on average frame rates, and references were cited ... "here's 4 games that don't show an improvement" ... to make the case. But this didn't take all instances into account. When the impact on minimum frame rates was examined, we did see an effect. Performance is limited by whatever component is "the bottleneck", and in many instances this would be the GFX card, but when SLI was in the picture, RAM often became the bottleneck. In other instances, certain games respond directly ... the STALKER series, for example, and F1, which has 11% higher frame rates with 2400 as opposed to 1600.
4. I have seen it stated that CLCs are overkill ... I don't agree, as I have yet to see a 240/280 CLC top the performance of available < $50 air coolers. That's not excess capability, just poor value.
5. Delidding - While a worthwhile endeavor for Ivy Bridge, I haven't had an OC limited by temperature since Haswell arrived. Sure, it gets you cooler temps, but since Haswell all of our OCs have been voltage-limited rather than temp-limited. So yes, it earns you bragging rights to "my temps dropped 12 C", but if the original temps were of no concern, that extra 12 C isn't bringing anything to the table.