? I can't tell whether this is a troll answer. By definition, global climate change (because "warming" is inaccurate) is not driven by the consumption of energy itself...and that's trivial to demonstrate. Nuclear power is taking unstable atoms, concentrating them until they create a controlled chain reaction, and then using their heat to drive a turbine. This process has literally been going on since the earth formed...and contributes no additional greenhouse gases. If you'd like to argue, there are millennia-old caves in Mexico that demonstrate this process is literally older than man. (Naica Crystal Caves)
By this logic, the only option would be to define what you want as necessary, then do a calculation on total power draw rather than peak draw. Theoretically, a 4080 shouldn't exist because the 3060 exists...it can do the same calculations much more slowly, but overall more efficiently. Likewise, a 3060 shouldn't exist because there are lower-spec, more efficient GPUs...let alone the rest of your system being less efficient for using CISC processors rather than RISC. If you don't get it yet, there's a point where this argument about efficiency becomes a joke race to the bottom...because technically humans have pencils. Our biological processes are more efficient than any computer...so it'd be more efficient to physically color and illustrate a game by hand than to use a computer of any kind...which hopefully highlights how silly this point is by proxy.
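To put a rough number on the "total draw rather than peak draw" idea, here's a quick back-of-the-envelope sketch in Python. Every wattage and run time in it is made up purely for illustration, not a measured benchmark of a 4080, a 3060, or any other card:

```python
# Back-of-the-envelope energy-per-workload comparison.
# All wattages and run times below are invented placeholders,
# not measurements of any real card.

def energy_wh(avg_power_watts: float, run_time_seconds: float) -> float:
    """Total energy used for one workload, in watt-hours."""
    return avg_power_watts * run_time_seconds / 3600.0

# Hypothetical fast-but-hungry card vs. slower-but-frugal card
# running the same fixed workload.
fast_card = energy_wh(avg_power_watts=420, run_time_seconds=60)    # 1 minute
slow_card = energy_wh(avg_power_watts=170, run_time_seconds=120)   # 2 minutes

print(f"Fast card: {fast_card:.2f} Wh for the workload")
print(f"Slow card: {slow_card:.2f} Wh for the workload")
# With these made-up numbers the slower card finishes the job on less
# total energy despite taking twice as long, which is the kind of
# comparison peak wattage alone doesn't capture.
```

Of course, plug in different numbers and the conclusion flips, which is exactly why treating efficiency as the only metric turns into that race to the bottom.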
With regards to the thread topic, a 4080 using 420 watts of power is...whatever it is. I'm honestly looking forward to the market cards that target 1080p gaming...which in this generation might be the 4060 or 4070. If those release as absolutely energy-hungry messes, then there's a strong case for breaking out the 30x0 cards as a much better value for the money. It'd also hopefully be an opportunity for AMD to get its crap together and release cards right. That is to say, release a competitive product that forces Nvidia to be more price competitive (with drivers and BIOSes that aren't an utter joke).
Personally, I've dealt with the AMD 5700 XT lineup...and waiting months for a properly tuned BIOS that didn't make the cards stutter far more than was anywhere near acceptable was...frustrating. Nvidia pushing an idiotic level of power consumption would be that opportunity in a nutshell. I may be a bit more optimistic than rational, though.