As someone who works in the film industry, I can attest that we will continually need MORE powerful GPUs to keep up with the hardware requirements of rendering 8K and higher video footage. We don't use consumer/gaming graphics cards for editing and rendering pro-grade footage, since we need MUCH more powerful cards than the consumer RTX series to process special effects and convert footage between formats. Even the 3090 Ti is far too slow to be practical in the post-production phase of filmmaking.

Most of our editing rigs are custom-built for us by a company that specializes in ultra-high-end video editing systems. Most of the newer systems cost upward of $50,000 and come equipped with multiple server-grade GPUs and multi-CPU motherboards. The good part about the release of the 40 series is that the cost of professional-grade video cards will drop, letting us stretch our budget further than before.

There will always be a need for more powerful GPUs and CPUs, even if you're just gaming on your PC. Progress will continue, and games will demand ever more powerful hardware to keep up with the latest graphics engines, especially as we transition to 8K monitors for gaming and video editing. When 1080p was the industry standard for gaming, most people didn't see a need for 4K hardware. Now that 4K gaming is on the rise, 8K will become more and more commonplace, following the same trend computer hardware and software have always followed.

As far as efficiency is concerned, you are correct that manufacturers should focus more on building efficient GPUs. Energy prices are soaring, and the U.S. isn't going green fast enough to justify the wattage these cards are pulling.