I'll admit it if I'm wrong, but there's no way these leaked specs point to a 50% increase over the 4090. It will be 30% at best, averaged over the 27 games in the TPU test suite.
I'll take that bet. I bet it'll hit 50%.
You're just talking nonsense.
EU regulations are made after long and comprehensive research by very smart people who consider every aspect. They will never decide to cap a video card at 75W. That is plain nonsense.
Why is that nonsense? Gaming is a wasteful hobby. Why do you need 150 watts to play bing bing wahoo? Hell, why do you need 75 watts? The Switch only pulls like 10 at load; that's all you need for video games.
You say it's nonsense because YOU want a GPU with more than 75W of capability. Funny how that works. Once you start down the road of "you don't need this", it WILL be taken to its logical extreme. Remember the UK's potato peeler license debacle?
I'd love to hear this sensible argument for why anyone NEEDS a GPU that pulls more than 75 watts to play games.
But they might well block a video card from drawing more than 1 kW from your power source, and not because of ecological concerns, but mostly because of possible fire hazards, blown fuses in sensitive buildings, etc. This is not the Wild West, or Redneck Ville, where everybody can run a 1 MW power plant just because.
If common sense is not in a corporation's vocabulary, then maybe it's the duty of a state committee or similar body to impose it.
No consumer video card draws more than 1 kW of power. Not even close. The highest was the 600W the 3090 Ti could tickle. If you're talking whole systems, buddy, SLI gaming PCs were pushing 2 kW back in 2008. Somehow you all survived.
A 1 kW card poses no more of a risk than a 500W card does, or a 300W card. If you have such an old building, you should be upgrading the wiring instead of buying GPUs to waste time gaming on.
Here in "Redneck Ville" we have modern 25 amp wiring with 20 amp breakers that can run a 5090 without burning the house down. Interesting that this is such a concern in the EU, with all those "very smart people" banning everything. Why are you even allowed to have a building with wiring that can't handle a 1000W load? That's literal knob-and-tube-era wiring.
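For anyone who wants the quick math on why a 20 amp circuit shrugs at a 1 kW gaming PC, here's a rough sketch. It assumes a 120V North American branch circuit and the usual 80% continuous-load rule; the component wattages are illustrative guesses, not measurements.

```python
# Rough circuit headroom check: is a given PC load safe on a branch circuit?
# Assumes a 120V North American circuit and the common 80% continuous-load
# derating; the component wattages below are illustrative, not measured.

CIRCUIT_VOLTS = 120
BREAKER_AMPS = 20
CONTINUOUS_DERATE = 0.8  # keep continuous loads <= 80% of the breaker rating

def circuit_budget_watts(volts=CIRCUIT_VOLTS, amps=BREAKER_AMPS, derate=CONTINUOUS_DERATE):
    """Usable continuous wattage on the circuit."""
    return volts * amps * derate  # 120 * 20 * 0.8 = 1920 W

def system_draw_watts(gpu=600, cpu=250, rest=150):
    """Ballpark whole-system wall draw for a high-end gaming PC."""
    return gpu + cpu + rest  # ~1000 W

budget = circuit_budget_watts()
draw = system_draw_watts()
print(f"circuit budget: {budget:.0f} W, system draw: {draw:.0f} W, "
      f"headroom: {budget - draw:.0f} W")
# -> circuit budget: 1920 W, system draw: 1000 W, headroom: 920 W
```

Even a worst-case 600W card plus the rest of the box leaves nearly a kilowatt of headroom on a bog-standard circuit.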
If you have such widespread problems with wiring, the solution is NOT to ban GPUs, because you're gonna have to ban fridges, microwaves, and vacuum cleaners too. What you SHOULD be doing is banning that old wiring. This is why the EU gets referred to as a "nanny state".
I miss the days when a top-end GPU was $650 and if you wanted the Titan-brand card you'd pay more. But that 980 Ti... oh how I miss those days.
Let's see... what does $650 get you now for a current GPU? Well, after looking at my local Micro Center, nothing. With the new gen coming out, restocking of the current gen has stopped from the looks of it. You either step up to the $850+ range for a 4070 Ti Super or down to the $400 range for a 7700 XT.
I guess we'll see if Intel and AMD can really shake up low-to-mid-tier GPU pricing this gen. It would be nice to see things more affordable at the middle and low end while still offering a decent performance range. As it currently stands, I'm priced out of the top-end GPUs by a lot.
The 980 Ti came out in 2015. Inflation is a thing. Also, to hit on @Hxx's point, I'm making about 60% more per year than I was in 2015. $850 today is roughly $650 in 2015 dollars. I really wouldn't want to take a pay cut to 2015 levels just to get $650 GPUs back.
Also obligatory: the 8800 Ultra was $830. In 2007. That's over $1200 today. Expensive top GPUs are not a new phenomenon.
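If anyone wants to sanity-check those inflation figures, here's a tiny sketch. The CPI values are rough annual averages I'm plugging in for illustration, not official quotes; swap in real BLS CPI-U numbers for anything serious.

```python
# Rough CPI-based price adjustment. The CPI values below are approximate
# annual averages used only for illustration.

CPI = {
    2007: 207.3,
    2015: 237.0,
    2024: 313.7,
}

def adjust(price, from_year, to_year=2024):
    """Scale a nominal price from one year's dollars to another's via the CPI ratio."""
    return price * CPI[to_year] / CPI[from_year]

print(f"$650 in 2015 is about ${adjust(650, 2015):.0f} today")   # ~ $860
print(f"$830 in 2007 is about ${adjust(830, 2007):.0f} today")   # ~ $1256
```

Which is the whole point: the $650 flagship of 2015 and the $850+ card of today are closer in real dollars than the sticker prices suggest.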