Anyway, Win 12 will drop the full x86 architecture, lol.
Extremely unlikely, as doing so would obliterate Windows' backwards compatibility and the majority of its reason to exist. There's nothing wrong with 32-bit libraries; they don't hurt anyone.
The more recent push to eliminate older software and APIs, which has driven more software toward web-based or multiplatform applications, has significantly hurt MS; Windows is now below 3/4 of the total market, down from a peak of 92%. The more they push to eliminate backwards compatibility, the more market they will lose.
Of course, this is the same company that strangled its console market to the point of being bedridden, so who knows, maybe MS will be the first multi-trillion-dollar company to go bankrupt from its own stupidity.
Finally someone with common sense. Desktop GPUs had an engineering standard for almost two decades. It stated that a GPU should NOT exceed 250W TDP, and the usual power draw was 210-230W.
Also, I think game developers should stop with the super-high-fidelity graphics; it makes for worse games that are also broken, unfinished, and super expensive to develop.
What standard was this? Do you have documentation for it?
The "200W standard" didn't exist. 20 years ago, GPUs were limited to 100-150W not because of some ancient wisdom, but because they were pushing the limits of the process nodes of the time. The moment nodes allowed higher limits, we pushed higher. This has always happened.
I remember people clutching their pearls and fetching their fainting couches over the GTX 280's power draw. You know, a card that was sufficiently cooled by a basic 2-slot blower. They said anything over 150W was just too much. Hell, there was hand-wringing over the Voodoo cards needing external power connectors WAY back when, and how they just used too much, and what is happening to our hobby, etc., etc. And dear god, the fire and brimstone when CPUs needed coolers with FANS! THE HORROR!
If this had happened in the AI and server market, there would have been bigger consequences than in the consumer market where it is....
I guess this shows once again that NGreedia doesn't care much for consumers, only for investors and such. If they really cared, they wouldn't have launched any of these cards with these issues. Furthermore, they created the whole 12VHPWR problem themselves because "oh, our GPUs need 500W+++ to be so fast," instead of innovating to bring power consumption down to a maximum of 2x 8-pin PCIe power connectors. That would improve a lot.
I've said it a couple of times already: stop chasing the 5-ish % performance gains and the fake upscaling and frames, and stand still while improving power consumption instead, so users don't need 1000W+ to run a crappy card at the end of the day.
I wish both AMD and NV would stop the performance race and start innovating on the performance we already have, bringing it down to 200-250W. That would be innovation, but I guess none of them have the brains, b**** and p**** to do so....
Nothing is stopping you from buying 200W GPUs. Or undervolting higher-end parts. Sounds like the one lacking brains here isn't the developers of GPUs......
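For what it's worth, you don't even need to undervolt to cap the draw; NVIDIA's driver ships a software power limit you can set with `nvidia-smi`. A minimal sketch (requires admin/root, and the allowed range depends on the specific card; the 200W value here is just an example):

```shell
# Show the current, default, and min/max enforceable power limits
nvidia-smi -q -d POWER

# Cap the board power limit to 200 W (clamped to the card's supported range;
# resets on reboot unless persistence mode is enabled)
sudo nvidia-smi -pl 200
```

Undervolting via a curve editor (e.g. MSI Afterburner) usually keeps more performance per watt than a raw power cap, but the power limit is the one-command version.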
There is no point unless you wanna play almost two-decade-old games. The latest games that made use of 32-bit PhysX are from 2013, but most of them are from 2007-2010.
If I didn't care about backwards compatibility, I'd buy a Mac or a console.