The current situation is a mess:
- CPUs are marked with power draw numbers that do not reflect their real power draw.
- The real power draw depends on the settings each motherboard manufacturer chooses, so the same processor can draw different amounts of power on boards from different manufacturers. You may be fine with a larger air cooler on one board and need water cooling on another.
- CPUs are sold at absurd power draw levels which have no place in personal computers.
Instead, CPUs should be sold at defined power draw levels which:
- represent real power draw
- enable customers to choose an appropriate cooler
- indicate clearly the performance level of the CPU
I would propose the following levels, each a factor of 1.6 apart from the last: 25?, 40, 65, 100 and 160W (a small sketch of the ladder follows this list):
25W should be enough for an office PC and multimedia playback.
40W would give the 25W parts headroom for a more powerful integrated graphics card or extras such as an AI accelerator. I am not sure whether separate 25W and 40W levels are necessary.
65W is what a little "pancake" cooler with no heatpipes can handle, is plenty for any office or general-purpose family computer, and should pair well with lower-power graphics cards.
100W already offers a serious amount of computing power and should be enough for gaming even with the most demanding graphics cards; it is coolable with a very small two-heatpipe cooler.
160W can provide a huge amount of computing power; anybody who really needs more than that would probably be better off buying a proper workstation. It can be cooled with a small four-heatpipe cooler.
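As referenced above, here is a minimal sketch of the ladder in Python. The snapping to "familiar" round wattages is my own assumption about how the exact geometric steps would get rounded, not part of the proposal itself:

```python
# Hypothetical sketch: the proposed tiers as a geometric ladder with
# ratio ~1.6, snapped to familiar marketing wattages.
BASE_W = 25   # lowest proposed level
RATIO = 1.6   # step between adjacent levels
FAMILIAR = [25, 40, 65, 100, 160]  # assumed "round" wattages

def nearest_familiar(watts: float) -> int:
    """Snap an exact geometric step to the closest familiar value."""
    return min(FAMILIAR, key=lambda f: abs(f - watts))

exact = [BASE_W * RATIO**i for i in range(5)]  # [25.0, 40.0, 64.0, 102.4, 163.84]
tiers = [nearest_familiar(w) for w in exact]
print(tiers)  # -> [25, 40, 65, 100, 160]
```

The 1.6 ratio means each step up buys a meaningful jump in power budget without the tiers multiplying endlessly.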
There are three more benefits to this:
This may greatly reduce CPU lineups and force manufacturers to optimise and come up with the best they can develop for a given level of CPU power.
A producer may sell:
one 40W CPU, the best universal CPU they can make
one 65W CPU, the best universal CPU they can make
two 100W CPUs, the best universal CPU they can make and the best gaming CPU they can make
two 160W CPUs, the best universal CPU they can make and the best "home workstation" CPU they can make, with specialised accelerators helping with productivity workloads, etc.
That makes six CPUs serving the entire PC market. Does anybody really need more?
A third benefit is that it would greatly simplify comparing the performance of CPUs from different manufacturers.
Is there even a place for overclocking in this scheme? I doubt it. What is the point of allowing a 65W CPU to be overclocked to, say, 100W, when you can get a native 100W CPU performing much better than the overclocked one? Only the 160W parts might be allowed to run at a maximum power draw of 200W, at the cost of losing energy efficiency, and 200W could be a hard cap for PC CPUs. To anybody offended at being limited by this cap, I would say that having a CPU in a personal computer drawing e.g. 350W is OBJECTIVELY INSANE. Once you realise that this whole scheme is built on energy efficiency, increasing the power limit and overclocking simply do not make sense anymore.
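To put a rough number on that, here is a back-of-the-envelope sketch in Python. It leans on the common rule of thumb that dynamic power grows roughly with the cube of clock speed (power ∝ frequency × voltage², with voltage rising roughly alongside frequency) while performance grows only linearly with clock; the real exponent varies by chip, so treat the figures as illustrative only:

```python
# Rough illustration: why pushing a 65W part to 100W loses to a native
# 100W design. Assumes the crude rule of thumb power ~ frequency^3,
# so clock (and performance) scales with the cube root of power.
# Illustrative numbers, not measurements.

def oc_perf_gain(p_stock: float, p_oc: float) -> float:
    """Relative clock/performance gain from raising power p_stock -> p_oc."""
    return (p_oc / p_stock) ** (1 / 3)

gain = oc_perf_gain(65, 100)
extra_power = 100 / 65 - 1
print(f"~{gain * 100 - 100:.0f}% more performance "
      f"for ~{extra_power * 100:.0f}% more power")  # ~15% for ~54%
```

Under these assumptions the overclocked 65W part burns roughly 54% more power for roughly 15% more speed, while a native 100W design can spend that same budget on more cores or wider silicon instead of brute-force clocks.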
The different power (and performance) levels could be clearly colour coded or marked with a performance number. Coolers rated for each level of performance could be marked the same way.
Energy efficiency is built into this scheme: the manufacturer needs to deliver maximum performance at a given power level, which is the very definition of efficiency. This is the best thing ever for the environment and for consumers too, because manufacturers would be FORCED HARD to optimise, bring customers the best they can, and build it on the best process technology available. No lazily reusing old stuff over and over.
A similar scheme should be employed for graphics cards, forcing manufacturers to make the best, say, 100, 160 and 250W cards they can.
Discuss and share this to make this happen. Thanks!
I think I will write a letter to the European Commission. This stuff is green as a frog, benefits consumers greatly, and pushes research and technology development hard.