# Intel Core i9-12900KS



## W1zzard (Apr 1, 2022)

The Intel Core i9-12900KS is the company's new flagship Alder Lake processor. After our review, we can confirm that it is the "world's fastest gaming CPU," but that comes at a price not only in terms of dollars, but increased power draw and heat output, too.



----------



## W1zzard (Apr 1, 2022)

This is a proper review, not an April Fools' joke


----------



## Lifeless222 (Apr 1, 2022)

even more pointless than 3080/3090 TI


----------



## Voodoo Rufus (Apr 1, 2022)

Fast, hot, expensive. Don't need it. Want it anyway.


----------



## Airisom (Apr 1, 2022)

Did they cut AVX 512 on it?


----------



## ThrashZone (Apr 1, 2022)

Hi,
Missed what the 360 mm AIO did temperature-wise?

Also, the 12900K released at $790 US, so release-wise the 12900KS is less, at $740 US.
Call it minus tariff costs, since those were dropped, but only recently was the 12900K's price lowered to the current $550 US mark.


----------



## wheresmycar (Apr 1, 2022)

Hi W1zzard,

those temps are high!

I'm just curious... [generally speaking] looking at the power consumption charts, what is the best method to estimate power consumption whilst gaming? I'm assuming it's a mix of the single-threaded and multi-threaded charts? I'm also assuming it's way below the MT 298 W rating?

I guess it's a ton of work putting together these reviews, but any future possibilities of including game-only temps and power consumption (maybe over a handful of diverse titles)?


----------



## AnotherReader (Apr 1, 2022)

W1zzard said:


> This is a proper review, not an April Fools' joke


Thanks for the great review. How did you resist doing an April Fool's article?


----------



## W1zzard (Apr 1, 2022)

Airisom said:


> Did they cut AVX 512 on it?


AVX-512 was never officially part of Alder Lake; to use it you had to disable the E-cores completely.
In recent BIOS updates it was removed entirely. To verify, I just tested a 12900K (not KS) on my board and can't enable AVX-512 anymore, even though I see the AVX-512 option in the BIOS and disabled the E-cores.

What are you using AVX512 for?



AnotherReader said:


> How did you resist doing an April Fool's article?


Had no time for an April Fools' this year, too busy with more important things. Like 4x 3090 Ti and this 12900 KS just this week, oh and two SSD reviews. Going on holiday tomorrow, so I had to get it out today.



wheresmycar said:


> but any future possibilities of including game-only temps and power consumption (maybe over a handful of diverse titles)?


This is definitely planned for the next big rebench, but difficult to add now considering how many CPUs we have in the test group


----------



## Airisom (Apr 1, 2022)

W1zzard said:


> AVX-512 was never officially part of Alder Lake; to use it you had to disable the E-cores completely.
> In recent BIOS updates it was removed entirely. To verify, I just tested a 12900K (not KS) on my board and can't enable AVX-512 anymore, even though I see the AVX-512 option in the BIOS and disabled the E-cores.
> 
> What are you using AVX512 for?


RPCS3 gets a big performance boost with AVX-512. Maybe some boards still give you the option to enable it. IIRC, there's an XOC BIOS (X3C, on the HWBOT forum) for the GB Tachyon that still lets you enable it, but I'm not sure if the 12900KS fused it off.


----------



## Solid State Brain (Apr 1, 2022)

Default TjMax of 115°C is interesting, I wonder if it's really long-term safe.

EDIT: I noticed on the Ark website that Intel lists a 100 °C TJunction for the i9-12900K, but this information is omitted for the i9-12900KS:

[Product Specifications – ark.intel.com]

[Product Specifications – ark.intel.com]

----------



## GoldenX (Apr 1, 2022)

W1zzard said:


> What are you using AVX512 for?


RPCS3. The difference is massive.

The review may not be an April Fools, but the product is.
I didn't expect competition to also mean "to hell with efficiency".


----------



## Warigator (Apr 1, 2022)

I respect them for introducing the current-year value king, the 12100F, but this 12900KS is a joke processor. Even worse than both the 8086K and 9900KS. +2% in games? What were they thinking? This is a product for people with more money than brains, or for extreme overclockers. But it's binned anyway and can't overclock notably high.


----------



## Dividend (Apr 1, 2022)

"$150 premium over the Core i9-12900K" – like one day's work is not a big premium for someone who is planning on using it for many, many years...


----------



## ppn (Apr 1, 2022)

There is no point in using it for years when the 14900, based on Intel 4, delivers 40 threads, probably on a new 1800 socket. Combine that with a 4090 based on TSMC's 4 and you get what comes close to many years, for it can last longer; but this, now, is just a money grab. The 12100 "value king" is such a bad value for 150 euro; I expect the 12400F.


----------



## Dividend (Apr 1, 2022)

ppn said:


> There is no point in using it for years when the 14900, based on Intel 4, delivers 40 threads, probably on a new 1800 socket. Combine that with a 4090 based on TSMC's 4 and you get what comes close to many years, for it can last longer; but this, now, is just a money grab. The 12100 "value king" is such a bad value for 150 euro; I expect the 12400F.


But 14900 isn't out yet.


----------



## Solid State Brain (Apr 1, 2022)

From the article conclusions:



> Intel has a highly effective, very fine-grained thermal throttling algorithm that only throttles those cores that overheat, and only by as little as is required to keep them from overheating, checked several times a second.



Is this an i9-12900KS-specific behavior? On my 12700K all cores are throttled if just one reaches the thermal limit.


----------



## BorisDG (Apr 1, 2022)

W1zzard said:


> This is a proper review, not an April Fools' joke


Are you sure? Seems exactly like a joke.


----------



## Nater (Apr 1, 2022)

I might buy one after I get my winter Lambo paid off, and close on my 3rd vacation home in Costa Rica.  Who buys this shit?  I mean really?


----------



## dj-electric (Apr 1, 2022)

A non-embargo-tied 12900KS TPU review. Rare sight. I'm jealous, and signed.
The global embargo is only lifted April 5th.


----------



## thegnome (Apr 1, 2022)

This CPU sure is a good April Fools' joke, given how worthless it is compared to the 12900K when normally cooled. With chips like these, XOC results are much more interesting.


----------



## Space Lynx (Apr 1, 2022)

those are insane temps for being on a 360mm AIO... nearly double the temp of 10th gen high end Intel chips... that is just wild...


----------



## Aretak (Apr 1, 2022)

GoldenX said:


> RPCS3. The difference is massive.


Rather it _can_ be massive, but only in SPU-limited titles (which does include many of the big name exclusives). In others it makes very little difference.

This review might not be a joke, but those heat and power draw figures sure are. Better grab one of those 2KW PSUs if you're planning to pair one of these with an RTX 4000 card.


----------



## Sabishii Hito (Apr 1, 2022)

Since this was tested on an Asus board, you know the big question...

What was the SP score? Overall and detailed?


----------



## mama (Apr 1, 2022)

Taken from the old handbook: "More power=More Good"


----------



## AusWolf (Apr 1, 2022)

_"this is exactly how Turbo Boost 3.0 is designed to work, the problem is that even though we're using Windows 11, and all communication protocols between the OS and CPU are active, there's still cases where threads don't end up on these two cores and thus lose a bit of performance."_ - It's strange, considering that I never run into this problem with my 11700 on Windows 10. Maybe Windows 11's scheduler is a bit flawed?

Good review otherwise.


----------



## Undertoker (Apr 1, 2022)

Waiting for my KS to arrive; I bought the 9900KS and never regretted it for a moment.
I'm not as keen on the recently announced 3090 Ti though. I'm already gaming in my pants after a couple of hours running a 3090 OC - the heat is atrocious on a 3090, so a 3090 Ti will clearly be a lot worse, and then there is the power draw...
I think they need to start thinking very seriously about limiting power draw before governments legislate and ruin it - as they do everything else.
They are robbing us blind now across Europe for electricity, and this comes as British MPs now get their electricity paid for by us, the taxpayers, on top of a £2,200 pay rise. So clearly it doesn't matter to them, but "we are all in it together", eh...


----------



## BArms (Apr 1, 2022)

Great review as always. I would like to suggest one new benchmark, though, that I think would be really cool to see: Civilization VI time between turns. It probably just scales with single-threaded performance more than anything, but it would be nice to see.


----------



## phanbuey (Apr 2, 2022)

Undertoker said:


> Waiting for my KS to arrive; I bought the 9900KS and never regretted it for a moment.
> I'm not as keen on the recently announced 3090 Ti though. I'm already gaming in my pants after a couple of hours running a 3090 OC - the heat is atrocious on a 3090, so a 3090 Ti will clearly be a lot worse, and then there is the power draw...
> I think they need to start thinking very seriously about limiting power draw before governments legislate and ruin it - as they do everything else.
> They are robbing us blind now across Europe for electricity, and this comes as British MPs now get their electricity paid for by us, the taxpayers, on top of a £2,200 pay rise. So clearly it doesn't matter to them, but "we are all in it together", eh...



Many truths in this post.


----------



## Minus Infinity (Apr 2, 2022)

Honestly, meh. Disgraceful power draw and heat, a cpu for p!ssing contest only. 

OCing Alder Lake should be illegal, it's a farcical waste of energy, stock 12900K needs no help. I hope Raptor Lake focuses on greatly reducing power draw but I suspect we won't see huge improvements until Arrow Lake.


----------



## N3M3515 (Apr 2, 2022)

Hahahaha... wtf, this is worse than the 3090 Ti. +$150 freaking dollars for 1%? That's within margin of error... it also consumes more power. What a joke.
This is the champion of useless hardware.


----------



## NutZInTheHead (Apr 2, 2022)

For the fun of it, I ran the Blender BMW test as well to see where my 5950X stands against this chip. Mine finished the run in 83.39 seconds while hovering around 209 W for CPU package power as reported by HWiNFO. Also, for those who are curious, while the test was running the all-core VID reported by HWiNFO was between 1.1 and 1.2 volts, and the all-core clock speed was between 4.35 and 4.42 GHz.


----------



## swirl09 (Apr 2, 2022)

wheresmycar said:


> I'm just curious... [generally speaking] looking at the power consumption charts, what is the best method to estimate power consumption whilst gaming? I'm assuming it's a mix of the single-threaded and multi-threaded charts? I'm also assuming it's way below the MT 298 W rating?



For what it's worth, I run a 12900K with a D15, set to 5.1 GHz all-core at a fixed 1.23 V. Gaming can be between 35-85 W, with temps in the 60s usually. It's rare to see a spike to the low 80s.

The chip can suck down crazy power if you let it, and will look scary on charts. But if it is gaming that you are after, you don't need to let it, or worry about it.

(Edit: more relevant info:
I disabled the E-cores and left the ring on auto, which results in a typical 4700 ring clock, about 500 more than default;
running about half a year now without issue.)


----------



## InVasMani (Apr 2, 2022)

In Prime95 the CPU draws more power than my PSU's official rating. If I got an identical PSU, I couldn't officially run a 12900KS and an RTX 4090 together, because together they exceed my PSU's rated wattage. It would probably work, along with some fans to aid in cooling them, but what a sh*t show between the two of them. It'll be embarrassing if AMD decides to play along too. This isn't where things were supposed to lead: smaller manufacturing nodes, yet even higher power draw. What next, memory makers pushing DDR5 to 2.8 V?!


----------



## tpu7887 (Apr 2, 2022)

A suggestion for reviews in the efficiency section (because right now it's just consumption).

You need a game demo that lasts 5 minutes and is a good mix of GPU and CPU usage.
By good mix I mean when the demo was running it would tax the GPU and CPU in such a way that the numbers would be close to the average of all the games benchmarked in the review.

Once you have it, you run it - measuring the machine's power consumption when you do. And like you do on the 1080/1440/4k results pages, adjust the review sample's FPS to 100%
12900KS: 5m Demo: 14.2kJ consumption, 100% performance
11900K: 5m Demo: 11.6kJ consumption, 93.2% relative

To figure out how much more power the 12900KS used, you put its consumption over the 11900K's:

14.2/11.6, which is 122.4%

But to get efficiency, you need to adjust for how much less work the 11900K did.

And this is the last calculation: 122.4 × 0.932 =

114.1

The 12900KS takes 114.1% of the power to do the same work as the 11900K (for example).

You can reduce this to a ratio: 1.141 to 1

Then you can multiply 1.141 by the 11900K's consumption during a different game to estimate the 12900KS's draw.

Obviously the more different the other game is, the less accurate the estimation is. But that's not the point.

The point is the demo stays the same as the reviews progress, and you can compare efficiency. When saving for future use, you just need to remember to put the reviewed CPU's frames per second beside its 100% so the results are transferable. Posting it in the review would be helpful too, though it isn't necessary, especially if you always exported results to the next review (multiply the results you want to include by the ratio of review 1's 100% FPS to review 2's 100% FPS).

The math isn't complicated. Just thought I'd throw out the idea in an easily useful, hopefully persuasive form.

Also, it wouldn't be much more work to run the demo at all the common resolutions. Other than the first time (gathering comparison points), it's not much work. Then, as time goes on, the database grows.

We'd be able to see how much hardware is improving, both speed- and energy-wise! Able to compare any two points in time, until the benchmark becomes too old to stress the hardware. Unless Intel and AMD stagnate for another decade after this short reprieve.
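The arithmetic above fits in a few lines. This is just a sketch of the worked example (the function name and structure are mine; the numbers come from the post):

```python
# Sketch of the proposed efficiency metric, using the post's example numbers
# (14.2 kJ at 100% performance vs. 11.6 kJ at 93.2%). Names are illustrative.

def efficiency_ratio(cpu_kj: float, ref_kj: float, ref_perf: float) -> float:
    """Energy the reviewed CPU needs relative to the reference CPU,
    adjusted for how much less work the reference CPU did (1.0 = same work)."""
    energy_ratio = cpu_kj / ref_kj      # 14.2 / 11.6 = 1.224 (122.4%)
    return energy_ratio * ref_perf      # 1.224 * 0.932 = 1.141

ratio = efficiency_ratio(14.2, 11.6, 0.932)
print(f"{ratio:.3f} : 1")  # -> "1.141 : 1"
```

The resulting ratio can then be multiplied by the reference CPU's measured consumption in another game to estimate the reviewed CPU's draw, as described above.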


----------



## btk2k2 (Apr 2, 2022)

On the gaming testing front, is there any chance that some non-FPS metrics could be incorporated next time the suite gets an overhaul? Stuff like AI turn time for Civ 6 (or 7, if released), late-game tick rates (or an abstraction) for the Paradox grand strategy games like CK3, Stellaris, EU4. Late-game city builder tick rates, and so on.

While the 720p results give an indication, finding a way to actually benchmark those games where the CPU is far more important for the gameplay experience than the GPU would be a great addition to the CPU test suite.

For most of the games above, my 2200G could do 4K at playable frame rates. The issues that part had (and still does) are late-game hitching when calculating, turn time, and the fact that late-game 'fastest' game speed often becomes slower than early-game 'normal', making it drag. FPS is rarely an issue worth complaining about.

As for this CPU. Does not seem worth it over the 12900K. Barely any extra performance for a pretty hefty price increase.


----------



## Deleted member 24505 (Apr 2, 2022)

Warigator said:


> I respect them for introducing the current-year value king, the 12100F, but this 12900KS is a joke processor. Even worse than both the 8086K and 9900KS. +2% in games? What were they thinking? This is a product for people with more money than brains, or for extreme overclockers. But it's binned anyway and can't overclock notably high.



12900KS+RTX3090TI anyone?


----------



## Solid State Brain (Apr 2, 2022)

Minus Infinity said:


> Honestly, meh. Disgraceful power draw and heat, a cpu for p!ssing contest only.
> 
> OCing Alder Lake should be illegal, it's a farcical waste of energy, stock 12900K needs no help. I hope Raptor Lake focuses on greatly reducing power draw but I suspect we won't see huge improvements until Arrow Lake.



Decreasing the number of P-cores and significantly increasing the number of low-clocked E-cores (perhaps in a higher-IPC iteration) may do a lot for power consumption in multithreaded tasks. A large number of cores that clock to low speeds (and thus to low voltages) is mainly how the 5950X achieves a relatively low power consumption in such loads.

Also, having fully independent voltage rails for all cores should help since right now on Alder Lake the voltage applied to _all_ cores is the highest requested by any core and uncore. In other words, currently you cannot _simultaneously_ have one fast core and many slow cores at separate voltages. The slow cores would have the same voltage as the fast one.

I think for short bursts (~seconds to perhaps a couple minutes) and single-core loads it's fine to use the CPU to the maximum of its capabilities, but by doing some analysis of performance versus configured PL1 (i.e. package power) it's clear that allowing the CPU to consume 50–100+% more than about 100-120W is definitely not going to yield corresponding performance increases and will just be wasteful (and noisy, stressful, etc) for significant compute tasks (video or 3D rendering, etc).

I overclocked my 12700K but mostly for single-core performance and multi-core bursts. For long-term compute loads I have a relatively low PL1 and Tau limits set in place. I think most people will just run their overclocked CPUs with no limits.


----------



## W1zzard (Apr 2, 2022)

tpu7887 said:


> A suggestion for reviews in the efficiency section (because right now it's just consumption).
> 
> You need a game demo that lasts 5 minutes and is a good mix of GPU and CPU usage.
> By good mix I mean when the demo was running it would tax the GPU and CPU in such a way that the numbers would be close to the average of all the games benchmarked in the review.
> ...


Good idea, here's some data.

[charts]

Actually the power measurement is at 1440p, but to get more scaling let's just use 720p data for FPS.

Dividing power by FPS for each of those:

[chart]

This does look promising indeed.

Thoughts on the unit "Watt per Frame" ? Technically it is Joules per Frame because 1 W = 1 J/s and 1 FPS = 1 Frame/s, so power / FPS = (J/s) / (F/s) = (J/s) * (s/F) = J/F.

Guess "Energy per Frame"?
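The unit algebra above can be checked with a couple of hypothetical numbers (the 90 W / 150 FPS figures below are assumptions for illustration, not measured data):

```python
# Energy per frame: power (W = J/s) divided by frame rate (frames/s)
# cancels the seconds and leaves joules per frame. Numbers are hypothetical.
power_w = 90.0   # average package power while gaming, in watts (J/s)
fps = 150.0      # average frame rate (frames/s)

joules_per_frame = power_w / fps  # (J/s) / (frames/s) = J per frame
print(f"{joules_per_frame:.2f} J/frame")  # -> "0.60 J/frame"
```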


----------



## tfdsaf (Apr 2, 2022)

Way too power hungry and way too expensive! Why go for this KS variant over the K variant? There is no point, other than people having too much money lying around and wanting to just give it to Intel for nothing. Literally 1% faster than the K version, but way hotter and way more power hungry. This CPU consumes more power than top-of-the-line graphics cards!

It looks even worse compared to the 5950X, as it's again only 1% faster overall, yet consumes over 150 W more power and has temps of 100+ degrees Celsius. I'm still sticking with AMD to punish Intel for almost a decade of garbage 2% incremental speed increases when they had a monopoly, even though it's now a tossup between their CPUs. The last three years I'd say AMD had the better processors, and way more value-oriented ones; now it's equal, but I'm going with AMD still, as I don't want to reward Intel just yet for a full decade of garbage products!

I'm looking at AMD's Ryzen 6000 or 7000, whatever they call it, to see how it stacks up against Intel's latest offerings. From what I've read it's supposedly going to be available early autumn.


----------



## FlanK3r (Apr 2, 2022)

The max TDP of the i9-12900KS can't be 241 W; it's defined as 260 W by Intel...


----------



## W1zzard (Apr 2, 2022)

FlanK3r said:


> The max TDP of the i9-12900KS can't be 241 W; it's defined as 260 W by Intel...


The default PL1 and PL2 values are 241 W; that's what the processor knows about. Everything else is just made-up numbers that are listed for one reason or another.

The 260 W number is from an earlier leak that apparently turned out to be false


----------



## Markosz (Apr 2, 2022)

Intel?


----------



## thelawnet (Apr 2, 2022)

Dividend said:


> But 14900 isn't out yet.


Yes, but we can see, for example, that the 11900K was matched by the 12600K; these chips don't have a lifespan of years, but months.

I mean, sure, you could buy the hottest, most inefficient, steaming mess of a bloated CPU and blast it at 30 jigawatts for the next 10 years, but that doesn't seem very rational.


----------



## FlanK3r (Apr 2, 2022)

@W1zzard you're right, sorry; only the base TDP is higher


----------



## Rares (Apr 2, 2022)

Intel just wants to be on top of everything. This CPU is just a waste of resources, time and money. Only stupid geeks with tons of money would buy this monstrosity.


----------



## progste (Apr 2, 2022)

That 5950x looks better with every competitor review.


----------



## AusWolf (Apr 2, 2022)

Minus Infinity said:


> Honestly, meh. Disgraceful power draw and heat, a cpu for p!ssing contest only.
> 
> *OCing Alder Lake should be illegal*, it's a farcical waste of energy, stock 12900K needs no help. I hope Raptor Lake focuses on greatly reducing power draw but I suspect we won't see huge improvements until Arrow Lake.


OCing any modern CPU should be illegal. The built-in "multi core enhancement" feature of motherboards should be plenty enough for everybody.

My 11700 (non-K) maxes out its 200 W limit set by my Asus Tuf motherboard in Prime95 and draws around 160-170 W in Cinebench R23 at 4.4 GHz. I literally needed to have my 280 mm AIO just to enable MCE on a locked (non-K) CPU! I'm not complaining because it never goes above 75-77 °C under normal loads (like CB R23 for example), but c'mon... who needs more than this for home use?



tfdsaf said:


> Way too power hungry and way too expensive! Why go for this KS variant over the K variant? There is no point, other than people having too much money lying around and wanting to just give it to Intel for nothing. Literally 1% faster than the K version, but way hotter and way more power hungry. This CPU consumes more power than top-of-the-line graphics cards!
> 
> It looks even worse compared to the 5950X, as it's again only 1% faster overall, yet consumes over 150 W more power and has temps of 100+ degrees Celsius. I'm still sticking with AMD to punish Intel for almost a decade of garbage 2% incremental speed increases when they had a monopoly, even though it's now a tossup between their CPUs. The last three years I'd say AMD had the better processors, and way more value-oriented ones; now it's equal, but I'm going with AMD still, as I don't want to reward Intel just yet for a full decade of garbage products!
> 
> I'm looking at AMD's Ryzen 6000 or 7000, whatever they call it, to see how it stacks up against Intel's latest offerings. From what I've read it's supposedly going to be available early autumn.


Honestly, I think even K versions are a waste of money nowadays. One can just buy a no-suffix base version with a good B-series motherboard, unlock the power limits and call it a day. It's much cheaper than struggling for a +100 MHz boost on a wastefully expensive K or KS chip.


----------



## Patr!ck (Apr 2, 2022)

Excellent review, W1zzard. I wish this Limited Edition chip had been tested with the recently reviewed DDR5-6400 CL32 kit, but I understand that your time was limited: https://www.techpowerup.com/review/g-skill-trident-z5-ddr5-6400-cl32-2x-16-gb/


----------



## Readlight (Apr 2, 2022)

smooth experience


----------



## ThrashZone (Apr 2, 2022)

Space Lynx said:


> those are insane temps for being on a 360mm AIO... nearly double the temp of 10th gen high end Intel chips... that is just wild...


Hi,
I missed the AIO temp.
The temps page only showed the U14S air cooler hitting 104 °C.


----------



## damric (Apr 2, 2022)

Reminds me of the Vishera FX-9590 launch, but this i9 is actually good performance. It has taken forever to get from 5GHz to 5.5GHz stock Turbo clocks. Certainly not compelling me to leave AM4 just yet.


----------



## fevgatos (Apr 2, 2022)

Exactly like the original 12900K review, there must be something wrong with your temperatures. I'm hitting 20 °C less in CB R23 with a U12A, which, on paper at least, should be worse than your U14.


----------



## Chrispy_ (Apr 2, 2022)

430W with adaptive boost. That's a lot of power for a few hundred MHz.

Also, a 12600K will let you hit your GPU bottleneck for less than half the power consumption, and if you want to render/compute fast for $800, then you can pick up a 24C/48T 2970WX for less.

I guess if you have money, don't care about efficiency, heat, noise, and want the (temporary) best CPU, then this crazy, inefficient, halo monster is going to be it for a short while.


----------



## AusWolf (Apr 2, 2022)

Chrispy_ said:


> I guess if you have money, don't care about efficiency, heat, noise, and want the (temporary) best CPU, then this crazy, inefficient, halo monster is going to be it for a short while.


If you don't care about efficiency, heat, noise, or performance, just want something that's on the top of... some chart... somewhere... then buy a used FX-9590. 

Seriously, KS processors were never meant for sane people, or any kind of home user. The 12900KS is no exception. I don't even know who they're actually meant for, to be honest.


----------



## Assimilator (Apr 2, 2022)

FFS. It's a halo product at a halo price and halo power consumption. Why is everyone getting pissy about it? Do you get pissy about the existence of supercars too?


----------



## altermere (Apr 2, 2022)

"It's irresponsible performance" ©
btw, how did the 6700K sneak up in there? It's $240 to boot; I recently sold mine for $120.


----------



## HD64G (Apr 2, 2022)

Assimilator said:


> FFS. It's a halo product at a halo price and halo power consumption. Why is everyone getting pissy about it? Do you get pissy about the existence of supercars too?


Since it should remind most old PC tech followers of the situation with Intel's P4EE (which was made to win over the Athlon 64 at any cost): what was there to praise Intel for back then that will make us do so for this one?


----------



## mama (Apr 3, 2022)

Tigger said:


> 12900KS+RTX3090TI anyone?


Don't forget to add in the new power supply.


----------



## Jism (Apr 3, 2022)

damric said:


> Reminds me of the Vishera FX-9590 launch, but this i9 is actually good performance. It has taken forever to get from 5GHz to 5.5GHz stock Turbo clocks. Certainly not compelling me to leave AM4 just yet.



The FX-9590 was a halo product. An engineering statement that 5 GHz was possible, and probably the first chip able to achieve it. However, past 5 GHz it really gets difficult. You can tell by the temps this thing runs into. And with a max of 115 degrees, it's berserk to think you're close to the limits before the chip actually starts to melt down from its own IHS or packaging.

Intel needs to work on finding ways to lower the temperature. The heat density is so intense you can't cool it properly anymore, not even with a 360 mm rad.

Running at 115 degrees long-term isn't good for the chip or the board either, especially with such power consumption. The 5800X / 5950X needs half of that, and you still have more threads available. Easy take.


----------



## Undertoker (Apr 3, 2022)

I'm a 54-year-old undertaker who has been PC gaming and a PC enthusiast for about 35 years now.
I have no other vices than my PC, tbh.
I like having the very best, and at the moment that is still, for me, Intel and Nvidia - but only just.
I won't be buying a 3090 Ti, but I've ordered this.
But I do have concerns with the direction it's going in.
I've noticed a big change in the last couple of generations, and it is power draw; as I said in an earlier post, the heat generated from this power draw means you end up gaming in your pants - it really is that hot.

I genuinely believe this may be my last Intel and Nvidia purchase, and that is solely due to power draw.
I'm at the limit of what I want to use power-wise now.

I'd much rather see the focus be on maintaining the current performance but solely on how to do it with less power - for me that would be a real achievement and something well worth buying, and I'd buy it.

As for the bitterness and resentment I see in some posts, I'd simply say that this is a halo product - it isn't for economy or value.
The K variant was already pushing the boundaries of this gen, and the KS, I'd agree, is simply a cocktail of pushing it further and a money grab, as it always is - it's halo.
So "value" simply is not a consideration here.
People should know that, but apparently still don't?
In the same way you wouldn't expect a Ferrari to be particularly good on petrol, this high-end halo CPU was never going to be any exception - that's what halo is.

But that said, I would hope that moving forward these companies will begin to realise that silly amounts of power draw are taking the shine off things now, and that it is time to focus solely on that.
As a PC enthusiast and gamer of 35 years, I'd suggest we have the performance we need and dreamt of; we now need to focus solely on getting it at a lot less power draw.
That needs to be the goal now, moving forward.


----------



## qubit (Apr 3, 2022)

It's just as I thought, a tiny little performance upgrade for a hefty price increase and significantly more power usage. In other words, a complete waste of money. Better wait for the next gen if you really want to get better performance. And that's what, just a few months away now?

I already don't like this hybrid architecture for a desktop CPU, and the fact that apps can get mis-scheduled, such as MySQL in the testing, would make me constantly wonder if that's happened and I'm getting lower performance because of it. I don't like uncertainty like that, and for this reason I'd get the 12600, which has 6 P-cores and no E-cores, and save a load of cash too. I don't think the scheduling will ever be 100%, since the criteria for scheduling inherently have some vagueness to them.


----------



## progste (Apr 3, 2022)

The idea that "it's a halo product so it shouldn't be criticized" seems quite absurd.


----------



## Assimilator (Apr 3, 2022)

progste said:


> The idea that "it's a halo product so it shouldn't be criticized" seems quite absurd.


Do me a favour - read every post in this thread that criticises this CPU, and count how many say anything different to the cons in W1zzard's review. I'll save you the trouble: the answer is zero.

That's not criticism, that's mindless repetition. That's dogpiling for the sake of dogpiling. That is, to use a phrase I despise, virtue signalling. And it adds absolutely _nothing_ of value to the discussion around this CPU; in fact it just adds noise, and in doing so _detracts_ from any real discussion.

There's an old saying "if you can't say anything nice, don't say anything at all". I wish more people would substitute "useful" for "nice".


----------



## Solid State Brain (Apr 3, 2022)

If anything, since this is such a highly binned CPU, it should be able to run more efficiently at lower/moderate power levels than the 12900K, 12700K and lower models. Even without operating it at unlimited power or 115 °C, you could still keep the awesome few-threads performance, or possibly improve it by another 100-200 MHz with good cooling and a thermal velocity boost overclock (TVB OC). Of course most end users will simply want to run it to the max all-around to justify the purchase price.


----------



## msimax (Apr 3, 2022)

i have a 5950x sitting next to a 12900k and they're best of friends lol


----------



## Why_Me (Apr 3, 2022)

progste said:


> That 5950x looks better with every competitor review.


Not for gaming.


----------



## Selaya (Apr 3, 2022)

to be fair buying anything above like, a 12700k's pointless for gaming anyways so there's that


----------



## watzupken (Apr 4, 2022)

This to me is a meaningless product from Intel, as usual just for the sake of winning the performance crown and stroking their ego. Not only did it win the performance crown, it also won the most-power-hungry and hottest-CPU awards. With this sort of heat and power requirement, it is not going to win Intel much business in the enterprise space, nor win Apple back (since it is clear that they are trying to outdo Apple's M1).



Selaya said:


> to be fair buying anything above like, a 12700k's pointless for gaming anyways so there's that


I agree. In fact for gaming, an i5 is more than enough.


----------



## cyberloner (Apr 4, 2022)

win almost everything even with the power usage xD


----------



## _JP_ (Apr 4, 2022)

History seemingly repeats itself, despite the more advanced new µArch. @W1zzard's first paragraph oddly recalls the launch of the Pentium Extreme Edition (a halo product at 130 W, back when there were few air coolers rated even for 95 W+, and EIST was new). And I believe this comparison will feel very apt when the 5800X3D launches.
Quoting Wiki:

Comparison to Athlon 64 X2

The competing AMD Athlon 64 X2, although running at lower clock rates and lacking Hyper-threading, had some significant advantages over the Pentium D, such as an integrated memory controller, a high-speed HyperTransport bus, a shorter pipeline (12 stages compared to the Pentium D's 31), and better floating point performance,[11] more than offsetting the difference in raw clock speed. Also, while the Athlon 64 X2 inherited mature multi-core control logic from the multi-core Opteron, the Pentium D was seemingly rushed to production and essentially consisted of two CPUs in the same package. Indeed, shortly after the launch of the mainstream Pentium D branded processors (26 May 2005) and the Athlon 64 X2 (31 May 2005), a consensus arose that AMD's implementation of multi-core was superior to that of the Pentium D.[12][13] As a result of this and other factors, AMD surpassed Intel in CPU sales at US retail stores for a period of time, although Intel retained overall market leadership because of its exclusive relationships with direct sellers such as Dell.[14]


----------



## Vayra86 (Apr 4, 2022)

BorisDG said:


> Are your sure?  Seems exactly like joke.



Blame the product not the review 



Assimilator said:


> FFS. It's a halo product at a halo price and halo power consumption. Why is everyone getting pissy about it? Do you get pissy about the existence of supercars too?



In this day and age? Yes

We also get pissy about Bezos shooting himself into space for no reason. These products fall into that category: meant for nothing except e-peen at the expense of the planet. When we can see that we're going to have to cut deep into our own lives to keep things habitable, products like these simply have no right to exist.

Besides, why are you so pissy about other people having their opinion about a product? Implying it's not valid 'because there are other things'?

Live and let live goes both ways... and this e-waste is certainly not taking that principle into account


----------



## Warigator (Apr 4, 2022)

Releasing new products that are only 2-4% faster than previous ones makes no sense. Just keep selling what you had already released.


----------



## ratirt (Apr 4, 2022)

$150 for literally a few % more, and the power consumption is hard not to notice, or hard not to be concerned about even if you were blind. Yet people still praise it because this is the top model, all but saying that it's supposed to be like that.
I remember the same people bashing AMD over the $50 price difference between the 3700X and the 5800X, a new architecture nonetheless. And of course the heat the 5800X produced (80°C, if I remember correctly?) was unacceptable, even though the performance jump from the 3700X to the 5800X was extraordinary, especially in games. Now look at Intel's top-tier CPU: temps hotter than frying oil (with a top cooling solution), power draw sky-high, and somehow it's still OK; more than that, they claim it should be that way now. Obviously the argument is that in gaming it isn't that hot and doesn't draw that much power. It's just funny though.


----------



## chrcoluk (Apr 4, 2022)

A nice portable micro-heater that also plays games.


----------



## Arcdar (Apr 4, 2022)

Wizz, thanks for being the perfect German  ...

even if this wouldn't be a great review (your love for detail is just "adorable" in a very positive way  ) I'd celebrate you "hard" just for this comment

"*In case you're wondering, this is a proper review, not an April Fools' prank."*

 ... Man, I read it only today because of time restrictions but this alone just made my day . Thanks for the laugh


----------



## The red spirit (Apr 4, 2022)

Undertoker said:


> Waiting for my KS to arrive , I bought the 9900KS and never regretted it for a moment.
> I’m not as keen on the recently announced 3090ti though, I’m already gaming in my pants after a couple of hours running a 3090 OC.- the heat is atrocious on a 3090, so a 3090ti will clearly be a lot worse, then there is the power draw….
> I think they need to start thinking very seriously about limiting power draw before governments legislate and ruin it - as they do everything else.
> They are robbing us blind now across Europe for electricity and this comes as british MP’s now get their electricity paid for by us the tax payers now on top of a £2200 pay rise, so clearly it doesn’t matter them, but “we are all in it together“ eh…


Ain't nobody is robbing you in Europe, at least not everywhere. VAT was removed for electricity and Lithuania said no to Russian gas, my own city makes more electricity than it consumes. Anyway, you can't expect a place that is not resource rich, to make tons of electricity and for cheap. And by the way California made some restrictions for wattage and it's totally okay. We don't need trash like 3090 Ti, Jensen and Lisa too long gave zero fucks about heat output and power usage, so it's all good that it bites them in the arse. Not sure about others, but I fucking hate, when GPU is scorching my legs under desk and is as loud as leafblower. If our future is requirement for AC, if you want to play games, then Jensen can stick his RTX 4090 up his arse.


----------



## fevgatos (Apr 4, 2022)

The red spirit said:


> Ain't nobody is robbing you in Europe, at least not everywhere. VAT was removed for electricity and Lithuania said no to Russian gas, my own city makes more electricity than it consumes. Anyway, you can't expect a place that is not resource rich, to make tons of electricity and for cheap. And by the way California made some restrictions for wattage and it's totally okay. We don't need trash like 3090 Ti, Jensen and Lisa too long gave zero fucks about heat output and power usage, so it's all good that it bites them in the arse. Not sure about others, but I fucking hate, when GPU is scorching my legs under desk and is as loud as leafblower. If our future is requirement for AC, if you want to play games, then Jensen can stick his RTX 4090 up his arse.


I dont understand your post. You dont need a 3090ti or a 4090 to play games. Unless im missing something, there are cards starting from 75w tdp. Buy one of those? You know nobody is forcing you to buy a 300 or 500 watt gpu, right? Right??


----------



## AusWolf (Apr 4, 2022)

fevgatos said:


> I dont understand your post. You dont need a 3090ti or a 4090 to play games. Unless im missing something, there are cards starting from 75w tdp. Buy one of those? You know nobody is forcing you to buy a 300 or 500 watt gpu, right? Right??


That's true, although midrange parts have also been creeping up in power consumption. The 960 and 1060 ate 120 W, the 2060 160 W and the 3060 180 W if I remember right (not to mention that low profile / no power connector options have completely disappeared). The same is true for CPUs. The 7700 I had just about fit into its 65 W TDP running at advertised 4 GHz all-core, but my 11700 can only do 2.8 GHz in Cinebench R23 within the same 65 W limit. It needs liquid cooling and a good motherboard that can supply it with 160+ W to reach its factory turbo bins (it's a locked CPU, so overclocking isn't even in the picture). One can praise AMD for their recent innovations, but their chiplet design isn't easier to cool at all. I briefly tried a R5 3600 which surprisingly ran hotter than my 11700 with the same cooler and power limits. These are all midrange parts...


----------



## ThrashZone (Apr 4, 2022)

Hi,
Why do i keep hearing voices


----------



## Solid State Brain (Apr 5, 2022)

AusWolf said:


> The 7700 I had just about fit into its 65 W TDP running at advertised 4 GHz all-core, but my 11700 can only do 2.8 GHz in Cinebench R23 within the same 65 W limit.



The i7-7700 was a 4-core processor, the i7-11700 an 8-core processor. Both were made on a 14nm lithography; performance can only improve so much without further shrinking transistor size and more fundamental architectural changes. Try seeing what happens by limiting the 11700 to 4 cores.


----------



## AusWolf (Apr 5, 2022)

Solid State Brain said:


> The i7-7700 was a 4-core processor, the i7-11700 an 8-core processor. Both were made on a 14nm lithography; performance can only improve so much without further shrinking transistor size and more fundamental architectural changes. Try seeing what happens by limiting the 11700 to 4 cores.


A good point, although GPUs have been shrinking in lithography, but that doesn't reflect on their power consumption. I guess nvidia and AMD are trying to get so much performance out of everything they sell that efficiency gets thrown out of the window regardless of lithography.


----------



## The red spirit (Apr 5, 2022)

fevgatos said:


> I dont understand your post. You dont need a 3090ti or a 4090 to play games. Unless im missing something, there are cards starting from 75w tdp. Buy one of those? You know nobody is forcing you to buy a 300 or 500 watt gpu, right? Right??


Pal, I don't think that anyone needs that. 75 watt parts are scarce, the last card with that was RX 560 or GTX 1650. They are too old and too weak for me to care. Anyway, I already have a card I need, I'm just not terribly excited about evolution of GPUs, as they tend to get worse and worse in everything else, other than absolute power. Performance per watt seems to be stagnant.


----------



## fevgatos (Apr 5, 2022)

AusWolf said:


> That's true, although midrange parts have also been creeping up in power consumption. The 960 and 1060 ate 120 W, the 2060 160 W and the 3060 180 W if I remember right (not to mention that low profile / no power connector options have completely disappeared). The same is true for CPUs. The 7700 I had just about fit into its 65 W TDP running at advertised 4 GHz all-core, but my 11700 can only do 2.8 GHz in Cinebench R23 within the same 65 W limit. It needs liquid cooling and a good motherboard that can supply it with 160+ W to reach its factory turbo bins (it's a locked CPU, so overclocking isn't even in the picture). One can praise AMD for their recent innovations, but their chiplet design isn't easier to cool at all. I briefly tried a R5 3600 which surprisingly ran hotter than my 11700 with the same cooler and power limits. These are all midrange parts...


You can always buy a lower-end model though. I mean, if you had a 1060 and you want to remain at the 120 W TDP, you don't need to buy a 3060; you can go for the 3050 (that's also 120 W TDP). I think the problem is that people want the performance of a 3090 with the TDP of a 3050, which of course can't happen, and I don't think that's Jensen's or Lisa's fault.



The red spirit said:


> Performance per watt seems to be stagnant.


I don't think that's true. There isn't a huge progress, but there is progress. For example a 6600 is 30% faster than a 1070 at lower TDP, the 6600xt is 60% faster at the same TDP. So we have a 60% increase in performance / watt compared to 2017. Not great, not terrible either.
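That arithmetic is easy to sanity-check. A quick sketch (the 60% and same-TDP figures are the post's claims, not measured values):

```python
def perf_per_watt_gain(perf_ratio: float, power_ratio: float) -> float:
    """Relative perf/W of a new card vs an old one, where
    perf_ratio  = new_fps   / old_fps
    power_ratio = new_watts / old_watts."""
    return perf_ratio / power_ratio

# The post's example: RX 6600 XT vs GTX 1070, ~60% faster at
# (roughly) the same TDP, i.e. power_ratio ~ 1.0:
gain = perf_per_watt_gain(1.60, 1.00)
print(f"{(gain - 1) * 100:.0f}% better perf/W")  # prints "60% better perf/W"
```

If the power ratio were above 1.0 (new card hungrier), the perf/W gain would shrink accordingly, which is why same-TDP comparisons are the cleanest.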


----------



## ratirt (Apr 5, 2022)

fevgatos said:


> You can always buy a lower end model though. I mean if you had a 1060 and you want to remain at the 120W tdp, you don't need to buy a 3060, you can go for the 3050 (that's also 120w tdp). I think the problem is, people want the performance of a 3090 with the TDP of a 3050, which of course can't happen and I don't think that's Jensen's or Lisa's fault.


That's not the problem. From what people are saying and are concerned about, the problem is that each new gen offers more performance, but the power is up as well. The nodes are more advanced and keep shrinking, yet power consumption grows. Where you once had 75 W cards capable of playing games at a decent resolution and detail level, now you need twice that much power for gameplay worth your attention.


fevgatos said:


> I don't think that's true. There isn't a huge progress, but there is progress. For example a 6600 is 30% faster than a 1070 at lower TDP, the 6600xt is 60% faster at the same TDP. So we have a 60% increase in performance / watt compared to 2017. Not great, not terrible either.


You should compare graphics cards from the same vendor but different architectures, as that is the real indicator of power-consumption or performance-per-watt progress from one architecture release to the next. If you want to compare Nvidia's 1070, compare it to Turing or Ampere, not to AMD's RDNA2, since that is a different company with totally different products.


----------



## Solid State Brain (Apr 5, 2022)

What exactly are the detail level and resolutions that older 75W GPUs were capable of at decent levels, and at what framerate? I think that the view that there has been no improvement at the very least disregards that the commonly used display resolutions on desktop systems have increased over the years, as well as general demand for graphical detail.

I find it highly unlikely that nodes shrank without corresponding efficiency increases. There's no question that peak GPU power has been steadily increasing, but people need to tweak the power limit not just upward for overclocking but also downward for efficient operation, and ask reviewers to test this, not only peak performance as usual. Same story as with CPUs, really.


----------



## The red spirit (Apr 5, 2022)

fevgatos said:


> I don't think that's true. There isn't a huge progress, but there is progress. For example a 6600 is 30% faster than a 1070 at lower TDP, the 6600xt is 60% faster at the same TDP. So we have a 60% increase in performance / watt compared to 2017. Not great, not terrible either.


There's such a comparison (even if it's flawed, since you're comparing an ancient card with a brand-new one; compare RDNA with RDNA2, or Polaris with RDNA2), and then there's the "what can I actually buy" question. The current line-up has disproportionately many high-wattage cards and too few low-wattage ones. AMD doesn't even have anything at 50 or 75 watts; nVidia doesn't have anything below 130 watts. I know it's personal, but my upper TDP limit is around 130 watts, and I also like 100-watt cards. That used to be the xx60 model's power usage. AMD only has the 6600; the 6500 XT could have been okay in perf/watt if it weren't a recycled laptop chip that somehow lacks features cards from the early 2010s had. The only choice I have is either the 3050 or the 6600. That's not much of a choice, and the 3050 sucks donkey balls: it is slower than the 6600, uses more power, and costs the same or more. The 6600 is nothing special either, frankly just an okay card at too high a price. I remember buying an RX 560 4GB; in its day it ran games at 1440p, 50-60 fps, medium-high settings, cost 150 Euros and consumed just 37 watts (since I accidentally got the version without the 6-pin connector). Imagine doing that today; it's not the same anymore. Your only option would be a detuned 1650.



Solid State Brain said:


> What exactly are the detail level and resolutions that older 75W GPUs were capable of at decent levels, and at what framerate?


It depends, but the 7000-series Radeons in their day were capable of 1080p medium-high at 45-50 fps. The RX 560 I have is the 37.5-watt model, and when it was new it could run CoD WW2 and GTA 5 at 1440p medium-high at 50-60 fps. The 1050 is essentially an RX 560, but a little faster and with only 2 GB of VRAM; it could still have been a decent 1080p card. They certainly weren't awful e-waste.



Solid State Brain said:


> I think that the view that there has been no improvement at the very least disregards that the commonly used display resolutions on desktop systems have increased over the years, as well as general demand for graphical detail.


That's a bit of a farce. Even 5 years ago, nobody running games at ultra was really thinking "yeah, this game looks like ass, I want more". At this point graphics-quality improvements barely matter anymore; it's hard to tell high from ultra, sometimes even medium from ultra. Games don't look bad, and haven't for pretty much a decade now, so that's not terribly important. On a lower budget you're happy to run games at medium, not ultra or high, so that's an irrelevant argument too. And it's a farce mostly because devs never ask us if we want that, they just "improve". At this point you get more and more bloat rather than legit improvements to graphics or anything else. My main reason for upgrading my last card wasn't that it sucked, but that the lack of talent among developers left me no other choice. It's so bad that I can run a decade-old game at, say, high settings at 1440p on an RX 560 and it looks good and runs well, but on the same card today the latest AAA title runs at 900p low with barely acceptable fps. I don't see how that's better or improved. Sounds like bullshit to me. Even if a current AAA title ran well at the same resolution, I don't think I could tell that the new game at low settings actually looks better than the old one at high settings. If anyone actually cared about games running well, nVidia and AMD would be out of business.


----------



## Shatun_Bear (Apr 5, 2022)

There will be some poor sap out there that will pair this 300W CPU with a 450W 3090 Ti and wonder why he's sitting there sweating when loading up the menu of Fortnite or other life wasting mainstream game.


----------



## ratirt (Apr 5, 2022)

Solid State Brain said:


> What exactly are the detail level and resolutions that older 75W GPUs were capable of at decent levels, and at what framerate? I think that the view that there has been no improvement at the very least disregards that the commonly used display resolutions on desktop systems have increased over the years, as well as general demand for graphical detail.
> 
> I find highly unlikely that nodes shrunk without corresponding efficiency increases. There's no question that peak GPU power has been steadily increasing, but people need to tweak the power limit not just for overclocking, but also downward for efficient operation and ask reviewers to test this, not only peak performance as usual. Same story as with CPUs, really.


Maybe that is another thing you can compare? Why do you ask me about it? I'm just saying, and I agree with others, that there were cards you could use to play games at a decent resolution and detail level (for the time).
With a node shrink, you can spend the gains on more power or performance rather than efficiency, and that seems to be the trend for graphics cards: put a large emphasis on the performance increase rather than on efficiency, or on any balance of the two.
If you count performance as an improvement, sure, it has increased, no doubt, but the power consumption has increased as well.


----------



## fevgatos (Apr 5, 2022)

The red spirit said:


> I know it's personal, but my personal upper TDP limit is around 130 watts, but I also like 100 watt cards.


They say vote with your wallet. You bought an RX 580 that draws 180 W instead of a 1060 that draws 120 W. If everyone who cares about power consumption did what you did, no wonder there aren't many low-power cards, right? I mean, why would Nvidia stick the xx60 series to 120 W when even people who care about power consumption (like you) opted for an RX 580 instead?



Shatun_Bear said:


> There will be some poor sap out there that will pair this 300W CPU with a 450W 3090 Ti and wonder why he's sitting there sweating when loading up the menu of Fortnite or other life wasting mainstream game.


I'm pretty sure my 12900K doesn't draw 300 W on a Fortnite loading screen. It barely hits 80-90 W during games, so yeah... NOPE


----------



## The red spirit (Apr 5, 2022)

fevgatos said:


> They say vote with your wallet. You bought an rx 580 that draws 180w instead of a 1060 that draws 120w. If everyone that cares about power consumption did what you did, no wonder there aren't many low power consumption cards, right? I mean why would nvidia stick the xx60 series to 120w when even people that care about power consumption (like you) opted for an rx580 instead?


Because I'm not Mister Moneybags, mate. I saw a good deal and took it, but here's the thing: I was also willing to modify my card's vBIOS to my liking, and now it's capped to 100 watts and an 1100 MHz core clock. In practice the card's actual power consumption varies, 70-90 watts in gaming. With such a tune I can only reach 100-watt power usage in very specific mining or distributed-computing situations, maybe in Furmark too. Also, my specific card wasn't 180 watts; it's a 145-watt card from the factory and in most conditions actually consumed only 130-140 watts. Why does that happen? Because the card hits a clock-speed and voltage wall, and games never utilize GPU resources perfectly. You probably also didn't know that Polaris cards are phenomenal undervolters; I tried that out but didn't stick with it. There are good gains, but alone they weren't good enough for me. So I ended up with a card that is faster than an RX 570 but slower than an RX 480, while consuming less power than a GTX 1060; it's slightly slower than a GTX 1060, but performance per watt is higher. So yeah, I'm a fucking menace to the free market and to reasonable cards. Anyway, where I live, the RX 580 I bought was selling for 210 Euros new, while the GTX 1060 6GB was going for over 300 Euros. As you see, I don't like being robbed by Jen-chan and took proper action against that. Unfortunately, AMD has since clamped down on vBIOS modifications and you can't do it in a reasonable way anymore, so next time I will care much more about the official wattage rating, and I won't buy the lowest-end model of a card either.

Edit:
A bit off-topic, but my whole PC with CPU and GPU mining at same time, uses only 230 watts or less. In gaming it uses 170-210 watts. I think I managed to reach 250-260 watts with prime95 small FFTs and Furmark at same time. At idle it sips 45-50 watts. When turned off, it consumes 0.2 watts. Come on dude, your i9 uses more power than my whole PC, you have some audacity to knock on its power usage.


----------



## fevgatos (Apr 5, 2022)

The red spirit said:


> Because I'm not mister money bags mate. Saw good deal and took it, but here's a thing. I also was willing to modify my card's vBIOS to my liking and now it's capped to 100 watts and 1100 MHz core clock. In practice, actual power consumption of card varies and is 70-90 watts in gaming. With such tune, I can only reach 100 watt power usage in very specific mining or distributed computing situations, maybe in Furmark too. Also my specific card wasn't 180 watts, it's 145 watt card from factory and in most conditions actually only consumed 130-140 watts. Why that happens? Because card hits clock speed, voltage wall and games don't utilize GPU resources ideally ever. Also you didn't really know that Polaris cards are phenomenal undervolters, I tried that out, but didn't stick with it. There are good gains, but alone weren't good enough for me. So I ended up with card, that is faster than RX 570, but slower than RX 480, while consuming less power than GTX 1060 and is slightly slower than GTX 1060, but performance per watt is higher. So, yeah I'm fucking menace of free market and of reasonable cards. Anyway, where I live, RX 580 which I bought was selling for 210 Euros new, meanwhile GTX 1060 6GB was going for over 300 Euros. As you see, I don't like to be robbed by Jen-chan and took proper action against that. Unfortunately, AMD clamped down on vBIOS modifications and now you can't do it in reasonable way anymore, so next time I will care about official wattage rating much more and next time I won't buy lowest end model of card either.
> 
> Edit:
> A bit off-topic, but my whole PC with CPU and GPU mining at same time, uses only 230 watts or less. In gaming it uses 170-210 watts. I think I managed to reach 250-260 watts with prime95 small FFTs and Furmark at same time. At idle it sips 45-50 watts. When turned off, it consumes 0.2 watts. Come on dude, your i9 uses more power than my whole PC, you have some audacity to knock on its power usage.


I'm not knocking your PC's power usage, I'm knocking you pretending to care about it when you bought an inefficient card.  Yes, you can tune it, but so can you tune the 1060, and then you're back to square one. I don't know about shops in your country, but the 1060 was always cheaper than the RX 580, since it's also a year older. The 580 came out in 2017; by that time 1060s in my country were 230-250.


----------



## The red spirit (Apr 5, 2022)

fevgatos said:


> Im not knocking on your pc's power usage, im knocking on you pretending to care about it when you buy an inefficient card  Yes you can tune it, but so can you the 1060 and you are back to square one.  I don't know about shops in your country but the 1060 was always cheaper than the rx580, since its' also 1 year older. The 580 came out in 2017, by that time 1060's in my country were 230-250.


The RX 580 is just the better card overall. Efficiency matters, but so do price and longevity. The 1060 has 6 GB of VRAM, the RX 580 has 8 GB and will last longer. The 1060 was stupidly uneconomical to buy; again the RX 580 wins. The 1060 had a slightly lower stock TDP, so the 1060 wins there, but you make it sound like the RX 580 is complete garbage in that respect. The difference is 30 watts: not nothing, but not a lot either. If you claim you aren't knocking my computer's power usage, how come you argue so much about the 1060? Especially when I clearly state that it was stupidly priced and uncompetitive. Also, the 1060 didn't really come out earlier; the RX 580 was a refreshed RX 480, which came out in 2016. There's no architectural difference, just slightly better yields with the efficiency thrown out of the window. I managed to make an RX 480's vBIOS work on an RX 580, but figured that even the RX 480 wasn't efficient enough for me.

If you still haven't got the memo, I don't care about your country and its prices; I don't live there. You can't even write which currency those numbers are in. It was what it was in Lithuania, and the only 1060 you could buy for a similar price to the 580 was that 3GB scam version, which was DOA. As if the VRAM situation wasn't insulting enough, they also cut down the core. Again, it was uncompetitive with the RX 570, let alone the RX 580.


----------



## AusWolf (Apr 6, 2022)

Solid State Brain said:


> I find it highly unlikely that nodes shrank without corresponding efficiency increases. There's no question that peak GPU power has been steadily increasing, but people need to tweak the power limit not just for overclocking, but also downward for efficient operation and ask reviewers to test this, not only peak performance as usual. Same story as with CPUs, really.


GTX 1080: 16 nm, 180 W.
RTX 2070: 12 nm, 175 W.
RTX 3060: 8 nm, 170 W.

All three offer roughly the same performance at roughly the same power requirement. Where's the advantage of the shrunk node? And why should I have to tweak my power limits when graphics cards offered decent efficiency out of the box a couple of years ago?



Solid State Brain said:


> What exactly are the detail level and resolutions that older 75W GPUs were capable of at decent levels, and at what framerate? I think that the view that there has been no improvement at the very least disregards that the commonly used display resolutions on desktop systems have increased over the years, as well as general demand for graphical detail.


My 1050 Ti kicked ass with passive cooling and no power connector. Which current gen midrange graphics card can do the same?


----------



## Mussels (Apr 6, 2022)

How is this worth it to anybody other than people whose literal job is benchmarking and overclocking?



AusWolf said:


> GTX 1080: 16 nm, 180 W.
> RTX 2070: 12 nm, 175 W.
> RTX 3060: 8 nm, 170 W.
> 
> ...


the 3060 is a good 20-25% faster than the 1080 - 25% faster with the same power consumption is exactly what you want from a product two generations newer - an improvement.


----------



## AusWolf (Apr 6, 2022)

Mussels said:


> the 3060 is a good 20-25% faster than the 1080 - 25% faster with the same power consumption is exactly what you want from a product two generations newer - an improvement.


That's nothing considering that the 1060 6 GB is nearly twice as fast as the 960 with the same power consumption - THIS is what I want from a product ONE generation newer.

Or let's just continue playing the "20% improvement" game so I can keep my 2070 longer.


----------



## InVasMani (Apr 6, 2022)

It's the high-end halo cards among GPUs that represent the biggest problem, just like the 12900KS: the die space required, the worse yields that come with that die space, and, for GPUs, the additional VRAM and circuitry requirements. These things all diminish the likelihood of adequate supply, and drive up the cost, of the lower-tier cards below them that could otherwise be produced more affordably and abundantly. What's needed is fewer SKUs, quicker architecture turnaround in product cycles, and less outrageous halo-tier products. Standards to follow and adhere to are overdue; it's high time we saw higher efficiency standards for GPUs, CPUs and PSUs across the spectrum, toward something more sustainable, environmentally friendly and broadly accessible.

GPUs should go through a generation of performance progress followed by a generation of efficiency progress, on a tick-tock cycle: don't exceed the previous generation's power draw in the subsequent generation, which would be used as an optimization-focused one that also brings down cost and improves availability for everyone. That's more environmentally sound and better from a societal-fairness standpoint as well. The tech industry doesn't need to emulate the muscle-car industry; there have already been enough consequences from poor decision-making in the auto industry, and the tech industry doesn't need to compound that problem with its own set of misguided decisions.


----------



## fevgatos (Apr 6, 2022)

The red spirit said:


> RX 580 is just better card overall. Efficiency matters, but so does price, longevity. 1060 has 6GB VRAM, RX 580 has 8GB and it will last longer. 1060 was stupidly uneconomical to buy, again RX 580 wins. 1060 had slightly lower TDP stock, 1060 wins, but you make it sound like RX 580 is just complete garbage in that aspect. The difference is 30 watts. Not nothing, but not a lot either. If you claim that you aren't knocking my computer's power usage, how come you argue so much about 1060? Especially, when I clearly state, that it was stupidly priced, uncompetitive. Also 1060 didn't really come out earlier, RX 580 was refreshed RX 480, which came out in 2016. There's no architectural difference, just slightly better yield and efficiency thrown out of window. I have managed to make RX 480's vBIOS to work on RX 580, but figured, that even RX 480 wasn't efficient enough for me.
> 
> If you still haven't got the memo, I don't care about your country and its prices, I don't live there. You can't even write in which currency those numbers are. It was what it was in Lithuania and the only 1060 you could buy for similar price to 580 was that 3GB scam version, which was DOA. As if VRAM situation wasn't insulting already, they also cut down core. Again, it was incompetitive with RX 570, let alone RX 580.


https://tpucdn.com/review/msi-rx-580-mech-2/images/power_average.png

TechPowerUp shows the 580 at 198 W and the 1060 at 116 W. The difference is huge. I know because I bought a 1060 specifically for that reason: it had way lower power consumption. Both cards are too slow for the RAM to make any difference, but whatever.

As I've said, people vote with their wallet. Nvidia offered you almost twice the efficiency and you ignored it, so yeah, it makes sense they got the memo that you don't care about efficiency.


----------



## Solid State Brain (Apr 6, 2022)

AusWolf said:


> GTX 1080: 16 nm, 180 W.
> RTX 2070: 12 nm, 175 W.
> RTX 3060: 8 nm, 170 W.
> 
> All three offer relatively the same performance with relatively the same power requirement. Where's the advantage of the shrunk node? Why should I tweak my power levels when graphics cards offered decent efficiency out of the box a couple years ago?


Transistor count has steadily increased, features have increased, VRAM has increased. It's not really a fair comparison, considering that there's no real power constraint or demand for one on desktop systems. The desktop RTX 3060 is still about 20% faster than the GTX 1080, as already mentioned above.

If on the other hand you check out the mobile versions (where clocks are lower, allowing for more efficient operation due to a practical need for lower power), it becomes clearer that smaller nodes lead in principle to better efficiency, which should be an obvious statement anyway:

GTX 1080 Mobile: 150W
RTX 2070 Mobile: 115W
RTX 3060 Mobile: 80W

These should actually all be within a few % of each other, performance-wise.



> Why should I tweak my power levels when graphics cards offered decent efficiency out of the box a couple years ago?


Because most desktop gamers do not care as long as power remains within reasonable levels, and manufacturers have realized this. There's no need to artificially gimp performance when end-users can do that themselves if they want or need to.



> My 1050 Ti kicked ass with passive cooling and no power connector. Which current gen midrange graphics card can do the same?


Once current midrange GPUs come back down to the inflation-adjusted price of midrange GPUs from when your 1050 Ti was released, low-power, passive GPUs might start appearing as well.

Until then, it won't make economic sense for either manufacturers or end-users, especially since the latter can adjust power themselves, and can probably already run their cards passively or semi-passively given the massive coolers they generally come with nowadays.



AusWolf said:


> That's nothing considering that the 1060 6 GB is nearly twice as fast as the 960 with the same power consumption - THIS is what I want from a product ONE generation newer.


100% improvement after one generation is never going to happen (or: never again, if you were referring to some cases from the late 1990s-early 2000s)


----------



## fevgatos (Apr 6, 2022)

AusWolf said:


> GTX 1080: 16 nm, 180 W.
> RTX 2070: 12 nm, 175 W.
> RTX 3060: 8 nm, 170 W.
> 
> All three offer relatively the same performance with relatively the same power requirement. Where's the advantage of the shrunk node? Why should I tweak my power levels when graphics cards offered decent efficiency out of the box a couple years ago?


It's a little bit of an unfair comparison, if only because the 3060 is probably the worst Nvidia offering; it's just a bad card. If you compare the 3060 Ti to a 1080 Ti, the difference is massive: the 1080 Ti consumes 25% more while being 15% slower. That's without even including all the goodies of the 3060 Ti (DLSS, RT, etc.). So yeah...

Even a 3070 is around 35-40% faster at a lower TDP.


----------



## AusWolf (Apr 6, 2022)

Solid State Brain said:


> Transistor count has steadily increased, features have increased, VRAM has increased. It's not really a fair comparison considering that there's no real power constraint nor demand for it on desktop systems. The desktop RTX3060 still is about 20% faster than the GTX 1080 as already mentioned above.


Features and VRAM have got little to do with the whole card's power consumption, as long as you have the same number of RAM chips of the same kind. Wait, the 1060 actually has more than the 960. 



Solid State Brain said:


> If on the other hand you check out the mobile versions (where clocks are lower, allowing for more efficient operation due to a practical need for lower power), it becomes clearer that smaller nodes lead in principle to better efficiency, which should be an obvious statement anyway:
> 
> GTX 1080 Mobile: 150W
> RTX 2070 Mobile: 115W
> ...


Then there should be no reason for nvidia and AMD to shoot their desktop cards' TDPs through the roof. Sure, the extra 5% will convince people who still believe that 5% is visible while one's focus is on the game, but it also portrays their products as inefficient pieces of garbage that need gigantic coolers to run at acceptable temperatures... unless the same "all for 5%" people also believe that bigger is always better.



Solid State Brain said:


> Because most desktop gamers do not care as long as power remains within reasonable levels, and manufacturers have realized this. There's no need to artificially gimp performance when end-users can do that themselves if they want or need.


I guess I'm in the minority with my love for small form factor / passively cooled hardware. I'm happier to see a modern game run on an iGPU or old / low profile PC than to see a hundred core CPU with a 3090 in action.



Solid State Brain said:


> Once current midrange GPUs will have the same inflation-adjusted price of midrange GPUs of when your 1050Ti was released, low-power, passive GPUs might start appearing as well.


I hope you're right.



Solid State Brain said:


> Until then, this won't make economically sense neither for manufacturers nor end-users, also given that the latter can adjust power themselves and probably already run their cards passively or semi-passively given the massive coolers they generally come up with nowadays.


Actually, I think we live in a time when it makes perfect sense. Low-power cards need less electricity, which isn't only good for the green movement, but also counteracts rising energy prices. They also need smaller heatsinks, which are cheaper to manufacture. Nvidia / AMD shouldn't blame the high price of their products on resource costs when they themselves design them to need bigger coolers and beefier VRMs than they practically should. I mean, sure, copper and aluminium are expensive, but who said you must have 5 kg of them on your graphics card when, with a little (factory) tweaking, it would work just fine with a lot less? Also, having fewer fans (or no fans) on your graphics card significantly decreases the chance of failure, which also decreases the amount of e-waste on the planet.



Solid State Brain said:


> 100% improvement after one generation is never going to happen (or: never again, if you were referring to some cases from the late 1990s-early 2000s)


Then maybe generations shouldn't come so soon after each other, either. At least to me, it doesn't make any sense to release a product that's barely better than the last one.



fevgatos said:


> It's a little bit of an unfair comparison, only because the 3060 is probably the worst Nvidia offering. It's just a bad card. If you compare the 3060ti to a 1080ti, the difference is massive. Τhe 1080ti consumes 25% more while being 15% slower. That's without even including all the goodies of the 3060ti (dlss / rt etc.). So yeah...
> 
> Even a 3070 is around 35-40% faster at lower tdp..


True. Though x80 Ti cards have always been halo products. Only nvidia decided to change that to x90 with Ampere - or decided to release several halo products within the same generation (whichever way one looks at it). Also, you're comparing across two generations again. If I saw the same improvement coming from Turing, I'd say it's great.

Not that I'm complaining, though. Like I said, with baby steps like these, I can probably keep my 2070 for a long time.


----------



## fevgatos (Apr 6, 2022)

AusWolf said:


> True. Though x80 Ti cards have always been halo products. Only that nvidia decided to change that to x90 with Ampere - or just decided to release several halo products within the same generation (whichever way one looks at it). Also, you're comparing within two generational gaps again. If I saw the same improvement coming from Turing, I'd say it's great.
> 
> Not that I'm complaining, though. Like I said, with baby steps like these, I can probably keep my 2070 for a long time.


Well, if there were no DLSS or RT I wouldn't replace my 1080 Ti with a 3070, for example, but we have to keep in mind that the 1080 Ti was a $700 card while the 3070 was a $500 card. So two gens for 40% more rasterization performance, a lower price, lower power consumption, and RT + DLSS isn't bad at all in my opinion.

Just to give you an example though, my tuned (watercooled) 1080 Ti was basically half (and sometimes less than that!!) the performance of my 3090, while the former was hitting 250 W power consumption and the latter ~400 W. It's an okay improvement in terms of performance / watt.
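For anyone who wants to check that claim, here's the arithmetic spelled out (the figures are the self-reported ones from this post, not measurements):

```python
# Performance per watt = relative performance / power draw (W).
# Post's figures: tuned 1080 Ti ~250 W as baseline, 3090 ~400 W at ~2x the fps.
def perf_per_watt(rel_perf, power_w):
    return rel_perf / power_w

old = perf_per_watt(1.0, 250)  # 1080 Ti baseline
new = perf_per_watt(2.0, 400)  # 3090: twice the performance, 400 W
gain = new / old - 1           # fractional improvement in perf/watt
print(gain)                    # 0.25 -> ~25% better performance per watt
```

So "twice the performance at 1.6x the power" works out to roughly a 25% perf/watt gain, which is why "okay improvement" is a fair description.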


----------



## AusWolf (Apr 6, 2022)

fevgatos said:


> Well if there was no DLSS or RT I wouldn't replace my 1080ti with a 3070 for example, but we have to keep in mind that the 1080ti was a 700$ card while the 3070 was a 500$ card. So 2 gens for 40% more rasterization performance, cheaper price point, lower cost, lower consumption and RT + DLSS isn't bad at all in my opinion.


I would agree if the 3070 weren't a $500 card only on paper. And again, we're talking about two generations here. There is a lot less improvement coming from Turing (basically nothing).

Ampere should have been named "Turing Refresh" in my opinion.


----------



## InVasMani (Apr 6, 2022)

AusWolf said:


> Features and VRAM have got little to do with the whole card's power consumption, as long as you have the same number of RAM chips of the same kind. Wait, the 1060 actually has more than the 960.
> 
> 
> Then there should be no reason for nvidia and AMD to shoot their desktop cards' TDPs through the roof. Sure, the extra 5% will convince idiots people who still believe that 5% is visible while one's focus is on the game, but it also portrays their products as inefficient pieces of garbage that need gigantic coolers to run at acceptable temperatures... unless the same "all for 5%" people also believe that bigger is always better.
> ...


I agree with most of this, though I feel a tick-tock cadence alternating between performance and optimization architectures is the way to go. Keep a ceiling threshold in mind for the performance architecture and don't budge beyond it; then, during the optimization architecture, bump up efficiency while scaling down power draw and reducing the cost of the previous generation. In particular, scaling down the highest-tier SKUs should be possible thanks to advancements in areas like VRAM speeds and capacities, along with node transitions, refinement, and better yields over time. I think racing to chase performance beyond a very unfavorable tipping point, at the expense of power draw, yields, and added material costs, is a clear mistake that hurts the majority of consumers and the environment in the process. Say what you will, but I don't consider it very wise to feed said wolf.


----------



## ratirt (Apr 6, 2022)

fevgatos said:


> Well if there was no DLSS or RT I wouldn't replace my 1080ti with a 3070 for example, but we have to keep in mind that the 1080ti was a 700$ card while the 3070 was a 500$ card. So 2 gens for 40% more rasterization performance, cheaper price point, lower cost, lower consumption and RT + DLSS isn't bad at all in my opinion.
> 
> Just to give you an example though, my tuned 1080ti (watercooled) was basically half (and sometimes less than that!!) the performance of my 3090, while the first one was hitting 250w power consumption and the 2nd one ~400. It's an okay improvement in terms of performance / watt.


So you want to tell me that a GPU 10x faster than a 1080 Ti while consuming 2000 W of power would have been a great achievement and an improvement, because of performance / watt?
Because your point of view strongly suggests that conclusion.


----------



## The red spirit (Apr 6, 2022)

fevgatos said:


> https://tpucdn.com/review/msi-rx-580-mech-2/images/power_average.png}
> 
> Techpowerup shows the 580 at 198w and the 1060 at 116 watts.


Bruh, that's not that model.




fevgatos said:


> The difference is huge. I know cause I bought a 1060 specifically for that reason, it had way lower power consumption. Both cards are too slow for the  ram to make any difference but whatever


lol at "too slow". More VRAM means better textures and anisotropic filtering now. In the long term, it also means that your card doesn't start to stutter sooner than it should. VRAM is a good thing to have, and as much of it as you can get.

BTW your signature shows that you have 3090, not 1060. It's also very "efficient" card.



fevgatos said:


> As i've said, people vote with their wallet, nvidia offered you almost twice the efficiency and you ignored it, so yeah, makes sense they got the memo that you don't care about efficiency.


At twice the price, with less VRAM, and while scamming people with that DOA 3 GB model. All of that says they themselves don't give a shit about opinions or the actual demand for products, and feel all good about scamming people. Power consumption for them is not a priority, just an externality of a cut-down core, that's all. And what I do is not really important when the 1060 outsold the RX 580, RX 570, RX 480 and RX 470 combined. nVidia got the memo that people buy what is heavily advertised and don't care too much about anything else. If they actually "got the memo" about power consumption, how come the 2060 consumes 30 watts more and is basically the same as the 3060? Also, nVidia hasn't mentioned a single thing about lower power usage or efficiency on their RTX 3060 page. That really shows how much they care about it. Probably a lot more than me and my 100-watt RX 580. Their own page keeps yapping about some AI shit, ray tracing, performance, creativity, drivers; meanwhile people who buy xx60-tier cards mostly care about value, and by value I mean the fps/dollar ratio, which nV doesn't even mention. That also really shows that they don't have their heads in their arses. At least Polaris was advertised on that, demoed against the 900 series, and beat them soundly in power efficiency and value for gamers. We all know the refresh was garbage, but nothing an easy vBIOS mod wouldn't fix.


----------



## fevgatos (Apr 6, 2022)

The red spirit said:


> Bruh, that's not that model.


Doesn't matter, I'm looking at the reference numbers. Are you suggesting there is an RX 580 that consumes 130 watts? LOL



The red spirit said:


> lol at too slow. More VRAM, means better textures and anisotropic filtering now. For long term, it also means that your card doesn't start to stutter sooner than it should. VRAM is good thing too have and as much as you can.


Not true. VRAM, same as actual system RAM, is absolutely useless until you run out of it. For the calibre of a 1060, 6 GB is enough. I don't think you are going to be playing at anything over 1080p resolution, are you?



The red spirit said:


> At twice the price


The 1060 is not twice the price of an RX 580. I'm sorry, but that is just lying. By the time the RX 580 launched you could easily find a 1060 for 250 or less. I know because I bought two: an Asus Dual and an Nvidia Founders Edition. You voted with your wallet; you didn't mind your GPU drawing twice the power for the same performance as a tradeoff for VRAM. So now you can't complain about increasing power consumption on GPUs.


----------



## InVasMani (Apr 6, 2022)

FPS GO VROOOM SLI 9000W WITH HYPE SYNC 404 TECHNOLOGY.


----------



## fevgatos (Apr 6, 2022)

ratirt said:


> So you want to tell me that if you have a GPU that is 10x faster than a 1080 Ti consuming 2000W of power that would have been a great achievement and an improvement because of performance / watt?
> Because your point of view strongly suggests that course of action.


It's not my view, it's math. Yes, a card that consumes 8 times as much power but performs 10 times as fast IS an improvement in efficiency. It will consume 20% less energy for the same workload. Meaning, I'll render a scene in 1 hour using 2,000 watt-hours, while the 1080 Ti will render it in 10 hours using 2,500 watt-hours. How is that not an improvement? LOL
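The arithmetic here trips people up because it mixes power (watts) and energy (watt-hours), so here it is written out with the same hypothetical figures as above:

```python
# Energy used = power draw (W) x time (h), in watt-hours.
# Hypothetical card: 10x the speed of a 250 W 1080 Ti, drawing 2000 W.
def energy_wh(power_w, hours):
    return power_w * hours

fast = energy_wh(2000, 1)   # same render done in 1 hour  -> 2000 Wh
slow = energy_wh(250, 10)   # the 1080 Ti takes 10 hours  -> 2500 Wh
saved = 1 - fast / slow     # fraction of energy saved by the faster card
print(saved)                # 0.2 -> 20% less energy for the same job
```

The point being made: for a fixed workload, a higher instantaneous power draw can still mean less total energy, as long as the speedup outpaces the power increase.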


----------



## The red spirit (Apr 6, 2022)

fevgatos said:


> Doesn't matter, im looking at the reference numbers.


There has never been a reference RX 580 anywhere.



fevgatos said:


> Are you suggesting there is an rx 580 that consumes 130 watts? LOL


It is task dependent, because different tasks don't use all the core's components and therefore don't require full core amperage. Power usage is watts, which are volts * amps.



fevgatos said:


> Not true. Vram, same as actual system ram is absolutely useless until you don't have enough of it. For the calibre of a 1060, 6gb is enough. I don't think you are going to be playing at anything over 1080p resolution, are you?


I'm running games at 1440p with RX 580. I'm damn sure that I can turn on slightly better textures than with your 1060 6GB. 




fevgatos said:


> 1060 is not twice the price of an rx 580. Im sorry but that is just lying. By the time the RX580 launched you could easily find a 1060 for 250 or less. I know cause I bought 2, an asus dual and an nvidia founders edition.


Cool. The Founders Edition was never available in Lithuania; no retailer here ever has Founders cards from nV, only if another brand releases a blower-style card, something like an Asus GTX 1060 Turbo (fictional example, I don't think there was a 1060 with a blower). I also never claimed that the 1060 was twice the price; I only said that it was a 300 EUR card. Maybe on a lucky day, with a lethargic cooler, it was 280 EUR, but no less than that. So wtf do you claim with "I'm sorry but that is just lying"? That you lie to yourself? Again, you don't mention the currency of "250"; if it's USD, then it's totally pointless to me, since there are import taxes, VAT, customs. And yeah, that's a nice flex, mr. moneybags, you can have your two 1060s and bugger off.


----------



## fevgatos (Apr 6, 2022)

The red spirit said:


> There hasn't been reference RX 580 anywhere.
> 
> 
> It is task dependent, because different tasks don't use up all core components and therefore don't require full core amperage. Power usage is watts, which are volt * amps.
> ...


The review has a reference 580; that's where I quoted the consumption from.

Yes, different tasks require different wattage, but that applies to both cards. So when an RX 580 consumes 130 W, the 1060 will consume 70 W. The difference will still be there.

I'm talking about euros. Why would I be flexing with 1060s, lol, I had two PCs so I bought two cards. And yes, you mentioned twice the price in your previous comment, so I assumed you were talking about the 1060. As far as I can remember the 1060 was around 280 to 300 on release, but by the time you bought the 580 (which has to be a year later, since it was actually released a year later) the prices were much lower. I bought my Asus Dual for 234€ and my FE for 250.


----------



## InVasMani (Apr 6, 2022)

On the one hand you're arguing about an RX 580 vs. a 1060 on power draw, then you mention a 1080 Ti vs. an RTX 3090 on power draw, and it's in no way, shape, or form the same comparison. It's literally a 150 W difference with the latter and, per AMD's and Nvidia's own stated TDPs, a 65 W difference with the former. So it's over double on the halo-tier cards; no real shocker there, power doesn't rise linearly with performance.


----------



## ratirt (Apr 6, 2022)

fevgatos said:


> It's not my view, it's math. Yes a card that consumes 8 times as much power but performs 10 times as much IS an improvement in efficiency. It will consume 20% less power for the same workload. Meaning, I'll render a scene in 1 hour consuming 2.000watts while the 1080ti will render it in 10 hours consuming 2.500 watts. How is that not an improvement? LOL


We all know how math works, but realistically that card would have been a total disappointment despite any math. There is a fine line that should not be crossed.
This situation correlates with bad architecture optimization and advancement. It also shows how manipulative math can be when used to prove a point that is totally out of any reasonable perspective.


----------



## fevgatos (Apr 6, 2022)

ratirt said:


> All know how math work but realistically that card would have been a total disappointment despite any math.


No, it certainly wouldn't. All 3D artists, or whoever else cares about efficiency, would instantly jump on that card. You need to realize that power consumption on its own is completely and utterly irrelevant. What determines efficiency is the work done for said consumption. It's not even just graphics cards. My AC? Yeah, I wish it consumed 50 times as much but produced 60 times more heat. That way I could run it for 1 minute instead of 1 hour and still save electricity, since it's more efficient.


----------



## InVasMani (Apr 6, 2022)

Just a reminder: this thread is about the Intel chip being a power-hungry pig. This slightly off-topic popcorn discussion is still relevant to the overarching issue of power consumption and ridiculous performance chasing at the expense of massive power inefficiency. GPUs were rightfully called out on the issue as well, but this is still about the 12900KS being the Pentium 4/Bulldozer power-hog CPU turd of 2022.


----------



## The red spirit (Apr 6, 2022)

fevgatos said:


> The review has a reference 580, that's where I quoted the consumption from.


Which does not exist. There hasn't been a single RX 580 with reference cooling sold. The reference RX 580 is more of a concept. The same goes for the whole RX 500 series.



fevgatos said:


> Yes different tasks require different wattage but that applies to both cards. So when an rx 580 will consume 130w the 1060 will consume 70. The difference will still be there.


Maybe, but since they are on different architectures, the scaling wouldn't be as linear as you say.



fevgatos said:


> Im talking about euros. Why would I be flexing with 1060's, lol, I had 2 PC so I bought 2 cards.And yes, you mentioned twice the price in your previous comment, so I assumed you were talking about the 1060. As far as I can remember the 1060 was around 280 to 300 on release, but when you bought the 580 (which has to be a year later, since it was actually released a year later) the  prices were much lower. I bought my asus dual for 234€ and my FE for 250.


Cool, but that doesn't change anything about prices here in Lithuania. Just for reference, they are selling the RX 580 new for over 800 EUR right now:








Graphics card Sapphire Radeon RX 580 Pulse PCIE 11265-05-20G, 8 GB, GDDR5 (www.1a.lt)

Still over 200 EUR for low end 1050 Ti:








Graphics card Palit GeForce GTX 1050 TI StormX NE5105T018G1F, 4 GB, GDDR5 (www.1a.lt)

The only deal there is RX 6600:








Graphics card Gigabyte Radeon RX 6600 EAGLE GV-R66EAGLE-8GD, 8 GB, GDDR6 (www.1a.lt)

So, please, don't speak of Europe as a whole when you don't know shit about its regions and regional commerce. And no, those prices are quite normal for other shops too.


----------



## ratirt (Apr 6, 2022)

fevgatos said:


> No, it certainly wouldn't. All 3d artists or whatever that care about efficiency would instantly jump on that card. You need to realize that power consumption on it's own is completely and utterly irrelevant. What determines efficiency is the work done for said consumption. It's not even just graphics cards. My AC? Yeah, I hope it consumed 50 times as much but produced 60 times more heat. That way I could open it for 1 minute instead of 1 hour and still save electricity since it's more efficient.


Too bad we don't have those kinds of cards. I wonder why? I'm sure it's not because AMD or Nvidia couldn't build one with chiplet tech.


----------



## fevgatos (Apr 6, 2022)

ratirt said:


> Too bad we dont have those kind of cards. I wonder why? I'm sure it is not because AMD or NVidia could not build one with a chiplet tech.


We have the 4090 that can supposedly hit 600 watts. The reason we don't have 2,000-watt cards is practical: such a card would be huge, wouldn't fit in any normal case, and wouldn't be easy to cool. It's definitely not because of efficiency. It's the same thing with CPUs: you cannot cool a die through an IHS at 400 watts, which is why even the 3990X stops at a 280 W TDP. If we find a way to dissipate 400+ watts off a surface as tiny as a CPU die, then there will be 400-watt CPUs.


----------



## MerrHeLL (Apr 6, 2022)

Voodoo Rufus said:


> Fast, hot, expensive. Don't need it. Want it anyway.


I'm kinda there too... I love the fastest! But my 5950X crunches through everything on 1/3 to 1/2 the electricity, depending on the app... and it makes far less than 1/4 the heat. 442 watts on a stress test! That's nuts.



fevgatos said:


> We have the 4090 that supposedly can hit 600watts. The reason we don't have 2000 watts cards is practical. It will be huge, won't fit in any normal case, and won't be easy to cool. It's definitely not because of the efficiency. It's the same thing with the CPU's, you cannot cool a die through an IHS at 400 watts, that's why even the 3990x stops at 280w tdp. If we can find a way to cool 400+ watts off of a tiny surface area like a CPU die then there will be 400 watts CPUs.


2000 watts would pop the breaker on most 15 amp household circuits. My cryptorigs tell me so.


----------



## fevgatos (Apr 6, 2022)

MerrHeLL said:


> 2000 watts would pop the breaker on most 15 amp household circuits. My cryptorigs tell me so.


Not in Europe, but yeah, I get you. As I've said, practical reasons.


----------



## MerrHeLL (Apr 6, 2022)

fevgatos said:


> Not in Europe, but yeah, I get you. As i've said, practical reasons


Standard room circuits in the US are usually only 15 amps. We have 20 and 30 amp circuits, but only if you ask for them when building, or specify/change them later. A Type F breaker at 16 amps would still trip: 2,000 watts is 16.666 amps on a 120 V line... then add in the rest of the system, the CPU, monitors... well over 20 amps.


----------



## AusWolf (Apr 6, 2022)

fevgatos said:


> No, it certainly wouldn't. All 3d artists or whatever that care about efficiency would instantly jump on that card. You need to realize that power consumption on it's own is completely and utterly irrelevant. What determines efficiency is the work done for said consumption. It's not even just graphics cards. My AC? Yeah, I hope it consumed 50 times as much but produced 60 times more heat. That way I could open it for 1 minute instead of 1 hour and still save electricity since it's more efficient.


Who said only 3D artists care about efficiency?


----------



## ratirt (Apr 7, 2022)

fevgatos said:


> We have the 4090 that supposedly can hit 600watts. The reason we don't have 2000 watts cards is practical. It will be huge, won't fit in any normal case, and won't be easy to cool. It's definitely not because of the efficiency. It's the same thing with the CPU's, you cannot cool a die through an IHS at 400 watts, that's why even the 3990x stops at 280w tdp. If we can find a way to cool 400+ watts off of a tiny surface area like a CPU die then there will be 400 watts CPUs.


We have? Where? I don't think we do as of now. Either way, just because NV is going to release one doesn't mean it should be that way. You can't blindly praise whatever companies release and say that from now on it will be like that. What they are doing now is raising power usage to achieve better performance, among other slight improvements. It seems they have stopped trying to make cards better. No huge improvements; they shrink nodes promising 10-25% better power usage, but you can't see it. That is supposed to be gen over gen. They pursue performance so badly that they don't even try to make a decent card. So what if the 4090 is faster if it uses a shit ton more power? If you crank it down to 250 W, how much faster will it really be? I hope someone will have a chance to check that out at some point.
We don't have those cards because, with such power usage, efficiency doesn't matter. If that is the case, what matters most? I guess the power the card draws, which in this case would have been atrocious. Looking at card releases gen over gen, the cards use a lot of power and you say it is fine as long as they are more efficient. It would seem there is a fine line for how much power one GPU can draw; for me it starts at 400 W, and I'm not gonna praise any card for efficiency if the power it uses goes through the roof.
Try harder, NV and AMD.


----------



## AusWolf (Apr 7, 2022)

ratirt said:


> We have? where? I don't think we do as of now. Either way, just because NV is going to release one doesn't mean it should be that way. You can't blindly go and praise whatever companies release, saying from now on it will be like that. What they are doing now is raising the power usage to achieve better performance among other slight improvements. It would seem they have stopped trying to make cards better. No huge improvements, they shrink nodes saying 10%-25% better power usage but you cant see it. That is suppose to be gen over gen. They pursuit performance so badly that they don't even try to make a decent card. So what if 4090 will be faster if it will use shit ton more power. If you crank it down to 250W how much faster it will be really? I hope someone will have a chance to check that out at some point.
> We dont have them because with such a power usage efficiency doesn't matter. If that is the case what matters most? I guess power the card will draw which in this case would have been atrocious. Looking at cards release, the new gen over gen. The cards use a lot of power and you say it is fine unless they are more efficient. It would seem there is a fine line, how much power one gpu can draw for me it starts with 400W and I'm not gonna praise any card for the efficiency if the power it uses hits the roof.
> Try harder NV and AMD


I'm more radical than that - my line (currently) is at 250 W. My Seasonic Prime Ultra Platinum 550 W is the best PSU I've ever had, and it's still well within warranty thanks to Seasonic's amazing warranty policy (10 years for Focus, 12 for Prime). It wasn't too cheap, either, so I'd rather not buy another one just because Intel, nvidia and AMD decided to go balls to the wall with performance. 200 W for the CPU, 250 W for the GPU, and 100 W for everything else, including a little safety headroom, should be enough.

I'm quite radical with PC size, too. I love mini-ITX systems, even though I have a micro-ATX one at the moment, which is the biggest size I'm happy with. Full towers with lots of unused expansion slots are a thing of the past, imo. I also don't like how they look. I know, one needs the space for airflow, but all the void inside makes the case look empty and bigger than it really should be - kind of the same way people buy SUVs for going to the supermarket once a week. My PL-unlocked i7-11700 and RTX 2070 are kind of the maximum of what I can comfortably cool in this size.


----------



## fevgatos (Apr 7, 2022)

ratirt said:


> We have? where? I don't think we do as of now. Either way, just because NV is going to release one doesn't mean it should be that way. You can't blindly go and praise whatever companies release, saying from now on it will be like that. What they are doing now is raising the power usage to achieve better performance among other slight improvements. It would seem they have stopped trying to make cards better. No huge improvements, they shrink nodes saying 10%-25% better power usage but you cant see it. That is suppose to be gen over gen. They pursuit performance so badly that they don't even try to make a decent card. So what if 4090 will be faster if it will use shit ton more power. If you crank it down to 250W how much faster it will be really? I hope someone will have a chance to check that out at some point.
> We dont have them because with such a power usage efficiency doesn't matter. If that is the case what matters most? I guess power the card will draw which in this case would have been atrocious. Looking at cards release, the new gen over gen. The cards use a lot of power and you say it is fine unless they are more efficient. It would seem there is a fine line, how much power one gpu can draw for me it starts with 400W and I'm not gonna praise any card for the efficiency if the power it uses hits the roof.
> Try harder NV and AMD


But you are just WRONG. They are not raising the power usage to achieve better performance. Of course it is all rumours at this point, but the 4090 is supposedly almost twice as fast as the 3090. So yeah, a 20% consumption increase for a 100% performance increase is an insane efficiency jump. I don't understand how you do not get this.
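The efficiency claim is just the ratio of the two rumoured numbers; a throwaway sketch, using the ~2x performance and ~1.2x power figures quoted in the post (rumours, not confirmed data):

```python
# Hypothetical gen-over-gen figures from the rumour discussed above:
# ~2x performance at ~1.2x power. Efficiency = performance / power.
def efficiency_gain(perf_ratio: float, power_ratio: float) -> float:
    """Relative change in performance per watt between two generations."""
    return perf_ratio / power_ratio

gain = efficiency_gain(perf_ratio=2.0, power_ratio=1.2)
print(f"~{(gain - 1) * 100:.0f}% better performance per watt")  # ~67%
```

Whether a ~67% perf/W gain justifies a higher absolute power ceiling is exactly the disagreement in this thread.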


----------



## AusWolf (Apr 7, 2022)

fevgatos said:


> But you are just WRONG. They are not raising the power usage to achieve better performance. Of course it is all rumours at this point, but the 4090 is supposedly almost twice as fast as the 3090. So yeah, a 20% consumption increase for a 100% performance increase is an insane efficiency jump. I don't understand how you do not get this.


1. I would rather not base anything I say on rumours. The 30-series was said to be super efficient before launch, and look how it turned out. There isn't a single card without an 8-pin power connector in the whole lineup, all the way down to the 3050, and power-to-performance ratios are just on par with similar Turing chips.

2. The post you commented on said that there should be a line drawn as to how much power any graphics card is allowed to consume. The 4090's rumoured 450 W is way above that line. A car that has 1000 HP and does 10 miles per gallon is more efficient than one that has 150 HP and does 30 mpg, but do you really want to fuel a car that only does 10 mpg? If so, enjoy, but most people don't.

Edit: To stay on topic, Alder Lake is said to be super efficient too, but if that efficiency comes at 200+ Watts, I couldn't care less.


----------



## Warigator (Apr 7, 2022)

Solid State Brain said:


> 100% improvement after one generation is never going to happen (or: never again, if you were referring to some cases from the late 1990s-early 2000s)


RDNA 3 is going to be 100% faster in raster than RDNA 2, and even more in ray tracing (albeit at a higher TDP). After we get out of the current stagnation, higher growth will come back. We are currently in a temporary trough caused by still using 2D silicon electronic circuits. Once we move to 3D photonic circuits made from different materials (around 2030), like graphene or molybdenum disulfide, we will once again see greater gains in efficiency. Gamers should also voice their opinion that the current slow growth is unacceptable. Low-gigahertz CPUs and low-teraflops GPUs are extremely slow; we are only at the beginning of the evolution of computing, and great things lie ahead. We've seen nothing yet. The i9-12900KS is just a bad joke; it's no better than a normal 12900K, and no one should buy one.


----------



## ratirt (Apr 7, 2022)

AusWolf said:


> I'm more radical than that - my line (currently) is at 250 W. My Seasonic Prime Ultra Platinum 550 W is the best PSU I've ever had and it's still well within warranty thanks to Seasonic's amazing warranty policy (10 years for Focus, 12 for Prime). It wasn't too cheap, either, so I'd rather not buy another one just because Intel, nvidia and AMD decided to go balls to the walls with performance. 200 W for CPU, 250 W for GPU and 100 W for everything else, including a little safety headroom should be enough.
> 
> I'm quite radical with PC size, too. I love mini-ITX systems, even though I have a micro-ATX one at the moment, which is the biggest size I'm happy with. Full towers with lots of unused expansion slots are a thing of the past, imo. I also don't like how they look. I know, one needs the space for airflow, but all the void inside makes the case look empty and bigger than it really should be - kind of the same way people buy SUVs for going to the supermarket once a week. My PL-unlocked i7-11700 and RTX 2070 are kind of the maximum of what I can comfortably cool in this size.


Well, that's a matter of perspective. 250 W vs 400 W is a difference, sure, but some people suggest going to 600 W or higher (which, if we keep this course, will happen) is fine, and saying "you don't have to buy it" is just plain stupid. Saying you can downclock it and save power is even more ridiculous in my opinion, since you pay for the performance of the card, so you'd pay a lot to end up with an obviously slower 250 W card. Not good.
I have a 6900 XT, and you may argue about the power it uses, but the fact is it doesn't use a lot. I really never go above 260 W. It's not OC'd, but it doesn't have to be either.


fevgatos said:


> But you are just WRONG. They are not raising the power usage to achieve better performance. Of course it is all rumours at this point, but the 4090 is supposedly almost twice as fast as the 3090. So yeah, a 20% consumption increase for a 100% performance increase is an insane efficiency jump. I don't understand how you do not get this.


They are raising the power usage. You said it yourself: the 4090 will use 600 watts. How is that not raising power consumption? It will be faster, no doubt, but the power for the card is almost three times what it used to be.
Yes, it's almost twice as fast as the 3090, but each of those cards had its power draw go up. Every generation, the power draw goes up. Are you blind or something?
1080 Ti, card-only power draw: 231 W
2080 Ti, card-only power draw: 273 W
3080 Ti, card-only power draw: 356 W
All gaming-only figures.
Don't even try to tell me "it's just the way it is" or "it performs better", because that's simply bullshit. Power goes up regardless of how fast the card is, and now you'll get a 4090 drawing 600 W and a 4080 Ti drawing 550 W. That's terrible no matter the performance. The power goes up for all the cards, and that's illustrated clearly by TPU's reviews: each gen, power draw for the same card segment goes up.
A 600 W 4090 with twice the performance of a 350 W 3090 is still a lot of power for both cards. NV has been upping the power its cards need every generation. Obviously the 3090 Ti uses even more, so they'll compare the 4090 to the 3090 Ti with its 450 W draw and it won't seem so bad. But if you look across generations of cards, it sucks badly.
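Taking the card-only gaming figures quoted above at face value, the generation-over-generation increase works out like this (a throwaway sketch; the wattages are the TPU numbers as cited in the post):

```python
# Card-only gaming power draw as quoted above (from TPU reviews).
ti_power = [("1080 Ti", 231), ("2080 Ti", 273), ("3080 Ti", 356)]

# Percentage increase from each generation to the next.
for (prev_name, prev_w), (name, w) in zip(ti_power, ti_power[1:]):
    print(f"{prev_name} -> {name}: +{(w / prev_w - 1) * 100:.0f}%")
```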



AusWolf said:


> Edit: To stay on topic, Alder Lake is said to be super efficient too, but if that efficiency comes at 200+ Watts, I couldn't care less.


Some people are just blind to this. You can't boost the power over and over just because the efficiency is there. That's a dead end, and it means the companies are slacking on delivering a well-rounded product.


----------



## Mussels (Apr 7, 2022)

Is it ironic that the best big.LITTLE approach would be a quad-core Intel with an 8-core Ryzen?


----------



## phanbuey (Apr 7, 2022)

Mussels said:


> Is it ironic that the best big.LITTLE approach would be a quad core intel with an 8 core ryzen?


Disaggregation at its finest.


----------



## THU31 (Apr 7, 2022)

Perfect CPU to pair with a 3090 Ti...

...for the next 6 months, when you will have to swap both components because you always need to have the best of the best.


----------



## chrcoluk (Apr 8, 2022)

btk2k2 said:


> On the gaming testing front is there any chance that some none FPS metrics could be incorporated next time the suite gets an overhaul? Stuff like AI turn time for civ 6 (or 7 if released), late game tic rates (or an abstraction) for the Paradox Grand Strategy games like CK3, Stellaris, EU4. Late game city builder tic rates and so on.
> 
> While the 720p results give an indication finding a way to actually benchmark those games where the CPU is far more important for the gameplay experience than the GPU would be a great addition to the CPU test suite.
> 
> ...



Agree on turn times. Lag (and frequency) of shader and texture streaming also needs to be measured. Granted, it's a headache to implement, but times are moving. I think TPU should be a leader on that front.


----------



## InVasMani (Apr 8, 2022)

fevgatos said:


> But you are just WRONG. They are not raising the power usage to achieve better performance. Of course it is all rumours at this point, but the 4090 is supposedly almost twice as fast as the 3090. So yeah, a 20% consumption increase for a 100% performance increase is an insane efficiency jump. I don't understand how you do not get this.


CPU power consumption will also go up if the GPU workload increases, and heat output will rise as well. It is a big jump in efficiency in terms of performance per watt, but it is also moving the goalposts on power draw at the same time. It's justified in some instances, but on the whole the goalposts shouldn't be moving, or at least not in that direction.


----------



## Warigator (Apr 8, 2022)

PC power consumption really needs to stop growing at some point, or we will have multi-kilowatt PCs in the future, which is absurd. 65 watts for a CPU and 150 watts for a GPU seems good to me. There's not much difference in game framerate between a 65-watt 5600 or 5600X and a 300-watt 12900K or 12900KS anyway.


----------



## THU31 (Apr 8, 2022)

Tadasuke said:


> PC power consumption really needs to stop at some point or we will have few kilowatts PCs in the future, which is absurd. 65 watts for a CPU and 150 watts for a GPU seems good to me. There's not much difference in game framerate between a 65 watt 5600 or 5600X and 300 watts 12900K or 12900KS anyway.



Games do not fully utilize that many cores, because they do not need to, so you can use a lower power CPU without issues.

But with GPUs, they will utilize all you can throw at them, without limits. The difference between a 100 W and a 300 W GPU is gigantic.

Those 600 W graphics cards will be pushed to the extreme, compromising efficiency. Those cards are for people who used to have multi-GPU setups. 3-way SLI was a thing, and Quad CrossFire I think.

Multi-GPU is dead now, so enthusiasts might want something extremely powerful. You will still get great products in the 150-300 W range, and if you undervolt those, you will get exceptional efficiency, just like you can get with a 3080 for example (mine consumes 200-250 W while losing about 10% performance vs. stock 340 W).
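The undervolting trade-off mentioned here can be put into numbers; a rough sketch using the figures in the post (340 W stock, ~10% performance lost, and my assumed 225 W midpoint of the quoted 200-250 W range):

```python
# Rough perf/W comparison for an undervolted 3080, using the figures above.
stock_perf, stock_power_w = 1.00, 340   # stock = baseline performance
uv_perf, uv_power_w = 0.90, 225         # ~10% slower, ~225 W (assumed midpoint)

gain = (uv_perf / uv_power_w) / (stock_perf / stock_power_w)
print(f"Undervolted perf/W is about {gain:.2f}x stock")
```

In other words, giving up a tenth of the performance buys roughly a third more efficiency under these assumptions.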


----------



## fevgatos (Apr 8, 2022)

Tadasuke said:


> PC power consumption really needs to stop at some point or we will have few kilowatts PCs in the future, which is absurd. 65 watts for a CPU and 150 watts for a GPU seems good to me. There's not much difference in game framerate between a 65 watt 5600 or 5600X and 300 watts 12900K or 12900KS anyway.


How many times do we need to repeat that the 12900 doesn't consume 300 watts during gaming? Actually, it's a very efficient gaming CPU, way more efficient than most Zen 3s.


----------



## Deleted member 24505 (Apr 8, 2022)

fevgatos said:


> How many times do we need to repeat that the 12900 doesn't consume 300 watts during gaming? Actually, it's a very efficient gaming CPU, way more efficient than most zen 3s.



As soon as they see Intel, some people's brains turn off.


----------



## onemanhitsquad (Apr 8, 2022)

...might have to marry one up with a Z690 and some DDR5


----------



## ThrashZone (Apr 8, 2022)

Hi,
As many times as it takes.

Some people can't even understand the difference between *Intel system BIOS defaults* versus *AMD system BIOS defaults* when comparing power usage.
They just keep insisting the Intel chip is more efficient after they've altered BIOS settings. Wow, you think?


----------



## fevgatos (Apr 9, 2022)

ThrashZone said:


> Hi,
> As many times as it takes
> 
> Some people can't even understand the term *Intel system bios default* verses *amd system bios default *when comparing power usage
> They just keep insisting the intel chip is more efficient after they alter bios settings wow you think


No BIOS alteration is needed to make Alder Lake more efficient in gaming.

I can't fathom that there are actually people who think it consumes 300 watts while gaming. Lol


----------



## Voodoo Rufus (Apr 9, 2022)

If you're gaming or idling around the web, does power consumption really matter all that much? Not everyone is running Blender or whatever 24/7.


----------



## fevgatos (Apr 9, 2022)

Voodoo Rufus said:


> If you're gaming or idling around the web, does power consumption really matter all that much? Not everyone is running Blender or whatever 24/7.


In light loads, again, Alder Lake is way more efficient. I see 7 watts at idle and around 15 to 20 when browsing.


----------



## InVasMani (Apr 9, 2022)

Stock idle is 63 W for the 12900KS vs 54 W for the 5950X according to TPU, so the stock 12900KS actually idles 9 watts higher than a 5950X. The 5950X is also a chip geared towards multi-threaded performance, and in that area it pummels the 12900KS on efficiency. Single-threaded performance isn't as critical as it used to be, and multi-threaded performance has been getting better and better on both the software and hardware side. In fact, if you were to use a single 12900KS or 5950X to replace two side-by-side desktop PCs, it's pretty obvious which would be the better pick overall.


----------



## AusWolf (Apr 9, 2022)

InVasMani said:


> Stock 12900KS 63W vs 5950X 54W according to TPU so in actuality12900KS stock is 9 watts higher at idle than a 5950X. It's also a chip geared towards multi-thread performance and in that area on efficiency it pummels the 12900KS. Single thread performance isn't as critical as it use to be and multi-thread performance has been getting better and better both on the software and hardware side. In fact if you were to use a single 12900KS or 5950X to replace 2 side by side desktop PC's it's pretty readily obvious which would be a better pick overall.


To me, the real difference between Intel and AMD isn't just performance or efficiency, but heat transfer. If you have a standard ATX system and go balls to the wall with cooling, it doesn't really matter which side you pick, as both are awesome for different reasons. On the other hand, if you go small form factor with a compact micro-ATX or even mini-ITX system, and your cooling options are limited, Intel CPUs are just way easier to cool, even if you give them the same power limit as you would your AMD one. Reviews don't tend to look at things from this angle, but I know from experience, as I like buying lots of hardware just for the fun of it.


----------



## fevgatos (Apr 9, 2022)

AusWolf said:


> To me, the real difference between Intel and AMD isn't just performance or efficiency, but heat transfer. If you have a standard ATX system and go balls to the walls with cooling, it doesn't really matter which side you pick, as both are awesome for different reasons. On the other hand, if you go small form factor with a compact micro-ATX or even mini-ITX system, and your cooling options are limited, Intel CPUs are just way easier to cool, even if you give them the same power limit as you would your AMD one. Reviews don't tend to look at things from this way, but I know from experience, as I like buying lots of hardware just for the fun of it.


Yeah, Intel CPUs have been easier to cool, and TPU's cooler reviews tend to show it. If you go to their extreme tests, you can see that even cheap small tower coolers can keep a 10900K from throttling, while the 3900X requires the more expensive stuff.


----------



## progste (Apr 9, 2022)

fevgatos said:


> Yeah, Intels have been easier to cool and if you check TPU's cooler reviews it tends to show. If you go to their extreme tests you can see that even cheap small tower coolers can keep a 10900k from throttling while the 3900x requires the more expensive stuff.


But the 5950x is much easier to cool than the 12900k.


----------



## AusWolf (Apr 9, 2022)

progste said:


> But the 5950x is much easier to cool than the 12900k.


That's only because of the power budget. Intel decided to set the power limit to 241 W with Alder Lake, which is crazy. If you slap the same cooler onto both and limit power to, let's say, 125 W, the Intel chip will run cooler.

Another interesting thing is that AMD chips with fewer cores per CCX run hotter, as the same power is consumed by fewer cores (that is, a smaller, more concentrated die space). So a 5600X is hotter than a 5800X and a 5900X is hotter than a 5950X when set to the same power limit.
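The "same power over fewer cores" point is simple division; an illustrative sketch (125 W is the example power limit from the post, and the 6- vs 8-core counts are the familiar per-CCD configurations of the 5600X and 5800X):

```python
# Same package power concentrated in fewer cores means more watts per core,
# and thus higher local heat density on the die. Illustrative numbers only.
def watts_per_core(package_w: float, active_cores: int) -> float:
    return package_w / active_cores

print(f"6-core CCD: {watts_per_core(125, 6):.1f} W/core")  # ~20.8
print(f"8-core CCD: {watts_per_core(125, 8):.1f} W/core")  # ~15.6
```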


----------



## progste (Apr 9, 2022)

AusWolf said:


> That's only because of the power budget. Intel decided to limit power to 241 W with Alder Lake, which is crazy. If you slap the same cooler onto both, and limit power to let's say, 125 W, the Intel chip will run cooler.


So what? If you limit the AMD CPU that one will also run cooler.


----------



## ThrashZone (Apr 9, 2022)

progste said:


> So what? If you limit the AMD CPU that one will also run cooler.


Hi,
Yeah, sort of a waste of time repeating this.

All I have is Intel systems, and you'd always hit a thermal wall and have to limit voltage.

But this efficiency nonsense has gone to the outer limits with these guys.


----------



## Deleted member 24505 (Apr 9, 2022)

I have a 12700K and run it stock (it's plenty quick enough); all I have done is set PL1 and PL2 to 195 W.


----------



## fevgatos (Apr 9, 2022)

progste said:


> But the 5950x is much easier to cool than the 12900k.


Not true. I'm pretty sure you can't cool a 5950X at 200 watts with a U12A. My 12900K stays at 76 °C at 220 watts.



progste said:


> So what? If you limit the AMD CPU that one will also run cooler.


What part of "if you limit both to the same power limit" didn't you understand?


----------



## InVasMani (Apr 9, 2022)

Stock load temperatures of 57 °C for the 5950X vs 92 °C for the 12900KS are pretty lopsided. So it idles at higher wattage and runs hotter at load at the same time, and overclocked, things don't get better, as these chips are already pushed beyond where they should be for the architecture design. Hell, the 5950X runs cooler than the 12600K, idles at lower power, and its multi-threaded load and stress-test figures are also lower according to TPU. The 12900KS is an abomination in this context relative to the 5950X.

The 5800X is looking pretty solid as well, though it runs hotter than the 5950X; its energy consumption is more balanced for mixed workloads. I think the 5950X might have been better overall with 100-200 MHz less boost and 100-200 MHz higher base frequency; looking at TPU's charts, that trade-off would have been worth it.


----------



## Mussels (Apr 11, 2022)

Voodoo Rufus said:


> If you're gaming or idling around the web, does power consumption really matter all that much? Not everyone is running Blender or whatever 24/7.


It certainly matters less, yes.
Thing is... not all of us are short-sighted.

I'm in many, many Facebook groups for PC gamers. The last 2-3 years have been filled with quad-core users bitching about 100% CPU usage in modern games, holding back their GPUs and overloading previously 'good enough' cooling.


I'll simplify my answers to a few users here into one single answer/question:

You may be happy with how efficiently and cool it runs for your tasks now, but if those tasks change and you're hitting those high power draws all the time, are you still going to be happy with your purchase?
Are you going to be forced to re-design your system for better cooling, lower noise, etc.?


----------



## Deleted member 24505 (Apr 11, 2022)

Mussels said:


> It certainly matters less, yes.
> Thing is... not all of us are short sighted.
> 
> I'm in many, many facebook groups for PC gamers. The last 2-3 years have been filled with quad core users bitching of 100% CPU usage in modern games, holding back their GPU's and overloading previously 'good enough' cooling
> ...



I suppose that depends on your cooling. I probably wouldn't have to, but some on air may well have to.


----------



## fevgatos (Apr 11, 2022)

Tigger said:


> I suppose that depends on your cooling. I probably wouldn't have to, but some on air may well have too.


Nah, I'm on a small single-tower air cooler. It can handle CB R23 at 65 °C, so I'm sure it can handle any game with the 12900K.


----------



## Deleted member 24505 (Apr 11, 2022)

fevgatos said:


> Nah, im on a small single tower air cooler. It can handle cbr23 at 65c so I'm sure it can handle any game with the 12900k



I guess some would consider my cooling a bit OTT, but it's part looks, part performance.


----------



## Undertoker (Apr 22, 2022)

I have both a 12900K and a 12900KS in two Asus boards: the K in an Asus Hero and the KS in an Extreme board.
The K variant has an SP number of 82; the KS has an SP number of 92, with 100 on the P-cores. It's clearly a better-binned sample.
When I first plugged the KS into the Extreme board running the 702 BIOS, it read an SP of 200(!), then 96 on the first boot; then, after a BIOS update to 1403, it dropped to 92. Who knows why...
But it does run really well and peaks at 5.7 GHz regularly, usually sitting between 5.3 and 5.5 GHz constantly, on an AIO!
The KS will also do 5.3 GHz on all 8 P-cores stable, and the strangest thing is that it never goes over 80 °C. I'd heard these KS CPUs are supposed to run really hot, and I'm only using an NZXT Z63 280 mm AIO, yet it just doesn't get hot at all. In fact, I assumed I'd have to buy a better cooling solution, but the Z63 copes really comfortably. I'm amazed, and I've benched it hard as well.
As for seeing a huge difference with a KS over a K variant: clearly you won't. It's a small increase, and this is simply a halo product, nothing more; it's a "nice to have". You're not going to notice any FPS increase gaming, though you will notice it pulls a few more watts.
But if you like playing around with overclocking, as I do, then the KS is certainly worth buying, simply because it's a far better-binned CPU to do it with.
If money is the issue for you, go with the K variant. You may get lucky and get a decent sample in the silicon lottery, but I'm guessing most of the best-binned chips go to the KS now.
Overall I'm happy with it, but I still have my old 9900KS as well, and having a wonderful example of the latest and best is a nice thing to have.
I also run a 3090 Strix GPU, and I can say I wasn't at all tempted by the 3090 Ti the way I was by the KS over the K.
For me, the issue with the 3090 Ti is the insane power draw. I can tell you that after several hours of gaming I often end up sitting in my pants as the room gets so hot, so there is no way I'd ever go for the 3090 Ti. Unless they vastly improve the power draw, I genuinely believe my next build will be an AMD build.

With European governments frankly taking the total piss with the price of electricity as well, this has to be a consideration for all of us, and Nvidia needs to think about it very carefully moving forward, or these draconian governments will legislate and do what they do best: ruin it for everyone.

As with any halo product, there is a premium to pay, and if 150 USD is an issue for you, clearly don't buy one. But don't whine and get salty because others choose to do so.
It's the very best CPU there is currently, and like all top-end kit, you pay for that halo badge.
This is clearly a CPU made for those who want the very best. It isn't "value", and if you're whining about a day's wages to pay for a CPU you'll use for the next few years, then the 12600K, 12700K, or an AMD 5800X3D is the one for you, not the 12900KS. So leave it to us enthusiasts.

That said, I do love seeing the salty, mocking, and clearly bitter comments about it; they make me chuckle. I don't understand how a CPU can upset you that much. Just don't buy it and move on.


----------



## THU31 (Apr 22, 2022)

It is interesting that the power draw of the 3090 Ti bothers you, but that of the 12900KS does not. And it consumes around three times more power than the 5800X3D, even in gaming. And I honestly doubt you will notice the difference in performance without a framerate counter.

But if my day's wages could buy me this CPU, I doubt I would ever complain about anything.


----------



## John Naylor (Jul 5, 2022)

Would love to see an extra performance-per-dollar comparison in a "typical system", and by that I mean a system where all the components are an appropriate match for someone investing $700+ in the CPU. Perhaps 3-4 tiers in various price categories, where each CPU grouping would have an appropriate tier: maybe $500+ / $300-400 / $200-300 / under $200.


----------



## progste (Jul 5, 2022)

John Naylor said:


> Would love to see an extra performance per dollar comparison in a "typical system" .. and by that I mean, a system where all the components are an appropriate match for someone investing $700+ in the CPU....perhaps 3-4 tiers in various price categories where each CPU grouping would have an appropriate tier category... maybe $500+ / $300 - $400 /  $200 - $300 / < $200


What would "appropriate" mean, though? In some configurations you can get away with being cheap on most components except the CPU and GPU (in the example of a gaming PC). The only meaningful thing might be taking power consumption into account, plus a motherboard/PSU able to handle it.


----------



## MerrHeLL (Jul 5, 2022)

The biggest reason I prefer the Ryzens is the actual power use in real-world conditions. A 5950X during most gaming loads uses more like 100-120 watts; easy games like LoL are closer to 80-100 watts. It only uses a total of 194 watts with PBO enabled and 100% load on all 16 cores and 32 threads, in something like Handbrake.

I do want to try an i9-12900KS and probably will, but the 12900K/KS average power use for ordinary and gaming usage is much higher than the Ryzen's. Gaming on the 12900K/KS keeps the CPU drawing over 200 watts for long periods of time. I'm hoping another die shrink will help Intel on power draw.


----------



## fevgatos (Jul 5, 2022)

MerrHeLL said:


> the biggest reason I prefer the Ryzens is the actual power use in real world conditions. A 5950X during most gaming loads uses more like 100-120 watts. Easy games like LOL its closer to 80-100 watts. It only uses a total of 194 watts with PBO enabled and 100% load on all 16 cores and 32 threads. Something like Handbrake.
> 
> I do want to try an i9 12900 KS and probably will, but the 12900 K/KS average power use for ordinary usage and gaming usage is much higher than the Ryzen. Gaming on the 12900 K/KS keeps the cpu drawing over 200 watts for long periods of time. I'm hoping another die shrink will help intel on power draw.


Totally not true. The 12900k sips power in gaming compared to the 5950x. Igor's lab and other sites have reviews about this, the 12900k is way way more efficient than the 5950x in lightly threaded tasks like games.


----------



## Deleted member 24505 (Jul 5, 2022)

Here is my 12700K gaming at less than 50 watts, at 1440p


----------



## MerrHeLL (Jul 6, 2022)

Tigger said:


> Here is my 12700k gaming, less than 50 watts, at 1440p
> View attachment 253782



That helps a lot. Thanks. I found a comparison from Igor that seems to explain. https://www.igorslab.de/en/intel-co...ng-in-really-fast-and-really-frugal-part-1/9/



Tigger said:


> Here is my 12700k gaming, less than 50 watts, at 1440p
> View attachment 253782



This helped too 




https://www.reddit.com/r/intel/comments/rn3i1f


----------



## Mussels (Jul 6, 2022)

Tigger said:


> Here is my 12700k gaming, less than 50 watts, at 1440p
> View attachment 253782


Use HWiNFO64 and show the min/max/average for a game session instead of a single peak GPU-limited value.

You do realise your CPU would use a lot more power with a more powerful GPU, or at lower settings?


Here's mine over 2 hours of gaming (V Rising, which for some reason is absurdly demanding on hardware):


----------



## MerrHeLL (Jul 6, 2022)

Mussels said:


> Us HWinfo64 and show the min/max/average for a game session instead of a single peak GPU limited value
> 
> You do realise your CPU would use a lot more power with a more powerful GPU, or lower settings?
> 
> ...


----------



## AusWolf (Jul 6, 2022)

MerrHeLL said:


> the biggest reason I prefer the Ryzens is the actual power use in real world conditions. A 5950X during most gaming loads uses more like 100-120 watts. Easy games like LOL its closer to 80-100 watts. It only uses a total of 194 watts with PBO enabled and 100% load on all 16 cores and 32 threads. Something like Handbrake.
> 
> I do want to try an i9 12900 KS and probably will, but the 12900 K/KS average power use for ordinary usage and gaming usage is much higher than the Ryzen. Gaming on the 12900 K/KS keeps the cpu drawing over 200 watts for long periods of time. I'm hoping another die shrink will help intel on power draw.


I can only join the others in saying that Intel CPUs are extremely efficient in lightly threaded workloads. My 11700 only needs a maximum of 50-60 W in games (not even that in older ones). It's multi-threaded work, like rendering, where AMD CPUs really show their teeth in the efficiency department.


----------



## ratirt (Jul 6, 2022)

Mussels said:


> Us HWinfo64 and show the min/max/average for a game session instead of a single peak GPU limited value
> 
> You do realise your CPU would use a lot more power with a more powerful GPU, or lower settings?
> 
> ...


Exactly that.
The problem I see with some people here when they try to show off a CPU:
when a power-consumption showdown is involved, they use 4K and a 60 FPS cap, and it draws around 50 watts;
when they show performance, they use 720p with medium settings.
It's kinda silly sometimes, reading the comments and posts, seeing how people blindly try to twist the truth no matter what.


----------



## THU31 (Jul 6, 2022)

The 12900KS is ridiculous. Even though games do not use all the cores, the CPU will still clock super high with crazy-high voltages. This is where the power consumption comes from.

You can lower the power limit by 50% and only lose about 10-15% performance. Then again, if all you do is gaming and not productivity, an i9 is a complete waste of money whether you limit the power or not.


----------



## fevgatos (Jul 6, 2022)

ratirt said:


> Exactly that.
> The problem I see with some people here when they try to show off a CPU.
> When power consumption showdown is involved they use 4K and 60FPS cap and it does 50Watts or around
> When they show performance they use 720p with medium settings.
> It is kinda silly sometimes reading the comments and posts and how people blindly try to twist the truth no matter what.


I'm playing Warzone at 5120x1440 with mixed settings (mostly medium) and DLSS, alternating between CPU- and GPU-bound at 220 to 300 FPS, and the 12900K never exceeds 100 W.

Of course, there are games where it gets up to 150 W at ultra-low settings (like Cyberpunk), but the point remains: Intel is more efficient in gaming than all of Zen except the 3D.

It's a pretty well-known fact that in lightly threaded workloads and games, Intel is both way faster and way more efficient.


----------



## Deleted member 24505 (Jul 6, 2022)

fevgatos said:


> Im playing warzone at 5120x1440 with mixed settings (mostly medium) and dlss, alternating between cpu and gpu bound at 220 to 300 fps and the 12900k never exceeds 100watts.
> 
> Of course there are games that it gets up to 150 at ultra low settings (like cyberpunk) but the point remains, intel is more efficient than all zen except the 3d in gaming.
> 
> Its a pretty well known fact that in lightly threaded workloads and games intel is both way faster and way more efficient



Pointless arguing with the AMD heads; they will constantly just try to prove you wrong. My 12700K hardly uses anything while gaming, I know that, whether it's over 2 hours or 5 minutes. It might spike up sometimes, but I would expect all CPUs to do that.

We own Intel ADL CPUs; we are flogging a dead horse on TPU trying to defend them. Sometimes I think maybe I should just switch to AM4/5800X and join in the pitchforking.


----------



## progste (Jul 6, 2022)

Tigger said:


> Pointless arguing with the AMD heads, they will constantly just try to prove you wrong. My 12700k hardly uses anything gaming i know that, whether it is over 2 hours or 5 mins. It might spike up sometimes but i would expect all CPUs to do that.
> 
> We own Intel ADL CPU's we are flogging a dead horse on TPU trying to defend them. Sometimes I think maybe i should just switch to AM4/5800x and join in the pitch forking.


Don't lie, man. I've been around these forums for some years and I've always seen you argue like this; we know you love it way too much.


----------



## Deleted member 24505 (Jul 6, 2022)

progste said:


> Don't lie man, I've been around these forums for some years and I've alway seen you argue like this, we know you love it way too much.



Not any more. I give up; this place is definitely AMD-biased now.


----------



## AusWolf (Jul 6, 2022)

Tigger said:


> Pointless arguing with the AMD heads, they will constantly just try to prove you wrong. My 12700k hardly uses anything gaming i know that, whether it is over 2 hours or 5 mins. It might spike up sometimes but i would expect all CPUs to do that.
> 
> We own Intel ADL CPU's we are flogging a dead horse on TPU trying to defend them. Sometimes I think maybe i should just switch to AM4/5800x and join in the pitch forking.


I don't even understand reviews anymore, to be honest.

First, they test performance in games. Second, they test power consumption in rendering. Then they draw the conclusion that Intel CPUs are inefficient and run hot. I mean... WHAT?  

Edit: My point is: what can we expect from regular users and forum visitors when even highly respected (Youtube) reviews are turning more and more towards sensationalism, clickbait and show?


----------



## fevgatos (Jul 6, 2022)

Tigger said:


> Pointless arguing with the AMD heads, they will constantly just try to prove you wrong. My 12700k hardly uses anything gaming i know that, whether it is over 2 hours or 5 mins. It might spike up sometimes but i would expect all CPUs to do that.
> 
> We own Intel ADL CPU's we are flogging a dead horse on TPU trying to defend them. Sometimes I think maybe i should just switch to AM4/5800x and join in the pitch forking.


AMD can do no wrong. I still remember the bashing the 7700K received (I was bashing it as well). Yet fast forward to 2022: we have the 5800X3D, which is basically a worse (and more expensive) version of the 7700K, and people are going nuts over it.


----------



## ratirt (Jul 6, 2022)

fevgatos said:


> Amd can do no wrong. I still remember the bashing the 7700k received ( i was bashing it as well). Yet fast forward to 2022, we have the 5800x 3d, which is basically a worse (and more expensive) version of the 7700k, and people are going nuts over it.


Are you seriously saying that the 5800X3D is like the 7700K? May I ask on what grounds you came to that conclusion?


----------



## fevgatos (Jul 6, 2022)

ratirt said:


> Are you serious saying that 5800x3D is like 7700K? May I ask on what grounds you came up with that conclusion?


Nah, I'm not saying that. I'm saying the 7700K was better back in 2017 than the 3D is in 2022. I don't even think that's a controversial statement. They are both way, way slower in the vast majority of applications, not only compared to price competitors but also compared to way cheaper CPUs. They are both basically good at just games, and both were released on basically dead-end platforms.

But what makes the 7700K better is the fact that it was much cheaper, unlocked, had an iGPU, and at least beat the R7 1700 in lightly threaded / single-threaded / latency-dependent applications, not just games. The 3D gets embarrassed by much cheaper offerings like the 12600K and the 12700F, or even AMD's own 5900X.


----------



## Mussels (Jul 6, 2022)

Tigger said:


> Pointless arguing with the AMD heads, they will constantly just try to prove you wrong. My 12700k hardly uses anything gaming i know that, whether it is over 2 hours or 5 mins. It might spike up sometimes but i would expect all CPUs to do that.
> 
> We own Intel ADL CPU's we are flogging a dead horse on TPU trying to defend them. Sometimes I think maybe i should just switch to AM4/5800x and join in the pitch forking.


You have a flawed test setup.

You've got your GPU pegged at 100%, and claim that your CPU is nice and cold and amazing because of it.

Except that with anything more powerful than your five-year-old GPU, or in fact just lower game settings, that CPU usage, power consumption and heat will skyrocket.


----------



## fevgatos (Jul 6, 2022)

Mussels said:


> You have a flawed test setup.
> 
> You've got your GPU pegged at 100%, and claim that your CPU is nice and cold and that your CPU is amazing because of it.
> 
> Except that with anything more powerful than your 5 year old GPU, or in fact just lowering your game settings... that CPU usage, power consumption and heat will skyrocket.


But we already know that Alder Lake is more efficient in games than all of Zen 3 except the 3D, right? I mean, I have a 3090 and a 12900K; we can test it. I'm absolutely positive a 5950X will get absolutely embarrassed when it comes to gaming efficiency.

Also, a normal gaming scenario is with the GPU pegged at 100%; otherwise you have a CPU bottleneck...


----------



## Mussels (Jul 6, 2022)

fevgatos said:


> But we already know that alderlake are more efficient in games than all zen 3 except the 3d,right? I mean i have a 3090 and a 12900k, we can test it, im absolutely positive a 5950x will get absolutely embarrassed when it comes to gaming efficiency
> 
> Also a normal gaming scenario is with the gpu pegged at 100%, else you are having a cpu bottleneck...


I'm not going to trust anyone's claims without proper evidence backing them up, because of posts like we've just had here with an absolutely flawed and biased setup.

Or can I just swap my 3090 for my 750 Ti and claim that my CPU is more power-efficient too?


----------



## fevgatos (Jul 6, 2022)

Mussels said:


> I'm not going to trust anyones claims without proper evidence backing it up, because of posts like we've just had here with an absolutely flawed and biased setup.
> 
> Or can I just swap my 3090 for my 750ti and claim that my CPU is more power efficient too?


But we have der8auer and Igor'sLAB testing gaming efficiency; the 12900K is up to 50% more efficient than the 5950X in games. You think they are biased as well? And that's nothing new: the same applied to Comet Lake, which was also pretty efficient at gaming. The only Intel CPUs that were worse than Zen 3 in gaming were Rocket Lake. I even had one; they were pretty atrocious.


----------



## Deleted member 24505 (Jul 6, 2022)

fevgatos said:


> But we have derbauer and igorslab testing for gaming efficiency, the 12900k is up to 50% more efficient than the 5950x in games. You think they are biased as well? And that's not anything new, the same applied to comet lake, they also were pretty efficient at gaming. The only intel cpus that were worse than zen 3 in gaming were rocketlake. I even had one, they were pretty atrocious



You're wasting your time. Might as well talk to a potato.

At 1440p with a much better GPU, the CPU would still not need to be so taxed, as surely the GPU is doing all the work. So the CPU would still use little power.


----------



## ratirt (Jul 6, 2022)

fevgatos said:


> Nah, I'm not saying that. Im saying the 7700k was better back in 2017 than the 3d is in 2022. I dont even think that it is a controversial statement. They are both way way slower on the vast majority of applications not only compared to price competitors but also compared to way cheaper cpus. They are both good basically at just games and released on basically dead end platforms


Actually, you did say that explicitly, but anyway. The 5800X3D is slower in some apps, but it's supposed to be a gaming CPU, and if you consider that its main goal, it's a very good CPU, and its app performance doesn't suck; it's only slightly slower than a 5800X. That's still decent performance. The 7700K was overwhelmed by the changing landscape, with more cores and higher utilization of resources.


fevgatos said:


> But what makes the 7700k better is the fact that it was much cheaper, unlocked, had an igpu and was at least beating the r7 1700 in lightly threaded /single threaded / latency dependant applications, not just games. The 3d gets embarrassed by much cheaper offerings like the 12600k and the 12700f or even amds own 5900x


"Unlocked" means nothing here, since we are talking about AMD CPUs, which are different; "unlocked" literally means nothing in today's world of boosting frequencies. On pricing, you have to remember what happened not so long ago with Covid and the price hikes. Times were different back then, and I'm sure you realize that; that's how quickly things have developed. Embarrassment? I literally don't know what you are talking about; 5900X owners can elaborate on how embarrassed they are with their choice. What you're saying is laughable: Intel released three generations of CPUs to stay competitive, and yet people choosing AMD's 5000 series should be embarrassed?


----------



## Dr. Dro (Jul 6, 2022)

Mussels said:


> You have a flawed test setup.
> 
> You've got your GPU pegged at 100%, and claim that your CPU is nice and cold and that your CPU is amazing because of it.
> 
> Except that with anything more powerful than your 5 year old GPU, or in fact just lowering your game settings... that CPU usage, power consumption and heat will skyrocket.



I'll agree with you. The 1080 Ti has seen better days, but if you dare bring that up... 

Just Pascal not supporting multiplane overlays correctly (or rather, at all) is enough for me not to want one regardless of the circumstances. Flip model presentation makes games smooth as butter.



fevgatos said:


> But we have derbauer and igorslab testing for gaming efficiency, the 12900k is up to 50% more efficient than the 5950x in games. You think they are biased as well? And that's not anything new, the same applied to comet lake, they also were pretty efficient at gaming. The only intel cpus that were worse than zen 3 in gaming were rocketlake. I even had one, they were pretty atrocious



Efficiency as in...? Because I don't see any 50% delta in performance or power consumption between the 12900K or KS and the 5950X. The 12700K is nice, but so is the 5900X. Alder Lake has better idle consumption because there's no IOD with a fixed power draw, but that's about it.


----------



## Deleted member 24505 (Jul 6, 2022)

Dr. Dro said:


> The 1080 Ti has seen better days, but if you dare bring that up.



Says the guy with the 3090. Still nothing wrong with a 1080 Ti; shame I don't have money to piss away on a 3090.


----------



## Dr. Dro (Jul 6, 2022)

Tigger said:


> Says the guy with the 3090. Still nothing wrong with a 1080ti, shame i don't have money to piss away on a 3090



I mean, I hope that didn't come out the wrong way, because I'm not knocking you for that one, brother. It's that @Mussels has a point: it's already an aging card that isn't capable of running some of the most intensive applications right now, and people will defend it because they are quite fond of it (and for good reasons, too!). I've had people scoff at me before when I called the 1080 Ti an aging design.

But when you want to test a latest-generation high-end processor, having a more modern card is surely handy. I wouldn't even say a 3090; if you want to see how much the processor weighs on your setup, I'd probably be looking at the 6900 XT, as that card is better for high-frame-rate gaming.


----------



## fevgatos (Jul 6, 2022)

ratirt said:


> Actually you did say that explicitly but anyway.


Yeah, I was wrong: the 7700K is better than the 3D.


ratirt said:


> 5800x3d is slower in some apps but it is supposedly be a gaming CPU and if you consider that as your main goal it is a very good CPU and its app performance don't suck.


And you can say the same thing about the 7700K.


ratirt said:


> That's still decent performance. 7700k was overwhelmed by the changing landscape with more cores and higher utilization of resources.


Overwhelmed? It's still faster in games than any other CPU from its era. What are you talking about?


ratirt said:


> Unlocked means nothing here since we are talking about AMD CPUs and these are different and this 'unlocked' literally means nothing in today's world with boosting frequency.


PBO alone increases all-core performance by 20-25%. That's not nothing.



ratirt said:


> With the pricing you have to always remember what happened literally not so long ago with Covid and price hikes. It was different times back then and I'm sure you do realize that. That's how quickly things have developed.


The 12700F was released during Covid as well. It doesn't cost a stupid amount like the 3D does.



ratirt said:


> Embarrassment? I literally don't know what you are talking about especially, 5900x owners can elaborate on that how much embarrassed they are with their choice. It is literally laughable what you say. Intel released 3 gens of CPUs to stay competitive and yet people choosing AMD 5000 series should be embarrassed?


You are trying to make this Intel vs. AMD; I'm not interested. I'm just saying the 3D is vastly slower than much, much cheaper SKUs. The price of that thing is ridiculous; at $250-300 MSRP it would be okay-ish.



Dr. Dro said:


> Efficiency as in...? Because I don't see any 50% delta in performance or power consumption from the 12900K or KS vs. the 5950X. The 12700K is nice, but so is the 5900X, too. Alder Lake has better idle consumption because no IOD with fixed power draw, but that's about it.


Run stock, pick a game of your choice with an in-game benchmark, run it at 720p, and post the results with FPS and power consumption. The 12900K will absolutely annihilate your 5950X when it comes to FPS per watt.

And this is the test from Igor'sLAB: at 720p, the 5950X consumes more than 50% more power per FPS compared to the 12900K.
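The "FPS per watt" metric being argued over here is simple enough to pin down. A short sketch; the numbers in the example are purely hypothetical illustrations, not measurements of any specific CPU.

```python
def fps_per_watt(avg_fps, avg_package_watts):
    """Gaming-efficiency metric: average frame rate divided by
    average CPU package power over the same benchmark run."""
    if avg_package_watts <= 0:
        raise ValueError("average power must be positive")
    return avg_fps / avg_package_watts

# Hypothetical figures only, for illustration:
# CPU A averages 200 FPS at 80 W; CPU B averages 180 FPS at 130 W.
cpu_a = fps_per_watt(200, 80)    # 2.5 FPS/W
cpu_b = fps_per_watt(180, 130)   # ~1.38 FPS/W
advantage = cpu_a / cpu_b - 1    # ~0.81, i.e. ~81% more efficient
```

The key caveat, as the thread itself shows, is that both numbers must come from the same run at the same settings; comparing one CPU's 4K capped run against another's 720p uncapped run makes the ratio meaningless.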


----------



## Colddecked (Jul 6, 2022)

fevgatos said:


> Nah, I'm not saying that. Im saying *the 7700k was better back in 2017 than the 3d is in 2022*. I dont even think that it is a controversial statement. They are both way way slower on the vast majority of applications not only compared to price competitors but also compared to way cheaper cpus. They are both good basically at just games and released on basically dead end platforms
> 
> But what makes the 7700k better is the fact that it was much cheaper, unlocked, had an  igpu and was at least beating the r7 1700 in lightly threaded /single threaded / latency dependant applications, not just games. The 3d gets embarrassed by much cheaper offerings like the 12600k and the 12700f or even amds own 5900x



The 7700K was a really bad CPU/socket generation. I owned one and used it in my main rig for quite some time. I mean, sure, it was OK in 2017, but less than a year later Intel released a new socket with 6- and 8-core potential. The X3D, meanwhile, actually beats the 12900KS in many games, as intended.


----------



## fevgatos (Jul 6, 2022)

Colddecked said:


> 7700k was a really bad cpu/socket generation.  I owned one and used it on my main rig for quite sometime.  I mean sure it was ok in 2017, but less than a year later intel releases a new socket with 6, 8 core potential.  the x3d actually beats the 12900ks in many games as its intended.


I'm completely in agreement; the 7700K was atrociously bad. But it also beat the 1800X in ALL games, not just some, and that still doesn't make it a good CPU. So why does the fact that the 3D beats the KS in some games make it a good CPU all of a sudden?


----------



## Colddecked (Jul 6, 2022)

fevgatos said:


> Im completely in agreement, the 7700k was atrociously bad. But it also beat the 1800x in ALL games, not just some. That still doesn't make it a good cpu. So why does the fact that 3d beats the ks in some games make it a good cpu all of a sudden?



Yeah, at the time it beat the 1800X in all games, but I wonder if that's still the case in ALL games now.

I mean, it can beat the KS by a lot in certain games while using half to a third of the power, depending on your KS overclock. It's not a CPU for everyone, for sure, but for people invested in AM4 who want a substantial gaming upgrade and care little about productivity, it's a compelling option.


----------



## Dr. Dro (Jul 6, 2022)

fevgatos said:


> Run stock, pick a game of your choice with ingame benchmark, run 720p and post the results with fps and power consumption. The 12900k will absolutely annihilate your 5950x when it comes to fps / wattage.



Uh, we're talking about sixteen-core high-performance desktop processors, the kind used in machines equipped with thousand-watt power supplies and liquid cooling (or at least high-end air). The higher idle socket power of MTS/VMR due to the IOD is frankly irrelevant in a real-world context. It looks bad on low/moderate-load charts, but the fact that it completely flips around under full load tells you a thing or two. The Zen 3 cores themselves are highly power-efficient. ADL is just as power-efficient, provided you don't do what the KS is designed to do, which is completely disregard efficiency in exchange for brute performance.

I'll grant your wish, even though I'll maintain my point that it's rather irrelevant: lower-end games and benchmarks don't leverage the multicore muscle of these processors, as you can see from the average active core count measured in HWiNFO here. When Superposition was first loading, it peaked at 3.4 cores active, and that coincided with the 10-watt per-core maximums. On a system with PBO enabled, mind you...


----------



## fevgatos (Jul 6, 2022)

Colddecked said:


> Yeah at the time, it beats the 1800x in all games, but I wonder if that's still the case in ALL games now.
> 
> I mean, it can beat the ks by alot in certain games while using 1/2~1/3rd the power depending on your ks overclock.  Its not a cpu for everyone for sure, but for people invested in AM4, if you wanted a substantial gaming upgrade and productivity matters little, its a compelling option.


It also loses to the KS by a LOT in certain other games. Personally, I think it's very unreasonable to compare it to the KS. Compare it to the 12700F, and you end up with a CPU that is 50% more expensive, gets embarrassed in most workloads, and wins by an average of 10% in 360p gaming.


----------



## Colddecked (Jul 6, 2022)

fevgatos said:


> It also loses from the ks by ALOT in other certain games. Personally I think it's very unreasonable to compare it to the ks. Compare it to the 12700f, and you end up with a CPU that is 50% more expensive, gets embarrassed in most workloads, and wins by an average of 10% in 360p gaming.



If you are building new, it doesn't really make sense. But like I said, if you are invested in AM4 already, have good DDR4, etc., then a drop-in upgrade like the X3D looks great.

AMD is milking early adopters, yes, but all your issues go away once AMD lowers the price.


----------



## fevgatos (Jul 6, 2022)

Dr. Dro said:


> Uh, we're talking about sixteen-core high-performance desktop processors, and the kind used on machines equipped with thousand-watt power supplies and on liquid cooling (or at least high-end air). The higher idle socket power of MTS/VMR due to the IOD is frankly irrelevant in a real-world context. It looks bad on low/moderate load charts, but the fact that it completely flips around under full load tells you a thing or two. The Zen 3 cores themselves are highly power efficient. ADL is just as power efficient, provided... you don't do what the KS is designed to do, that is, completely disregard efficiency in exchange for brute performance.
> 
> I'll grant your wish, even though I'll maintain my point, it's rather irrelevant, lower-end games and benchmarks don't leverage the multicore muscle of these processors as you can see in the average active core count measured in HWiNFO here. When Superposition was first loading it peaked at 3.4 cores active, and that coincided with the 10-watt maximums per-core. On a system with PBO enabled, mind you...
> 
> View attachment 253835


You may consider it irrelevant (I do too), but gaming efficiency was what I replied to when I said the 12900K is way more efficient than Zen 3. Haven't run Supo for a while; I'm no. 34 on 4K Optimized with a 3090 on air.





OK, I just ran the same test: peak wattage was 49.2 W, average was 41 W, and the score was 48K. So yeah, not only is the 12900K way faster, it also consumes way less...

HWiNFO shows 48.64 W peak, but during the run Afterburner peaked at 49.2 W. Anyway, it doesn't really make a difference.


----------



## Dr. Dro (Jul 6, 2022)

fevgatos said:


> You may consider it irrelevant (I do too) but gaming efficiency was what I replied to and said that the 12900k is way more efficient than zen 3. Haven't run supo for a while, Im no34 on 4k optimized with a 3090 on air
> 
> 
> Ok, just run the same test, peak wattage was 49.2w, average was 41 and the score was 48k. So yeah, not only is the 12900k way faster, it also consumes way less...
> ...



No pics, though? Also, see, my GPU usage is nowhere near maxed out; yours shouldn't be either.

I noticed you have a GameRock, which has a higher power limit than my card, but even then, GPU usage was not high to begin with.


----------



## fevgatos (Jul 6, 2022)

Dr. Dro said:


> No pics though? Also, see my GPU usage is nowhere near maxed out, yours shouldn't be either.
> 
> I noticed you have a Gamerock which has a higher power limit than my card, but even then, GPU usage was not high to begin with.


My GPU usage was 94%. Sorry for no pic; I don't expect anyone to lie about it, so I assumed people would assume I wouldn't either. Here is the screenshot.


----------



## Dr. Dro (Jul 6, 2022)

fevgatos said:


> My gpu usage was 94%. Sorry for no pic, I don't expect anyone to lie about it so I assumed people would assume I wouldn't either. Here is the screenshot
> 
> View attachment 253846



No, I'm not accusing you of lying, just wondering why your GPU usage seems much higher. Maybe my run is bunk, or it's something to do with the newer build of Windows. Neither the CPU nor the GPU was under any meaningful load, and I just considered your card's higher PL, so that makes sense. Either way, since we aren't comparing stock for stock, it's not really valid. Nice, though.


----------



## fevgatos (Jul 6, 2022)

Dr. Dro said:


> No, i'm not accusing you of lying, just wondering why. Your GPU usage seems much higher. Maybe my run is bunk or it's something to do with the newer build of Windows. Neither CPU or GPU was under any meaningful load, and I just considered your card's higher PL, so that makes sense. Either way since we aren't comparing stock for stock, it's not really valid. Nice though.


I have higher usage because the 12900K pushes the 3090 harder than the 5950X does. The power limit is irrelevant at this resolution; the 3090 never pulled more than 300 W. Yours should have been even lower, judging by the temperature.

There is no overclock in my run; with my OC it should actually score much higher, since I run 5.6 GHz on two cores.


----------



## Chrispy_ (Jul 6, 2022)

Tigger said:


> Pointless arguing with the AMD heads, they will constantly just try to prove you wrong. My 12700k hardly uses anything gaming i know that, whether it is over 2 hours or 5 mins. It might spike up sometimes but i would expect all CPUs to do that.
> 
> We own Intel ADL CPU's we are flogging a dead horse on TPU trying to defend them. Sometimes I think maybe i should just switch to AM4/5800x and join in the pitch forking.


W1zzard did that Alder Lake power-limit scaling article a few months back.
Single-threaded and low-thread-count stuff runs at practically full speed within PL1 (125 W).

It's only rendering and other workloads that push all threads to 100% which really drive the consumption up. Since most games don't push more than 4-6 threads, ADL runs at power levels similar to an AMD equivalent, whilst also being marginally faster.


----------



## AusWolf (Jul 6, 2022)

fevgatos said:


> Also a normal gaming scenario is with the gpu pegged at 100%, else you are having a cpu bottleneck...


This!

If your GPU isn't pegged at 100% usage with Vsync disabled and no FPS cap, it means it could do much more for you, but your graphics settings or your weak CPU won't allow it. Reviewers like using unrealistic settings to test pure CPU performance, but let's be honest, no one ever plays at 720p minimum. It's just another form of media sensationalism, nothing more. Completely pointless.


----------



## fevgatos (Jul 6, 2022)

AusWolf said:


> This!
> 
> If your GPU isn't pegged to 100% usage with Vsync disabled and no FPS cap, it means it could do much more for you, but your graphics settings, or your weak CPU won't allow it. Reviewers like using unrealistic settings to test for pure CPU performance, but let's be honest, no one ever plays at 720p minimum. It's just another form of media sensationalism, nothing more. Completely pointless.


Just finished 12 minutes of CoD at CPU-bound settings, 200 to 300 FPS; average consumption was 68.5 W and peak was 83.5 W, per HWiNFO64. But for some reason people think the 12900K consumes 300 W while gaming.


----------



## SOAREVERSOR (Jul 8, 2022)

AusWolf said:


> This!
> 
> If your GPU isn't pegged to 100% usage with Vsync disabled and no FPS cap, it means it could do much more for you, but your graphics settings, or your weak CPU won't allow it. Reviewers like using unrealistic settings to test for pure CPU performance, but let's be honest, no one ever plays at 720p minimum. It's just another form of media sensationalism, nothing more. Completely pointless.



This is sort of half true. The first question with games is "what are you playing?", and that's also the biggest one.

I've got a 2080 Super in one box and a 3090 in the other. Outside of very few single-player games, neither is going to hit 100%. If someone is playing Overwatch, or Quake, or the myriad of other games we play on 1080p 240 Hz monitors, those cards are never going to stretch their legs, and CPU performance at lower resolutions does matter. Playing single-player games like Doom Eternal or Elden Ring on 4K monitors or ultrawides (multiple monitors FTW!), that situation changes. But this is a usage-scenario issue.

The point of measuring absolute CPU performance is just that. There's nothing normal about a GPU pegged at 100% unless it's struggling or you are running some sort of non-gaming load on it. That sort of thinking is what leads to nonsense like hitting 1000 FPS in the StarCraft loading menu and saying "aha, I have found it!".

For me, with a 240 Hz monitor at 1080p, frame-locked to 240 FPS, I am never going to hit 100% GPU usage in the games I need those frame rates for. That reverses when playing SP games at 4K or higher.

I only peg my system when I'm running workloads in VMs.


----------



## fevgatos (Jul 8, 2022)

SOAREVERSOR said:


> This is sort of half true.  The first thing with games is "what are you playing?" and that's also the biggest one.
> 
> I've got a 2080 super in one box and a 3090 in the other.  Outside of very few single player games neither are going to hit 100%.   If someone is playing overwatch, or quake, or the myriad of other games we play on 1080p 240hz monitors they are never going to stretch their legs and CPU performance at lower resolutions does matter.   Now playing single player games like Doom Eternal or Elden Ring on 4k monitors or ultrawides (multiple monitors FTW!) that situation changes.  But this is a usage scenario issue.
> 
> ...


Not true. The whole goal is to have your GPU running at 100%, because most of the time it's the most expensive part of your build. If you are playing at 1080p, then you shouldn't have a 3090 in the first place; you should have a 3070, which would get you your 240 FPS while being fully or close to fully utilized.


----------



## SOAREVERSOR (Jul 8, 2022)

fevgatos said:


> Not true. The whole goal is to have your GPU running at 100%. That's because most of the time it's the most expensive part of your build. If you are playing at 1080 then you shouldn't have a 3090 in the first place, you should have a 3070 which would get you your 240 fps while being full or close to fully utilized.



Your usage case is not mine, imagine that!


----------



## fevgatos (Jul 8, 2022)

SOAREVERSOR said:


> Your usage case is not mine, imagine that!


Sure, your use case is playing at 1080p, that's why im saying you should get a card according to what you are doing. Usually a person that's going to play at ultra low resolution and is going to be primarily CPU bound will not buy a high end card, therefore his GPU usage will still be high.

I mean, I kept my 1080 Ti for as long as I had my 2560×1080 monitor; I only upgraded to a 3090 after I upgraded the monitor. What would be the point of getting a faster card and still ending up CPU bound? What are you even suggesting, that people build their computers trying to be CPU bound instead of GPU bound?


----------



## Deleted member 24505 (Jul 8, 2022)

Also, the only reason I still have a 1080 Ti is I don't have cash to piss away on expensive GPUs like some do. It still does a good job at 1440p, and I am definitely not overtaxing the CPU.


Why would anyone buy a 3090 to play at 1080p? Now that is pissing cash away.


----------



## SOAREVERSOR (Jul 9, 2022)

Tigger said:


> Also, the only reason I still have a 1080 Ti is I don't have cash to piss away on expensive GPUs like some do. It still does a good job at 1440p, and I am definitely not overtaxing the CPU.
> 
> 
> Why would anyone buy a 3090 to play at 1080p? Now that is pissing cash away.



Multiple monitors, child, multiple monitors. I do MP at 1080p 240 Hz, and SP games at 4K 60! Imagine that!


----------



## Mussels (Jul 9, 2022)

Tigger said:


> Also, the only reason I still have a 1080 Ti is I don't have cash to piss away on expensive GPUs like some do. It still does a good job at 1440p, and I am definitely not overtaxing the CPU.
> 
> 
> Why would anyone buy a 3090 to play at 1080p? Now that is pissing cash away.


High refresh rates, which is exactly what your CPU is the best choice for. I can buy 1080p 360 Hz monitors right now, and literally, what other GPU could drive them?


This is still a really weird argument to be having. If you're GPU limited, your CPU isn't using its maximum wattage. While it's totally fair to bring up your results with your setup, it's still not indicative of the platform as a whole, or of the experience other users will have (and this is why reviews do 720p testing, so that you get a baseline comparison without the GPU limitation).
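The bottleneck logic above can be sketched as a toy model: the system only renders as fast as its slowest component, which is why a review that lowers the resolution to 720p removes the GPU ceiling and exposes the CPU's real limit. All the numbers below are illustrative, not measured.

```python
# Toy bottleneck model: effective FPS is capped by the slower of the
# CPU's frame-preparation rate and the GPU's rendering rate.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The system can only render as fast as its slowest component."""
    return min(cpu_fps, gpu_fps)

# Suppose the CPU can prepare 300 frames/s regardless of resolution,
# while the GPU's rate drops as resolution rises (made-up figures):
gpu_fps_by_res = {"720p": 500.0, "1080p": 320.0, "4K": 90.0}

for res, gpu_fps in gpu_fps_by_res.items():
    fps = effective_fps(300.0, gpu_fps)
    limiter = "CPU" if gpu_fps > 300.0 else "GPU"
    print(f"{res}: {fps:.0f} fps ({limiter}-bound)")
```

With these assumed numbers, only the 720p run is CPU-bound, so only it reflects the CPU's actual headroom; at 1080p and 4K the GPU caps the result and different CPUs would score nearly the same.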


----------



## AusWolf (Jul 9, 2022)

Mussels said:


> This is still a really weird argument to be having. If you're GPU limited, your CPU isn't using its maximum wattage.


Your CPU will never use its maximum wattage in games, unless you're on a Core i3 or lower.

Extra CPU resources can always be used for something else (background tasks, work, etc), but if your GPU is under-utilised, then it was a waste of money.


----------



## Dr. Dro (Jul 9, 2022)

SOAREVERSOR said:


> Multiple monitors, child, multiple monitors. I do MP at 1080p 240 Hz, and SP games at 4K 60! Imagine that!



I'll kind of have to agree with @Tigger here; AMD's card does quite a bit better than ours at 1080p high refresh. Navi 21 is a raster monster.

The odd thing about the high-end Ampere cards is that you can scale up scene complexity and resolution and they'll take the hit, but at low resolutions they just don't pull their weight (or justify their cost), imo.


----------



## Deleted member 24505 (Jul 9, 2022)

If anyone wants to buy me a better card to balance out my system better feel free, until then it is what it is.



SOAREVERSOR said:


> Multiple monitors, child, multiple monitors. I do MP at 1080p 240 Hz, and SP games at 4K 60! Imagine that!


Child? Who do you think you are? Eat it and like it.

So with a better GPU, my CPU power use would be much higher? Why? Surely with a better GPU, that GPU would still be pegged near 100% usage while gaming, so the CPU would still not be running at a high %, and therefore not drawing high power or running hot. Surely that is the ideal situation for a gaming rig.

As it is now, my games don't run at as high an FPS as they could, since I'm only using the 1080 Ti, but it's still acceptable. With a balanced system, why would the CPU run at a high % while gaming if the GPU is doing the work, which is surely what you want? If the CPU is running at a high %, then it is not running ideally, is it?

I don't really get the point of high-refresh 1080p, but then I don't play the shitey (imo) games that people need it for.

Also, $1100 for a 24" monitor? Nice to have cash to piss away on that.


----------

