
Intel Core i9-12900KS

Joined
Feb 10, 2020
Messages
178 (0.11/day)
Wizz, thanks for being the perfect German :D ...

Even if this weren't a great review (your love for detail is just "adorable" in a very positive way ;)), I'd celebrate you "hard" just for this comment:

"In case you're wondering, this is a proper review, not an April Fools' prank."

:D ... Man, I only read it today because of time constraints, but this alone just made my day :). Thanks for the laugh :)
 
Joined
May 8, 2021
Messages
1,978 (1.69/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Waiting for my KS to arrive; I bought the 9900KS and never regretted it for a moment.
I'm not as keen on the recently announced 3090 Ti, though. I'm already gaming in my pants after a couple of hours running a 3090 OC - the heat is atrocious on a 3090, so a 3090 Ti will clearly be a lot worse, and then there's the power draw…
I think they need to start thinking very seriously about limiting power draw before governments legislate and ruin it - as they do everything else.
They're robbing us blind across Europe for electricity now, and this comes as British MPs get their electricity paid for by us taxpayers, on top of a £2,200 pay rise, so clearly it doesn't matter to them, but "we are all in it together", eh…
Ain't nobody robbing you in Europe, at least not everywhere. VAT was removed for electricity, and Lithuania said no to Russian gas; my own city makes more electricity than it consumes. Anyway, you can't expect a place that isn't resource-rich to make tons of electricity for cheap. And by the way, California made some restrictions on wattage, and it's totally okay. We don't need trash like the 3090 Ti; Jensen and Lisa gave zero fucks about heat output and power usage for too long, so it's all good that it bites them in the arse. Not sure about others, but I fucking hate it when a GPU is scorching my legs under the desk and is as loud as a leaf blower. If our future requires AC just to play games, then Jensen can stick his RTX 4090 up his arse.
 
Joined
Jun 14, 2020
Messages
3,010 (2.01/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Ain't nobody robbing you in Europe, at least not everywhere. VAT was removed for electricity, and Lithuania said no to Russian gas; my own city makes more electricity than it consumes. Anyway, you can't expect a place that isn't resource-rich to make tons of electricity for cheap. And by the way, California made some restrictions on wattage, and it's totally okay. We don't need trash like the 3090 Ti; Jensen and Lisa gave zero fucks about heat output and power usage for too long, so it's all good that it bites them in the arse. Not sure about others, but I fucking hate it when a GPU is scorching my legs under the desk and is as loud as a leaf blower. If our future requires AC just to play games, then Jensen can stick his RTX 4090 up his arse.
I don't understand your post. You don't need a 3090 Ti or a 4090 to play games. Unless I'm missing something, there are cards starting from 75 W TDP. Buy one of those? You know nobody is forcing you to buy a 300 or 500 watt GPU, right? Right??
 
Joined
Jan 14, 2019
Messages
10,650 (5.29/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I don't understand your post. You don't need a 3090 Ti or a 4090 to play games. Unless I'm missing something, there are cards starting from 75 W TDP. Buy one of those? You know nobody is forcing you to buy a 300 or 500 watt GPU, right? Right??
That's true, although midrange parts have also been creeping up in power consumption. The 960 and 1060 ate 120 W, the 2060 160 W and the 3060 180 W, if I remember right (not to mention that low-profile / no-power-connector options have completely disappeared). The same is true for CPUs. The 7700 I had just about fit into its 65 W TDP running at the advertised 4 GHz all-core, but my 11700 can only do 2.8 GHz in Cinebench R23 within the same 65 W limit. It needs liquid cooling and a good motherboard that can supply it with 160+ W to reach its factory turbo bins (it's a locked CPU, so overclocking isn't even in the picture). One can praise AMD for their recent innovations, but their chiplet design isn't easier to cool at all. I briefly tried an R5 3600, which surprisingly ran hotter than my 11700 with the same cooler and power limits. These are all midrange parts...
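
For anyone who wants to reproduce that kind of experiment, here's a minimal sketch of reading (and optionally lowering) the package power limit through Linux's intel-rapl powercap interface - the exact path and the 65 W figure are assumptions that vary per system, and writing needs root:

```python
# Minimal sketch: inspect/cap the CPU package power limit via intel-rapl.
# Illustrative only - domain paths differ between machines.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")  # package 0 on most systems

def read_uw(name: str) -> int:
    # RAPL files report plain integers in microwatts.
    return int((RAPL / name).read_text())

if __name__ == "__main__":
    domain = (RAPL / "name").read_text().strip()          # usually "package-0"
    pl1_w = read_uw("constraint_0_power_limit_uw") / 1e6  # long-term limit (PL1)
    print(f"{domain}: long-term power limit = {pl1_w:.0f} W")

    # Uncomment to cap PL1 at 65 W (requires root):
    # (RAPL / "constraint_0_power_limit_uw").write_text(str(65 * 1_000_000))
```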
 
Joined
Feb 20, 2020
Messages
9,340 (5.80/day)
Location
Louisiana
System Name Ghetto Rigs z490|x99|Acer 17 Nitro 7840hs/ 5600c40-2x16/ 4060/ 1tb acer stock m.2/ 4tb sn850x
Processor 10900k w/Optimus Foundation | 5930k w/Black Noctua D15
Motherboard z490 Maximus XII Apex | x99 Sabertooth
Cooling oCool D5 res-combo/280 GTX/ Optimus Foundation/ gpu water block | Blk D15
Memory Trident-Z Royal 4000c16 2x16gb | Trident-Z 3200c14 4x8gb
Video Card(s) Titan Xp-water | evga 980ti gaming-w/ air
Storage 970evo+500gb & sn850x 4tb | 860 pro 256gb | Acer m.2 1tb/ sn850x 4tb| Many2.5" sata's ssd 3.5hdd's
Display(s) 1-AOC G2460PG 24"G-Sync 144Hz/ 2nd 1-ASUS VG248QE 24"/ 3rd LG 43" series
Case D450 | Cherry Entertainment center on Test bench
Audio Device(s) Built in Realtek x2 with 2-Insignia 2.0 sound bars & 1-LG sound bar
Power Supply EVGA 1000P2 with APC AX1500 | 850P2 with CyberPower-GX1325U
Mouse Redragon 901 Perdition x3
Keyboard G710+x3
Software Win-7 pro x3 and win-10 & 11pro x3
Benchmark Scores Are in the benchmark section
Hi,
Why do I keep hearing voices :laugh:
 
Joined
Jun 22, 2012
Messages
277 (0.06/day)
Processor Intel i7-12700K
Motherboard MSI PRO Z690-A WIFI
Cooling Noctua NH-D15S
Memory Corsair Vengeance 4x16 GB (64GB) DDR4-3600 C18
Video Card(s) MSI GeForce RTX 3090 GAMING X TRIO 24G
Storage Samsung 980 Pro 1TB, SK hynix Platinum P41 2TB
Case Fractal Define C
Power Supply Corsair RM850x
Mouse Logitech G203
Software openSUSE Tumbleweed
The 7700 I had just about fit into its 65 W TDP running at the advertised 4 GHz all-core, but my 11700 can only do 2.8 GHz in Cinebench R23 within the same 65 W limit.

The i7-7700 was a 4-core processor, the i7-11700 an 8-core processor. Both were made on 14 nm lithography; performance can only improve so much without further shrinking transistor size and more fundamental architectural changes. Try seeing what happens by limiting the 11700 to 4 cores.
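
If you're on Linux, a rough way to try that is taking logical CPUs offline through sysfs - a sketch under the assumption of an 8C/16T part where cpuN and cpuN+8 are SMT siblings (check your actual topology first; this needs root):

```python
# Sketch: mimic a 4-core chip by parking cores. Verify sibling pairs in
# /sys/devices/system/cpu/cpu*/topology/thread_siblings_list first.
from pathlib import Path

def set_cpu_online(cpu: int, online: bool) -> None:
    # cpu0 usually cannot be taken offline, so never touch it.
    Path(f"/sys/devices/system/cpu/cpu{cpu}/online").write_text("1" if online else "0")

if __name__ == "__main__":
    # Park cores 4-7 plus their assumed SMT siblings 12-15,
    # leaving four full cores (0-3 + 8-11) running.
    for cpu in [4, 5, 6, 7, 12, 13, 14, 15]:
        set_cpu_online(cpu, False)
```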
 
Joined
Jan 14, 2019
Messages
10,650 (5.29/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
The i7-7700 was a 4-core processor, the i7-11700 an 8-core processor. Both were made on 14 nm lithography; performance can only improve so much without further shrinking transistor size and more fundamental architectural changes. Try seeing what happens by limiting the 11700 to 4 cores.
A good point, although GPU lithography has been shrinking too, and that isn't reflected in their power consumption. I guess Nvidia and AMD are trying to get so much performance out of everything they sell that efficiency gets thrown out of the window regardless of lithography.
 
Joined
May 8, 2021
Messages
1,978 (1.69/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
I dont understand your post. You dont need a 3090ti or a 4090 to play games. Unless im missing something, there are cards starting from 75w tdp. Buy one of those? You know nobody is forcing you to buy a 300 or 500 watt gpu, right? Right??
Pal, I don't think anyone needs that. 75 W parts are scarce; the last cards like that were the RX 560 and GTX 1650, and they're too old and too weak for me to care. Anyway, I already have the card I need; I'm just not terribly excited about the evolution of GPUs, as they tend to get worse and worse at everything other than raw performance. Performance per watt seems to be stagnant.
 
Last edited:
Joined
Jun 14, 2020
Messages
3,010 (2.01/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
That's true, although midrange parts have also been creeping up in power consumption. The 960 and 1060 ate 120 W, the 2060 160 W and the 3060 180 W, if I remember right (not to mention that low-profile / no-power-connector options have completely disappeared). The same is true for CPUs. The 7700 I had just about fit into its 65 W TDP running at the advertised 4 GHz all-core, but my 11700 can only do 2.8 GHz in Cinebench R23 within the same 65 W limit. It needs liquid cooling and a good motherboard that can supply it with 160+ W to reach its factory turbo bins (it's a locked CPU, so overclocking isn't even in the picture). One can praise AMD for their recent innovations, but their chiplet design isn't easier to cool at all. I briefly tried an R5 3600, which surprisingly ran hotter than my 11700 with the same cooler and power limits. These are all midrange parts...
You can always buy a lower-end model, though. I mean, if you had a 1060 and you want to remain at the 120 W TDP, you don't need to buy a 3060; you can go for the 3050 (that's also 120 W TDP). I think the problem is that people want the performance of a 3090 with the TDP of a 3050, which of course can't happen, and I don't think that's Jensen's or Lisa's fault.

Performance per watt seems to be stagnant.
I don't think that's true. There isn't huge progress, but there is progress. For example, a 6600 is 30% faster than a 1070 at a lower TDP, and the 6600 XT is 60% faster at the same TDP. So we have a 60% increase in performance per watt compared to 2017. Not great, not terrible either.
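
As a back-of-the-envelope check (board powers are my assumptions: 1070 ~150 W, 6600 ~132 W, 6600 XT ~160 W; performance normalized to the 1070):

```python
# Rough perf/W comparison behind the claim above.
cards = {
    "GTX 1070":   {"rel_perf": 1.0, "watts": 150},
    "RX 6600":    {"rel_perf": 1.3, "watts": 132},
    "RX 6600 XT": {"rel_perf": 1.6, "watts": 160},
}

base = cards["GTX 1070"]["rel_perf"] / cards["GTX 1070"]["watts"]
for name, c in cards.items():
    ratio = (c["rel_perf"] / c["watts"]) / base
    print(f"{name}: {ratio:.2f}x the 1070's performance per watt")
# -> 6600 ~1.48x, 6600 XT ~1.50x: closer to a 50% perf/W gain at these
#    assumed board powers, and 60% only if the TDPs really were identical.
```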
 
Joined
May 31, 2016
Messages
4,383 (1.48/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500GB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
You can always buy a lower-end model, though. I mean, if you had a 1060 and you want to remain at the 120 W TDP, you don't need to buy a 3060; you can go for the 3050 (that's also 120 W TDP). I think the problem is that people want the performance of a 3090 with the TDP of a 3050, which of course can't happen, and I don't think that's Jensen's or Lisa's fault.
That's not the problem. From what people are saying and are concerned about, the problem is that each new gen offers more performance, but the power is up as well. Basically, the nodes are more advanced and keep shrinking, but the power consumption grows. Where you once had 75 W cards capable of playing games at a decent resolution and detail level, now you need twice that much power for gameplay worth your attention.
I don't think that's true. There isn't huge progress, but there is progress. For example, a 6600 is 30% faster than a 1070 at a lower TDP, and the 6600 XT is 60% faster at the same TDP. So we have a 60% increase in performance per watt compared to 2017. Not great, not terrible either.
You should compare graphics cards from the same manufacturer but different architectures, as that is the indicator of power consumption progress, or performance-per-watt progress, with every new architecture a company releases. If you take Nvidia's 1070, compare it to Turing or Ampere, not to AMD's RDNA2, since that's a different company with totally different products.
 
Joined
Jun 22, 2012
Messages
277 (0.06/day)
Processor Intel i7-12700K
Motherboard MSI PRO Z690-A WIFI
Cooling Noctua NH-D15S
Memory Corsair Vengeance 4x16 GB (64GB) DDR4-3600 C18
Video Card(s) MSI GeForce RTX 3090 GAMING X TRIO 24G
Storage Samsung 980 Pro 1TB, SK hynix Platinum P41 2TB
Case Fractal Define C
Power Supply Corsair RM850x
Mouse Logitech G203
Software openSUSE Tumbleweed
What exactly are the detail levels and resolutions that older 75 W GPUs were capable of, and at what framerates? I think the view that there has been no improvement at the very least disregards that commonly used desktop display resolutions have increased over the years, along with the general demand for graphical detail.

I find it highly unlikely that nodes shrunk without corresponding efficiency increases. There's no question that peak GPU power has been steadily increasing, but people need to tweak the power limit not just upward for overclocking, but also downward for efficient operation, and ask reviewers to test this rather than only peak performance as usual. Same story as with CPUs, really.
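
For NVIDIA cards the downward tweak is essentially a one-liner; here's a sketch wrapping the nvidia-smi CLI (assumes nvidia-smi is on PATH; setting the limit needs admin rights and must stay within the board's allowed range):

```python
# Sketch: query the current power limit/draw and optionally lower it.
import subprocess

def query_power() -> str:
    return subprocess.run(
        ["nvidia-smi", "--query-gpu=name,power.limit,power.draw",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

def set_power_limit(watts: int) -> None:
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

if __name__ == "__main__":
    print(query_power())
    # set_power_limit(150)  # e.g. run a 170 W card at 150 W, then re-benchmark
```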
 
Joined
May 8, 2021
Messages
1,978 (1.69/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
I don't think that's true. There isn't huge progress, but there is progress. For example, a 6600 is 30% faster than a 1070 at a lower TDP, and the 6600 XT is 60% faster at the same TDP. So we have a 60% increase in performance per watt compared to 2017. Not great, not terrible either.
There's such a comparison (even if it's flawed, since you're comparing an ancient card with a brand new one; compare RDNA with RDNA2, or Polaris with RDNA2), and then there's the "what can I buy" side of things. The current lineup has disproportionately many high-wattage cards and too few low-wattage ones. AMD doesn't even have anything at 50 or 75 watts; nVidia doesn't have anything below 130 watts. I know it's personal, but my upper TDP limit is around 130 watts, and I also like 100 watt cards. That used to be the xx60 model's power usage. AMD only has the 6600; the 6500 XT could have been okay in perf/watt if it weren't a recycled laptop chip that somehow lacks features cards from the early 2010s had. The only choice I have is either the 3050 or the 6600. That's not much of a choice. And the 3050 sucks donkey balls: it's slower than the 6600, uses more power, and costs the same or more. The 6600 is nothing special either; frankly, it's just an okay card at too high a price. I remember buying an RX 560 4GB; in its day it ran games at 1440p 50-60 fps on medium-high settings, cost 150 Euros, and consumed just 37 watts (since I accidentally got the version without the 6-pin connector). Imagine doing that today; it's not the same anymore. Your only option would be a detuned 1650.

What exactly are the detail levels and resolutions that older 75 W GPUs were capable of, and at what framerates?
It depends, but 7000 series Radeons in their day were capable of 1080p medium-high at 45-50 fps. The RX 560 I have is the 37.5 watt model, and when it was new it could run CoD WW2 and GTA 5 at 1440p medium-high with 50-60 fps. The 1050 is basically an RX 560, but a little faster and with only 2 GB of VRAM. Still, it could have been a decent 1080p card. They certainly weren't awful e-waste.

I think the view that there has been no improvement at the very least disregards that commonly used desktop display resolutions have increased over the years, along with the general demand for graphical detail.
That's a bit of a farce. Even 5 years ago, nobody running games at ultra was really thinking "yeah, this game looks like ass, I want more". At this point graphics quality improvements barely matter anymore, and it's hard to tell high from ultra, sometimes even medium from ultra. Games don't look bad and haven't looked bad for pretty much a decade now, so that's not terribly important. On a lower budget, you're happy to run games at medium, not ultra or high, so that's an irrelevant argument too. And it's a farce mostly because devs never ask us if we want that, they just "improve". At this point you get more and more bloat rather than legit improvements to graphics or anything else. My main reason for upgrading my last card wasn't that it sucked, but that the lack of talent among developers left me no other choice. It's so bad that I can run a game from a decade ago at, say, high settings at 1440p with the RX 560, and it looks good and runs well, but with the same card today, the latest AAA title runs at 900p low with barely acceptable fps. I don't see how that's better or improved. Sounds like bullshit to me. Even if the current AAA title ran well at the same resolution, I don't think I could tell that the new game at low settings actually looks better than the old one at high settings. If anyone actually cared about games running well, nVidia and AMD would be out of business.
 
Joined
Jul 19, 2016
Messages
479 (0.16/day)
There will be some poor sap out there who will pair this 300 W CPU with a 450 W 3090 Ti and wonder why he's sitting there sweating when loading up the menu of Fortnite or some other life-wasting mainstream game.
 
Joined
May 31, 2016
Messages
4,383 (1.48/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500GB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
What exactly are the detail levels and resolutions that older 75 W GPUs were capable of, and at what framerates? I think the view that there has been no improvement at the very least disregards that commonly used desktop display resolutions have increased over the years, along with the general demand for graphical detail.

I find it highly unlikely that nodes shrunk without corresponding efficiency increases. There's no question that peak GPU power has been steadily increasing, but people need to tweak the power limit not just upward for overclocking, but also downward for efficient operation, and ask reviewers to test this rather than only peak performance as usual. Same story as with CPUs, really.
Maybe that is another thing you can compare? Why do you ask me about it? I'm just saying, and I agree with others, that there were cards you could use to play games at a decent resolution and detail level (for the time).
With a node shrink, you can spend that node on more power and performance rather than efficiency. That seems to be the trend for graphics cards: a large emphasis on the performance increase rather than on efficiency or a balance of the two.
If you think of performance as an improvement, then sure, it has increased, no doubt, but the power consumption has increased as well.
 
Joined
Jun 14, 2020
Messages
3,010 (2.01/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I know it's personal, but my upper TDP limit is around 130 watts, and I also like 100 watt cards.
They say vote with your wallet. You bought an RX 580 that draws 180 W instead of a 1060 that draws 120 W. If everyone who cares about power consumption did what you did, no wonder there aren't many low-power cards, right? I mean, why would Nvidia keep the xx60 series at 120 W when even people who care about power consumption (like you) opted for an RX 580 instead?

There will be some poor sap out there who will pair this 300 W CPU with a 450 W 3090 Ti and wonder why he's sitting there sweating when loading up the menu of Fortnite or some other life-wasting mainstream game.
I'm pretty sure my 12900K doesn't draw 300 W on a Fortnite loading screen. It barely hits 80-90 W during games, so yeah... NOPE.
 
Joined
May 8, 2021
Messages
1,978 (1.69/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
They say vote with your wallet. You bought an RX 580 that draws 180 W instead of a 1060 that draws 120 W. If everyone who cares about power consumption did what you did, no wonder there aren't many low-power cards, right? I mean, why would Nvidia keep the xx60 series at 120 W when even people who care about power consumption (like you) opted for an RX 580 instead?
Because I'm not Mister Moneybags, mate. I saw a good deal and took it, but here's the thing: I was also willing to modify my card's vBIOS to my liking, and now it's capped to 100 watts and an 1100 MHz core clock. In practice, the card's actual power consumption varies and is 70-90 watts in gaming. With such a tune, I can only reach 100 watt power usage in very specific mining or distributed computing situations, maybe in Furmark too. Also, my specific card wasn't 180 watts; it's a 145 watt card from the factory, and in most conditions it actually consumed only 130-140 watts. Why does that happen? Because the card hits a clock speed and voltage wall, and games never utilize GPU resources ideally. In case you didn't know, Polaris cards are also phenomenal undervolters; I tried that out but didn't stick with it - there are good gains, but they alone weren't enough for me. So I ended up with a card that is faster than an RX 570 but slower than an RX 480, while consuming less power than a GTX 1060; it's slightly slower than the GTX 1060, but the performance per watt is higher. So yeah, I'm a fucking menace to the free market and to reasonable cards. Anyway, where I live, the RX 580 I bought was selling for 210 Euros new, while the GTX 1060 6GB was going for over 300 Euros. As you can see, I don't like being robbed by Jen-chan and took proper action against it. Unfortunately, AMD clamped down on vBIOS modifications and you can't do it in a reasonable way anymore, so next time I'll care much more about the official wattage rating, and I won't buy the lowest-end model of a card either.
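
(For what it's worth, on Linux a similar 100 W cap can be set in software through the amdgpu driver's hwmon interface - a sketch, assuming the usual power1_cap file in microwatts; paths vary per system and writing needs root:)

```python
# Sketch: read/set the amdgpu board power cap without touching the vBIOS.
from pathlib import Path

def find_power_cap(card: str = "card0") -> Path:
    hwmon = Path(f"/sys/class/drm/{card}/device/hwmon")
    return next(hwmon.glob("hwmon*/power1_cap"))  # value is in microwatts

if __name__ == "__main__":
    cap = find_power_cap()
    print(f"current cap: {int(cap.read_text()) / 1e6:.0f} W")
    # cap.write_text(str(100 * 1_000_000))  # cap the board at 100 W (root)
```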

Edit:
A bit off-topic, but my whole PC, with the CPU and GPU mining at the same time, uses only 230 watts or less. In gaming it uses 170-210 watts. I think I managed to reach 250-260 watts with Prime95 small FFTs and Furmark at the same time. At idle it sips 45-50 watts. When turned off, it consumes 0.2 watts. Come on, dude, your i9 uses more power than my whole PC; you have some audacity to knock its power usage.
 
Joined
Jun 14, 2020
Messages
3,010 (2.01/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Because I'm not Mister Moneybags, mate. I saw a good deal and took it, but here's the thing: I was also willing to modify my card's vBIOS to my liking, and now it's capped to 100 watts and an 1100 MHz core clock. In practice, the card's actual power consumption varies and is 70-90 watts in gaming. With such a tune, I can only reach 100 watt power usage in very specific mining or distributed computing situations, maybe in Furmark too. Also, my specific card wasn't 180 watts; it's a 145 watt card from the factory, and in most conditions it actually consumed only 130-140 watts. Why does that happen? Because the card hits a clock speed and voltage wall, and games never utilize GPU resources ideally. In case you didn't know, Polaris cards are also phenomenal undervolters; I tried that out but didn't stick with it - there are good gains, but they alone weren't enough for me. So I ended up with a card that is faster than an RX 570 but slower than an RX 480, while consuming less power than a GTX 1060; it's slightly slower than the GTX 1060, but the performance per watt is higher. So yeah, I'm a fucking menace to the free market and to reasonable cards. Anyway, where I live, the RX 580 I bought was selling for 210 Euros new, while the GTX 1060 6GB was going for over 300 Euros. As you can see, I don't like being robbed by Jen-chan and took proper action against it. Unfortunately, AMD clamped down on vBIOS modifications and you can't do it in a reasonable way anymore, so next time I'll care much more about the official wattage rating, and I won't buy the lowest-end model of a card either.

Edit:
A bit off-topic, but my whole PC, with the CPU and GPU mining at the same time, uses only 230 watts or less. In gaming it uses 170-210 watts. I think I managed to reach 250-260 watts with Prime95 small FFTs and Furmark at the same time. At idle it sips 45-50 watts. When turned off, it consumes 0.2 watts. Come on, dude, your i9 uses more power than my whole PC; you have some audacity to knock its power usage.
I'm not knocking your PC's power usage; I'm knocking you pretending to care about it when you buy an inefficient card. Yes, you can tune it, but so can you tune the 1060, and you're back to square one. I don't know about shops in your country, but the 1060 was always cheaper than the RX 580, since it's also a year older. The 580 came out in 2017; by that time, 1060s in my country were 230-250.
 
Joined
May 8, 2021
Messages
1,978 (1.69/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
I'm not knocking your PC's power usage; I'm knocking you pretending to care about it when you buy an inefficient card. Yes, you can tune it, but so can you tune the 1060, and you're back to square one. I don't know about shops in your country, but the 1060 was always cheaper than the RX 580, since it's also a year older. The 580 came out in 2017; by that time, 1060s in my country were 230-250.
The RX 580 is just the better card overall. Efficiency matters, but so do price and longevity. The 1060 has 6GB of VRAM, the RX 580 has 8GB, and it will last longer. The 1060 was stupidly uneconomical to buy; again, the RX 580 wins. The 1060 had a slightly lower stock TDP, so the 1060 wins there, but you make it sound like the RX 580 is complete garbage in that respect. The difference is 30 watts. Not nothing, but not a lot either. If you claim you aren't knocking my computer's power usage, how come you argue so much about the 1060? Especially when I clearly state that it was stupidly priced and uncompetitive. Also, the 1060 didn't really come out earlier; the RX 580 was a refreshed RX 480, which came out in 2016. There's no architectural difference, just slightly better yields, with efficiency thrown out of the window. I managed to make an RX 480's vBIOS work on the RX 580, but figured that even the RX 480 wasn't efficient enough for me.

If you still haven't got the memo, I don't care about your country and its prices; I don't live there. You didn't even write which currency those numbers are in. It was what it was in Lithuania, and the only 1060 you could buy for a similar price to the 580 was that 3GB scam version, which was DOA. As if the VRAM situation wasn't insulting enough, they also cut down the core. Again, it was uncompetitive with the RX 570, let alone the RX 580.
 
Joined
Jan 14, 2019
Messages
10,650 (5.29/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I find it highly unlikely that nodes shrunk without corresponding efficiency increases. There's no question that peak GPU power has been steadily increasing, but people need to tweak the power limit not just upward for overclocking, but also downward for efficient operation, and ask reviewers to test this rather than only peak performance as usual. Same story as with CPUs, really.
GTX 1080: 16 nm, 180 W.
RTX 2070: 12 nm, 175 W.
RTX 3060: 8 nm, 170 W.

All three offer relatively the same performance with relatively the same power requirement. Where's the advantage of the shrunk node? Why should I tweak my power levels when graphics cards offered decent efficiency out of the box a couple of years ago?

What exactly are the detail levels and resolutions that older 75 W GPUs were capable of, and at what framerates? I think the view that there has been no improvement at the very least disregards that commonly used desktop display resolutions have increased over the years, along with the general demand for graphical detail.
My 1050 Ti kicked ass with passive cooling and no power connector. Which current-gen midrange graphics card can do the same?
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.08/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Philips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
How is this worth it to anybody other than people whose literal job is benchmarking and overclocking?

GTX 1080: 16 nm, 180 W.
RTX 2070: 12 nm, 175 W.
RTX 3060: 8 nm, 170 W.

All three offer relatively the same performance with relatively the same power requirement. Where's the advantage of the shrunk node? Why should I tweak my power levels when graphics cards offered decent efficiency out of the box a couple of years ago?


My 1050 Ti kicked ass with passive cooling and no power connector. Which current-gen midrange graphics card can do the same?
The 3060 is a good 20-25% faster than the 1080 - 25% faster at the same power consumption is exactly what you want from a product two generations newer: an improvement.
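
Putting rough numbers on that (the 20-25% figure is taken from the post; board powers assumed at 170 W vs 180 W):

```python
perf_gain = 1.25                    # upper end of "20-25% faster"
ppw_gain = perf_gain * (180 / 170)  # faster AND slightly lower power
print(f"perf/W vs GTX 1080: ~{ppw_gain:.2f}x")  # ~1.32x over two generations
```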
 
Joined
Jan 14, 2019
Messages
10,650 (5.29/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
The 3060 is a good 20-25% faster than the 1080 - 25% faster at the same power consumption is exactly what you want from a product two generations newer: an improvement.
That's nothing considering that the 1060 6 GB is nearly twice as fast as the 960 with the same power consumption - THIS is what I want from a product ONE generation newer.

Or let's just continue playing the "20% improvement" game so I can keep my 2070 longer. :D
 
Joined
Mar 21, 2016
Messages
2,364 (0.78/day)
It's the high-end halo cards for GPUs that represent the biggest problem, just like the 12900KS does for CPUs. The die space required, the worse yields in relation to die size, and, in the case of GPUs, the additional VRAM and other circuitry requirements all diminish the likelihood of adequate supply and drive up the cost of the lower-tier brackets of cards below them, which could otherwise be produced more affordably and abundantly. Fewer SKUs and quicker architecture turnaround in product cycles, with fewer out-of-hand halo-tier products, are what's needed. Standards to follow, adhere to, and stick with are overdue. It's high time we see higher efficiency standards for GPUs, CPUs, and PSUs across the spectrum, toward something more sustainable, environmentally friendly, and broadly accessible.

GPUs should go through a generation of performance progress followed by a generation of efficiency progress, on a tick-tock cycle. Don't exceed the previous generation's power draw in the subsequent generation; use that one as an optimization-focused generation, which would also bring costs down for everyone and improve availability. It's more environmentally sound and better from a societal fairness standpoint as well. The tech industry doesn't need to emulate the muscle car industry. There have already been enough consequences from poor decision-making in the auto industry; the tech industry doesn't need to compound that problem with its own set of misguided decisions.
 
Joined
Jun 14, 2020
Messages
3,010 (2.01/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
The RX 580 is just the better card overall. Efficiency matters, but so do price and longevity. The 1060 has 6GB of VRAM, the RX 580 has 8GB, and it will last longer. The 1060 was stupidly uneconomical to buy; again, the RX 580 wins. The 1060 had a slightly lower stock TDP, so the 1060 wins there, but you make it sound like the RX 580 is complete garbage in that respect. The difference is 30 watts. Not nothing, but not a lot either. If you claim you aren't knocking my computer's power usage, how come you argue so much about the 1060? Especially when I clearly state that it was stupidly priced and uncompetitive. Also, the 1060 didn't really come out earlier; the RX 580 was a refreshed RX 480, which came out in 2016. There's no architectural difference, just slightly better yields, with efficiency thrown out of the window. I managed to make an RX 480's vBIOS work on the RX 580, but figured that even the RX 480 wasn't efficient enough for me.

If you still haven't got the memo, I don't care about your country and its prices; I don't live there. You didn't even write which currency those numbers are in. It was what it was in Lithuania, and the only 1060 you could buy for a similar price to the 580 was that 3GB scam version, which was DOA. As if the VRAM situation wasn't insulting enough, they also cut down the core. Again, it was uncompetitive with the RX 570, let alone the RX 580.
https://tpucdn.com/review/msi-rx-580-mech-2/images/power_average.png

TechPowerUp shows the 580 at 198 W and the 1060 at 116 W. The difference is huge. I know because I bought a 1060 specifically for that reason; it had way lower power consumption. Both cards are too slow for the RAM to make any difference, but whatever.

As I've said, people vote with their wallet. Nvidia offered you almost twice the efficiency and you ignored it, so yeah, it makes sense they got the memo that you don't care about efficiency.
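
In numbers, and assuming the two cards land within a few percent of each other in games (the premise of the argument here), those TPU averages work out to:

```python
# Perf/W gap implied by TPU's average gaming power figures quoted above.
rx580_w, gtx1060_w = 198, 116
print(f"1060 perf/W advantage: ~{rx580_w / gtx1060_w:.2f}x")  # ~1.71x
```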
 
Joined
Jun 22, 2012
Messages
277 (0.06/day)
Processor Intel i7-12700K
Motherboard MSI PRO Z690-A WIFI
Cooling Noctua NH-D15S
Memory Corsair Vengeance 4x16 GB (64GB) DDR4-3600 C18
Video Card(s) MSI GeForce RTX 3090 GAMING X TRIO 24G
Storage Samsung 980 Pro 1TB, SK hynix Platinum P41 2TB
Case Fractal Define C
Power Supply Corsair RM850x
Mouse Logitech G203
Software openSUSE Tumbleweed
GTX 1080: 16 nm, 180 W.
RTX 2070: 12 nm, 175 W.
RTX 3060: 8 nm, 170 W.

All three offer relatively the same performance with relatively the same power requirement. Where's the advantage of the shrunk node? Why should I tweak my power levels when graphics cards offered decent efficiency out of the box a couple of years ago?
Transistor count has steadily increased, features have increased, VRAM has increased. It's not really a fair comparison, considering that there's no real power constraint nor demand for one on desktop systems. The desktop RTX 3060 is still about 20% faster than the GTX 1080, as already mentioned above.

If, on the other hand, you check out the mobile versions (where clocks are lower, allowing for more efficient operation due to a practical need for lower power), it becomes clearer that smaller nodes in principle lead to better efficiency, which should be an obvious statement anyway:

GTX 1080 Mobile: 150W
RTX 2070 Mobile: 115W
RTX 3060 Mobile: 80W

These should actually all be within a few percent of each other performance-wise.
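
Taking that premise at face value (equal performance, only the wattage changes), the mobile numbers work out to:

```python
# Perf/W implied by the mobile figures above, 1080 Mobile as the baseline.
mobile = {"GTX 1080 Mobile": 150, "RTX 2070 Mobile": 115, "RTX 3060 Mobile": 80}
base = mobile["GTX 1080 Mobile"]
for name, watts in mobile.items():
    print(f"{name}: ~{base / watts:.2f}x perf/W")
# -> 2070 ~1.30x, 3060 ~1.88x: the node shrink shows up once power is constrained.
```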

Why should I tweak my power levels when graphics cards offered decent efficiency out of the box a couple of years ago?
Because most desktop gamers don't care as long as power remains within reasonable levels, and manufacturers have realized this. There's no need to artificially gimp performance when end users can do that themselves if they want or need to.

My 1050 Ti kicked ass with passive cooling and no power connector. Which current-gen midrange graphics card can do the same?
Once current midrange GPUs reach the same inflation-adjusted price as the midrange GPUs from when your 1050 Ti was released, low-power, passive GPUs might start appearing as well.

Until then, this won't make economic sense for either manufacturers or end users, also given that the latter can adjust power themselves and probably already run their cards passively or semi-passively, given the massive coolers they generally come with nowadays.

That's nothing considering that the 1060 6 GB is nearly twice as fast as the 960 with the same power consumption - THIS is what I want from a product ONE generation newer.
A 100% improvement after one generation is never going to happen (or: never again, if you were referring to some cases from the late 1990s to early 2000s).
 
Last edited:
Joined
Jun 14, 2020
Messages
3,010 (2.01/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
GTX 1080: 16 nm, 180 W.
RTX 2070: 12 nm, 175 W.
RTX 3060: 8 nm, 170 W.

All three offer relatively the same performance with relatively the same power requirement. Where's the advantage of the shrunk node? Why should I tweak my power levels when graphics cards offered decent efficiency out of the box a couple of years ago?
It's a little bit of an unfair comparison, if only because the 3060 is probably the worst Nvidia offering. It's just a bad card. If you compare the 3060 Ti to a 1080 Ti, the difference is massive. The 1080 Ti consumes 25% more while being 15% slower. That's without even including all the goodies of the 3060 Ti (DLSS / RT etc.). So yeah...

Even a 3070 is around 35-40% faster at a lower TDP.
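
The same arithmetic as earlier, applied to the 3060 Ti vs 1080 Ti figures above (both taken as given rather than measured here):

```python
# 3060 Ti vs 1080 Ti: ~15% slower and ~25% more power for the 1080 Ti.
rel_perf_1080ti, rel_power_1080ti = 0.85, 1.25   # 3060 Ti = 1.0 on both
ppw_advantage = (1.0 / rel_perf_1080ti) * rel_power_1080ti
print(f"3060 Ti perf/W advantage: ~{ppw_advantage:.2f}x")  # ~1.47x
```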
 