
Intel Core i9-12900KS

Joined
Jan 14, 2019
Messages
10,988 (5.38/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Transistor count has steadily increased, features have increased, VRAM has increased. It's not really a fair comparison, considering that there's no real power constraint nor demand for one on desktop systems. The desktop RTX 3060 is still about 20% faster than the GTX 1080, as already mentioned above.
Features and VRAM have little to do with the whole card's power consumption, as long as you have the same number of RAM chips of the same kind. Wait, the 1060 actually has more than the 960. ;)

If, on the other hand, you check out the mobile versions (where clocks are lower, allowing for more efficient operation due to a practical need for lower power), it becomes clearer that smaller nodes in principle lead to better efficiency, which should be an obvious statement anyway:

GTX 1080 Mobile: 150 W
RTX 2070 Mobile: 115 W
RTX 3060 Mobile: 80 W

Performance-wise, these should all be within a few percent of each other.
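A back-of-the-envelope sketch of what that implies for perf/W (assuming the three really do perform the same, which is an assumption, not a benchmark; the TDPs are the ones listed above):

Code:
# Back-of-the-envelope perf/W, assuming roughly equal performance
# for all three mobile parts (an assumption, not a benchmark).
tdp_w = {
    "GTX 1080 Mobile": 150,
    "RTX 2070 Mobile": 115,
    "RTX 3060 Mobile": 80,
}
baseline = tdp_w["GTX 1080 Mobile"]
for card, watts in tdp_w.items():
    # With equal performance, perf/W scales as 1 / power.
    print(f"{card}: {watts} W -> {baseline / watts:.2f}x the perf/W of the 1080 Mobile")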
Then there should be no reason for nvidia and AMD to shoot their desktop cards' TDPs through the roof. Sure, the extra 5% will convince people who still believe that 5% is visible while one's focus is on the game, but it also portrays their products as inefficient pieces of garbage that need gigantic coolers to run at acceptable temperatures... unless the same "all for 5%" people also believe that bigger is always better.

Because most desktop gamers do not care as long as power remains within reasonable levels, and manufacturers have realized this. There's no need to artificially gimp performance when end-users can do that themselves if they want or need to.
I guess I'm in the minority with my love for small form factor / passively cooled hardware. I'm happier to see a modern game run on an iGPU or old / low profile PC than to see a hundred core CPU with a 3090 in action.

Once current midrange GPUs reach the same inflation-adjusted price as the midrange GPUs of when your 1050 Ti was released, low-power, passive GPUs might start appearing as well.
I hope you're right.

Until then, this won't make economic sense for either manufacturers or end users, especially given that the latter can adjust power themselves and can probably already run their cards passively or semi-passively, considering the massive coolers they generally come with nowadays.
Actually, I think we live in a time when it makes perfect sense. Low-power cards need less electricity, which isn't only good for the green movement, but also counteracts rising energy prices. They also need smaller heatsinks, which are cheaper to manufacture. Nvidia / AMD shouldn't blame the high price of their products on resource costs when they themselves design them to need bigger coolers and beefier VRMs than they practically should. I mean, sure, copper and aluminium are expensive, but who said you must have 5 kg of them on your graphics card when, with a little (factory) tweaking, it would work just fine with a lot less? Also, having fewer fans (or no fans) on your graphics card significantly decreases the chance of failure, which also decreases the amount of e-waste on the planet.

100% improvement after one generation is never going to happen (or: never again, if you were referring to some cases from the late 1990s-early 2000s)
Then maybe generations shouldn't come so soon after each other, either. At least to me, it doesn't make any sense to release a product that's barely better than the last one.

It's a little bit of an unfair comparison, only because the 3060 is probably the worst Nvidia offering. It's just a bad card. If you compare the 3060ti to a 1080ti, the difference is massive. The 1080ti consumes 25% more while being 15% slower. That's without even including all the goodies of the 3060ti (dlss / rt etc.). So yeah...

Even a 3070 is around 35-40% faster at a lower TDP.
True. Though x80 Ti cards have always been halo products. It's just that nvidia decided to change that to x90 with Ampere, or decided to release several halo products within the same generation (whichever way one looks at it). Also, you're comparing across two generations again. If I saw the same improvement coming from Turing, I'd say it's great.

Not that I'm complaining, though. Like I said, with baby steps like these, I can probably keep my 2070 for a long time. :)
 
Last edited:
Joined
Jun 14, 2020
Messages
3,275 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
True. Though x80 Ti cards have always been halo products. It's just that nvidia decided to change that to x90 with Ampere, or decided to release several halo products within the same generation (whichever way one looks at it). Also, you're comparing across two generations again. If I saw the same improvement coming from Turing, I'd say it's great.

Not that I'm complaining, though. Like I said, with baby steps like these, I can probably keep my 2070 for a long time. :)
Well, if there were no DLSS or RT, I wouldn't replace my 1080 Ti with a 3070, for example, but we have to keep in mind that the 1080 Ti was a $700 card while the 3070 was a $500 card. So two gens for 40% more rasterization performance, a cheaper price point, lower consumption and RT + DLSS isn't bad at all in my opinion.

Just to give you an example though, my tuned 1080 Ti (watercooled) was basically half (and sometimes less than that!!) the performance of my 3090, while the former was hitting 250 W power consumption and the latter ~400 W. It's an okay improvement in terms of performance / watt.
 
Joined
Jan 14, 2019
Messages
10,988 (5.38/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Well, if there were no DLSS or RT, I wouldn't replace my 1080 Ti with a 3070, for example, but we have to keep in mind that the 1080 Ti was a $700 card while the 3070 was a $500 card. So two gens for 40% more rasterization performance, a cheaper price point, lower consumption and RT + DLSS isn't bad at all in my opinion.
I would agree if the 3070 wasn't a $500 card only on paper. And again, we're talking about two generations here. There is a lot less improvement coming from Turing (basically nothing).

Ampere should have been named "Turing Refresh" in my opinion.
 
Joined
Mar 21, 2016
Messages
2,411 (0.79/day)
Features and VRAM have little to do with the whole card's power consumption, as long as you have the same number of RAM chips of the same kind. Wait, the 1060 actually has more than the 960. ;)


Then there should be no reason for nvidia and AMD to shoot their desktop cards' TDPs through the roof. Sure, the extra 5% will convince people who still believe that 5% is visible while one's focus is on the game, but it also portrays their products as inefficient pieces of garbage that need gigantic coolers to run at acceptable temperatures... unless the same "all for 5%" people also believe that bigger is always better.


I guess I'm in the minority with my love for small form factor / passively cooled hardware. I'm happier to see a modern game run on an iGPU or old / low profile PC than to see a hundred core CPU with a 3090 in action.


I hope you're right.


Actually, I think we live in a time when it makes perfect sense. Low-power cards need less electricity, which isn't only good for the green movement, but also counteracts rising energy prices. They also need smaller heatsinks, which are cheaper to manufacture. Nvidia / AMD shouldn't blame the high price of their products on resource costs when they themselves design them to need bigger coolers and beefier VRMs than they practically should. I mean, sure, copper and aluminium are expensive, but who said you must have 5 kg of them on your graphics card when, with a little (factory) tweaking, it would work just fine with a lot less? Also, having fewer fans (or no fans) on your graphics card significantly decreases the chance of failure, which also decreases the amount of e-waste on the planet.


Then maybe generations shouldn't come so soon after each other, either. At least to me, it doesn't make any sense to release a product that's barely better than the last one.


True. Though x80 Ti cards have always been halo products. It's just that nvidia decided to change that to x90 with Ampere, or decided to release several halo products within the same generation (whichever way one looks at it). Also, you're comparing across two generations again. If I saw the same improvement coming from Turing, I'd say it's great.

Not that I'm complaining, though. Like I said, with baby steps like these, I can probably keep my 2070 for a long time. :)
I agree with most of this, though I feel a tick/tock cadence alternating between performance and optimization architectures is the way to go. Keep a ceiling threshold in mind for the performance architecture and don't budge beyond it; during the optimization architecture, bump up efficiency while scaling down the power draw and reducing the cost of the previous generation. In particular, scaling down the highest-tier SKUs should be possible thanks to advancements in different areas like VRAM speeds and capacities, along with node transitions and refinement, plus better yields on the node itself over time. I think racing to chase performance beyond a very unfavorable tipping point, at the expense of power draw, yields and added material costs, is a clear mistake that hurts the majority of consumers and the environment in the process. Say what you will, but I don't consider it very wise to feed said wolf.
 
Joined
May 31, 2016
Messages
4,412 (1.47/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Well, if there were no DLSS or RT, I wouldn't replace my 1080 Ti with a 3070, for example, but we have to keep in mind that the 1080 Ti was a $700 card while the 3070 was a $500 card. So two gens for 40% more rasterization performance, a cheaper price point, lower consumption and RT + DLSS isn't bad at all in my opinion.

Just to give you an example though, my tuned 1080 Ti (watercooled) was basically half (and sometimes less than that!!) the performance of my 3090, while the former was hitting 250 W power consumption and the latter ~400 W. It's an okay improvement in terms of performance / watt.
So you want to tell me that if you had a GPU that was 10x faster than a 1080 Ti while consuming 2,000 W of power, that would have been a great achievement and an improvement because of performance / watt?
Because your point of view strongly suggests exactly that.
 
Joined
May 8, 2021
Messages
1,978 (1.65/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
https://tpucdn.com/review/msi-rx-580-mech-2/images/power_average.png

TechPowerUp shows the 580 at 198 W and the 1060 at 116 W.
Bruh, that's not that model.


The difference is huge. I know because I bought a 1060 specifically for that reason: it had way lower power consumption. Both cards are too slow for the RAM to make any difference, but whatever.
lol at too slow. More VRAM means better textures and anisotropic filtering now. In the long term, it also means that your card doesn't start to stutter sooner than it should. VRAM is a good thing to have, and as much of it as you can get.

BTW, your signature shows that you have a 3090, not a 1060. It's also a very "efficient" card.

As I've said, people vote with their wallets. Nvidia offered you almost twice the efficiency and you ignored it, so yeah, it makes sense they got the memo that you don't care about efficiency.
At twice the price, with less VRAM, and while scamming people with that DOA 3 GB model. All of that says that they don't give a shit about opinions or the actual demand for products, and feel perfectly fine scamming people. Power consumption for them is not a priority; it's just a side effect of a cut-down core, that's all. And what I do isn't really important when the 1060 outsold the RX 580, RX 570, RX 480 and RX 470 combined. nVidia got the memo that people buy what is heavily advertised and don't care too much about anything else. If they actually "got the memo" about power consumption, how come the 2060 consumes 30 watts more, and the 3060 basically the same? Also, nVidia hasn't mentioned a single thing about lower power usage or efficiency on their RTX 3060 page. That really shows how much they care about it. Probably a lot more than me with my 100-watt RX 580. Their own page keeps yapping about some AI shit, ray tracing, performance, creativity, drivers; meanwhile, people who buy xx60-tier cards mostly care about value, and by value I mean the fps/dollar ratio, which nV doesn't even mention. That also really shows that they don't have their heads in their arses. At least Polaris was advertised on that, demoed against the 900 series, and beat them soundly in power efficiency and value for gamers. We all know the refresh was garbage, but nothing an easy vBIOS mod wouldn't fix.
 
Joined
Jun 14, 2020
Messages
3,275 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Bruh, that's not that model.
Doesn't matter, I'm looking at the reference numbers. Are you suggesting there is an RX 580 that consumes 130 watts? LOL

lol at too slow. More VRAM means better textures and anisotropic filtering now. In the long term, it also means that your card doesn't start to stutter sooner than it should. VRAM is a good thing to have, and as much of it as you can get.
Not true. VRAM, same as actual system RAM, is absolutely useless right up until you run out of it. For the calibre of a 1060, 6 GB is enough. I don't think you are going to be playing at anything over 1080p resolution, are you?

At twice the price
The 1060 is not twice the price of an RX 580. I'm sorry, but that is just lying. By the time the RX 580 launched, you could easily find a 1060 for 250 or less. I know because I bought two, an Asus Dual and an Nvidia Founders Edition. You voted with your wallet: you don't mind your GPU drawing twice the power for the same performance as a tradeoff for VRAM. So now you can't complain about increasing power consumption on GPUs.
 
Joined
Jun 14, 2020
Messages
3,275 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
So you want to tell me that if you had a GPU that was 10x faster than a 1080 Ti while consuming 2,000 W of power, that would have been a great achievement and an improvement because of performance / watt?
Because your point of view strongly suggests exactly that.
It's not my view, it's math. Yes, a card that consumes 8 times as much power but performs 10 times as fast IS an improvement in efficiency. It will consume 20% less energy for the same workload. Meaning, I'll render a scene in 1 hour consuming 2 kWh, while the 1080 Ti will render it in 10 hours consuming 2.5 kWh. How is that not an improvement? LOL
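Spelled out, with the same hypothetical numbers as above:

Code:
# Energy = power x time, using the hypothetical numbers from the post:
# a 2,000 W card that finishes in 1 h vs a 250 W 1080 Ti that needs 10 h.
def energy_kwh(power_w: float, hours: float) -> float:
    return power_w * hours / 1000

fast_card = energy_kwh(2000, 1)    # 2.0 kWh for the whole render
gtx_1080_ti = energy_kwh(250, 10)  # 2.5 kWh for the same scene

print(f"Fast card: {fast_card} kWh vs 1080 Ti: {gtx_1080_ti} kWh")
print(f"Energy saved: {1 - fast_card / gtx_1080_ti:.0%}")  # -> 20%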
 
Joined
May 8, 2021
Messages
1,978 (1.65/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Doesn't matter, I'm looking at the reference numbers.
There has never been a reference RX 580 anywhere.

Are you suggesting there is an RX 580 that consumes 130 watts? LOL
It is task-dependent, because different tasks don't use up all core components and therefore don't require full core amperage. Power draw is measured in watts, which are volts * amps.
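A minimal sketch of that relationship, with made-up illustrative numbers rather than measurements of any specific card:

Code:
# P = V * I: watts are volts times amps. Illustrative numbers only;
# a lighter task draws less current at the same core voltage, so the
# same card can sit well below the wattage a worst-case review shows.
def power_w(volts: float, amps: float) -> float:
    return volts * amps

print(power_w(volts=1.15, amps=160))  # heavy load:   ~184 W
print(power_w(volts=1.15, amps=110))  # lighter load: ~127 W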

Not true. VRAM, same as actual system RAM, is absolutely useless right up until you run out of it. For the calibre of a 1060, 6 GB is enough. I don't think you are going to be playing at anything over 1080p resolution, are you?
I'm running games at 1440p with my RX 580. I'm damn sure I can turn on slightly better textures than you can with your 1060 6GB.


The 1060 is not twice the price of an RX 580. I'm sorry, but that is just lying. By the time the RX 580 launched, you could easily find a 1060 for 250 or less. I know because I bought two, an Asus Dual and an Nvidia Founders Edition.
Cool. The Founders Edition was never available in Lithuania; no retailer here ever carries founders cards from nV, only if another brand releases a founders-style card. Something like an Asus GTX 1060 Turbo (fictional example, I don't think there was a 1060 with a blower). I also never claimed that the 1060 was twice the price, I only said that it was a 300 EUR card. Maybe on a lucky day, with a lethargic cooler, it was 280 EUR, but no less than that. So wtf are you claiming with "I'm sorry, but that is just lying"? That you lie to yourself? Again, you don't mention the currency of "250"; if it's in USD, then it's totally pointless to me, since there are import taxes, VAT and customs. And yeah, that's a nice flex, mr. moneybags, you can have your two 1060s and bugger off.
 
Joined
Jun 14, 2020
Messages
3,275 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
There has never been a reference RX 580 anywhere.


It is task-dependent, because different tasks don't use up all core components and therefore don't require full core amperage. Power draw is measured in watts, which are volts * amps.


I'm running games at 1440p with my RX 580. I'm damn sure I can turn on slightly better textures than you can with your 1060 6GB.



Cool. The Founders Edition was never available in Lithuania; no retailer here ever carries founders cards from nV, only if another brand releases a founders-style card. Something like an Asus GTX 1060 Turbo (fictional example, I don't think there was a 1060 with a blower). I also never claimed that the 1060 was twice the price, I only said that it was a 300 EUR card. Maybe on a lucky day, with a lethargic cooler, it was 280 EUR, but no less than that. So wtf are you claiming with "I'm sorry, but that is just lying"? That you lie to yourself? Again, you don't mention the currency of "250"; if it's in USD, then it's totally pointless to me, since there are import taxes, VAT and customs. And yeah, that's a nice flex, mr. moneybags, you can have your two 1060s and bugger off.
The review has a reference 580, that's where I quoted the consumption from.

Yes, different tasks require different wattages, but that applies to both cards. So when an RX 580 consumes 130 W, the 1060 will consume 70 W. The difference will still be there.

I'm talking about euros. Why would I be flexing with 1060s, lol, I had two PCs so I bought two cards. And yes, you mentioned twice the price in your previous comment, so I assumed you were talking about the 1060. As far as I can remember, the 1060 was around 280 to 300 on release, but by the time you bought the 580 (which has to be at least a year later, since it was released a year later), the prices were much lower. I bought my Asus Dual for 234€ and my FE for 250€.
 
Joined
Mar 21, 2016
Messages
2,411 (0.79/day)
On the one hand you're arguing about the RX 580 vs the 1060 on power draw, then you mention the 1080 Ti vs the RTX 3090 on power draw, and it's in no way, shape or form similar or the same. It's literally a 150 W difference with the latter pair, and as per AMD's and Nvidia's own stated TDPs, a 65 W difference between the former. So basically it's over double on the halo-tier cards; no real shocker there, power rise with performance isn't very linear.
 
Joined
May 31, 2016
Messages
4,412 (1.47/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
It's not my view, it's math. Yes, a card that consumes 8 times as much power but performs 10 times as fast IS an improvement in efficiency. It will consume 20% less energy for the same workload. Meaning, I'll render a scene in 1 hour consuming 2 kWh, while the 1080 Ti will render it in 10 hours consuming 2.5 kWh. How is that not an improvement? LOL
We all know how the math works, but realistically, that card would have been a total disappointment despite any math. There is a fine line that should not be crossed.
This situation correlates with bad architecture optimization and advancement. It also shows how manipulative math can be in proving a point that is totally out of any sort of reasonable perspective.
 
Joined
Jun 14, 2020
Messages
3,275 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
We all know how the math works, but realistically, that card would have been a total disappointment despite any math.
No, it certainly wouldn't. All 3D artists, or whoever else cares about efficiency, would instantly jump on that card. You need to realize that power consumption on its own is completely and utterly irrelevant. What determines efficiency is the work done for said consumption. It's not even just graphics cards. My AC? Yeah, I wish it consumed 50 times as much but produced 60 times more heat. That way I could run it for 1 minute instead of 1 hour and still save electricity, since it's more efficient.
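The arithmetic behind that hypothetical AC, with an assumed 1 kW baseline unit (the 1 kW figure is made up for illustration):

Code:
# Hypothetical AC from the post: 50x the draw, 60x the heat output,
# so the same job takes 1/60th of the time. Baseline figures assumed.
baseline_w, baseline_min = 1000, 60       # ordinary unit: 1 kWh per job
beefy_w, beefy_min = 1000 * 50, 60 / 60   # beefy unit: 1 minute per job

baseline_wh = baseline_w * baseline_min / 60  # 1000 Wh
beefy_wh = beefy_w * beefy_min / 60           # ~833 Wh

print(f"Ordinary: {baseline_wh:.0f} Wh, beefy: {beefy_wh:.0f} Wh per job")
# The 50x unit uses ~17% less energy (50/60) for the same heating job.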
 
Joined
Mar 21, 2016
Messages
2,411 (0.79/day)
Just a reminder: this thread is about the Intel chip being a power-hungry pig, but this slightly off-topic popcorn discussion is still relevant to the overarching matter of power consumption and ridiculous performance-chasing at the expense of massive power inefficiency. GPUs were rightfully called out on the issue as well, but this is still about the 12900KS being the Pentium 4/Bulldozer power-hog CPU turd of 2022.
 
Joined
May 8, 2021
Messages
1,978 (1.65/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
The review has a reference 580, that's where I quoted the consumption from.
Which does not exist. Not a single RX 580 with reference cooling was ever sold. The reference RX 580 is more like a concept. The same goes for the whole RX 500 series.

Yes, different tasks require different wattages, but that applies to both cards. So when an RX 580 consumes 130 W, the 1060 will consume 70 W. The difference will still be there.
Maybe, but since they are on different architectures, it wouldn't scale the way you say.

I'm talking about euros. Why would I be flexing with 1060s, lol, I had two PCs so I bought two cards. And yes, you mentioned twice the price in your previous comment, so I assumed you were talking about the 1060. As far as I can remember, the 1060 was around 280 to 300 on release, but by the time you bought the 580 (which has to be at least a year later, since it was released a year later), the prices were much lower. I bought my Asus Dual for 234€ and my FE for 250€.
Cool, but that doesn't change anything about prices here in Lithuania. Just for reference, they are selling the RX 580 new for over 800 EUR right now:

Still over 200 EUR for a low-end 1050 Ti:

The only deal there is the RX 6600:

So, please, don't speak of Europe as a whole when you don't know shit about its regions and regional commerce. And no, those prices are quite normal for other shops too.
 
Joined
May 31, 2016
Messages
4,412 (1.47/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
No, it certainly wouldn't. All 3D artists, or whoever else cares about efficiency, would instantly jump on that card. You need to realize that power consumption on its own is completely and utterly irrelevant. What determines efficiency is the work done for said consumption. It's not even just graphics cards. My AC? Yeah, I wish it consumed 50 times as much but produced 60 times more heat. That way I could run it for 1 minute instead of 1 hour and still save electricity, since it's more efficient.
Too bad we don't have those kinds of cards. I wonder why? I'm sure it's not because AMD or Nvidia couldn't build one with chiplet tech.
 
Joined
Jun 14, 2020
Messages
3,275 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Too bad we don't have those kinds of cards. I wonder why? I'm sure it's not because AMD or Nvidia couldn't build one with chiplet tech.
We have the 4090, which supposedly can hit 600 watts. The reason we don't have 2,000-watt cards is practical: such a card would be huge, wouldn't fit in any normal case, and wouldn't be easy to cool. It's definitely not because of efficiency. It's the same thing with CPUs: you cannot cool a die through an IHS at 400 watts, which is why even the 3990X stops at a 280 W TDP. If we find a way to move 400+ watts off of a tiny surface area like a CPU die, then there will be 400-watt CPUs.
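A rough sense of why die area matters, using assumed round-number die sizes rather than exact figures:

Code:
# Heat flux = power / die area, with rough illustrative die sizes.
# The same 400 W spread over a big GPU die is far easier to cool
# than 400 W concentrated on a small CPU die under an IHS.
dies_mm2 = {
    "desktop CPU die (~200 mm^2, assumed)": 200,
    "big GPU die (~600 mm^2, assumed)": 600,
}
power_w = 400
for name, area in dies_mm2.items():
    print(f"{name}: {power_w / area:.1f} W/mm^2")  # ~2.0 vs ~0.7 W/mm^2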
 

MerrHeLL

New Member
Joined
Apr 6, 2022
Messages
6 (0.01/day)
Fast, hot, expensive. Don't need it. Want it anyway.
I'm kinda there too... I love the fastest! But my 5950X crunches through everything on 1/3 to 1/2 the electricity, depending on the app... and it makes far less than 1/4 the heat. 442 watts on a stress test! That's nuts.

We have the 4090, which supposedly can hit 600 watts. The reason we don't have 2,000-watt cards is practical: such a card would be huge, wouldn't fit in any normal case, and wouldn't be easy to cool. It's definitely not because of efficiency. It's the same thing with CPUs: you cannot cool a die through an IHS at 400 watts, which is why even the 3990X stops at a 280 W TDP. If we find a way to move 400+ watts off of a tiny surface area like a CPU die, then there will be 400-watt CPUs.
2,000 watts would pop the breaker on most 15-amp household circuits. My crypto rigs tell me so.
 
Joined
Jun 14, 2020
Messages
3,275 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
2,000 watts would pop the breaker on most 15-amp household circuits. My crypto rigs tell me so.
Not in Europe, but yeah, I get you. As I've said, practical reasons.
 

MerrHeLL

New Member
Joined
Apr 6, 2022
Messages
6 (0.01/day)
Not in Europe, but yeah, I get you. As I've said, practical reasons.
Standard rooms in the US are usually only on 15-amp circuits. We have 20- and 30-amp circuits, but only if you ask for them when building, or specify/change them later. Type F at 16 amps would be fine at 230 V (2,000 W is only ~8.7 A there), but on a 120 V US circuit, 2,000 watts is 16.7 amps... then add in the rest of the system, the CPU, monitors... well over 20 amps.
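The breaker math, for reference (hypothetical 2,000 W card, GPU alone):

Code:
# I = P / V: current drawn by a hypothetical 2,000 W card by itself.
def amps(watts: float, volts: float) -> float:
    return watts / volts

print(f"US 120 V:     {amps(2000, 120):.1f} A")  # ~16.7 A, over a 15 A breaker
print(f"Europe 230 V: {amps(2000, 230):.1f} A")  # ~8.7 A, fine on a 16 A circuit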
 
Joined
Jan 14, 2019
Messages
10,988 (5.38/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
No, it certainly wouldn't. All 3D artists, or whoever else cares about efficiency, would instantly jump on that card. You need to realize that power consumption on its own is completely and utterly irrelevant. What determines efficiency is the work done for said consumption. It's not even just graphics cards. My AC? Yeah, I wish it consumed 50 times as much but produced 60 times more heat. That way I could run it for 1 minute instead of 1 hour and still save electricity, since it's more efficient.
Who said only 3D artists care about efficiency?
 
Joined
May 31, 2016
Messages
4,412 (1.47/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
We have the 4090 that supposedly can hit 600watts. The reason we don't have 2000 watts cards is practical. It will be huge, won't fit in any normal case, and won't be easy to cool. It's definitely not because of the efficiency. It's the same thing with the CPU's, you cannot cool a die through an IHS at 400 watts, that's why even the 3990x stops at 280w tdp. If we can find a way to cool 400+ watts off of a tiny surface area like a CPU die then there will be 400 watts CPUs.
We have? Where? I don't think we do as of now. Either way, just because NV is going to release one doesn't mean it should be that way. You can't blindly praise whatever companies release and say that from now on it will be like that. What they are doing now is raising the power usage to achieve better performance, among other slight improvements. It would seem they have stopped trying to make cards better. No huge improvements; they shrink nodes claiming 10-25% better power usage, but you can't see it. That is supposed to be gen over gen. They pursue performance so badly that they don't even try to make a decent card. So what if the 4090 is faster, if it uses a shit ton more power? If you crank it down to 250 W, how much faster will it really be? I hope someone will have a chance to check that out at some point.
We don't have those kinds of cards because at such power usage, efficiency doesn't matter. If that is the case, what matters most? I guess the power the card draws, which in this case would have been atrocious. Looking at card releases, gen over gen, the cards use a lot of power and you say it's fine as long as they are more efficient. It would seem there is a fine line for how much power one GPU can draw; for me it starts at 400 W, and I'm not going to praise any card for efficiency if the power it uses goes through the roof.
Try harder, NV and AMD.
 
Last edited:
Joined
Jan 14, 2019
Messages
10,988 (5.38/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
We have? Where? I don't think we do as of now. Either way, just because NV is going to release one doesn't mean it should be that way. You can't blindly praise whatever companies release and say that from now on it will be like that. What they are doing now is raising the power usage to achieve better performance, among other slight improvements. It would seem they have stopped trying to make cards better. No huge improvements; they shrink nodes claiming 10-25% better power usage, but you can't see it. That is supposed to be gen over gen. They pursue performance so badly that they don't even try to make a decent card. So what if the 4090 is faster, if it uses a shit ton more power? If you crank it down to 250 W, how much faster will it really be? I hope someone will have a chance to check that out at some point.
We don't have those kinds of cards because at such power usage, efficiency doesn't matter. If that is the case, what matters most? I guess the power the card draws, which in this case would have been atrocious. Looking at card releases, gen over gen, the cards use a lot of power and you say it's fine as long as they are more efficient. It would seem there is a fine line for how much power one GPU can draw; for me it starts at 400 W, and I'm not going to praise any card for efficiency if the power it uses goes through the roof.
Try harder, NV and AMD.
I'm more radical than that - my line (currently) is at 250 W. My Seasonic Prime Ultra Platinum 550 W is the best PSU I've ever had, and it's still well within warranty thanks to Seasonic's amazing warranty policy (10 years for Focus, 12 for Prime). It wasn't too cheap, either, so I'd rather not buy another one just because Intel, nvidia and AMD decided to go balls to the wall with performance. 200 W for the CPU, 250 W for the GPU and 100 W for everything else, including a little safety headroom, should be enough.

I'm quite radical with PC size, too. I love mini-ITX systems, even though I have a micro-ATX one at the moment, which is the biggest size I'm happy with. Full towers with lots of unused expansion slots are a thing of the past, imo. I also don't like how they look. I know, one needs the space for airflow, but all the void inside makes the case look empty and bigger than it really should be - kind of the same way people buy SUVs for going to the supermarket once a week. My PL-unlocked i7-11700 and RTX 2070 are kind of the maximum of what I can comfortably cool in this size.
 
Joined
Jun 14, 2020
Messages
3,275 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
We have? Where? I don't think we do as of now. Either way, just because NV is going to release one doesn't mean it should be that way. You can't blindly praise whatever companies release and say that from now on it will be like that. What they are doing now is raising the power usage to achieve better performance, among other slight improvements. It would seem they have stopped trying to make cards better. No huge improvements; they shrink nodes claiming 10-25% better power usage, but you can't see it. That is supposed to be gen over gen. They pursue performance so badly that they don't even try to make a decent card. So what if the 4090 is faster, if it uses a shit ton more power? If you crank it down to 250 W, how much faster will it really be? I hope someone will have a chance to check that out at some point.
We don't have those kinds of cards because at such power usage, efficiency doesn't matter. If that is the case, what matters most? I guess the power the card draws, which in this case would have been atrocious. Looking at card releases, gen over gen, the cards use a lot of power and you say it's fine as long as they are more efficient. It would seem there is a fine line for how much power one GPU can draw; for me it starts at 400 W, and I'm not going to praise any card for efficiency if the power it uses goes through the roof.
Try harder, NV and AMD.
But you are just WRONG. They are not raising the power usage to achieve better performance. Of course it is all rumours at this point, but the 4090 is supposedly almost twice as fast as the 3090. So yeah, a 20% consumption increase for a 100% performance increase is an insane efficiency jump. I don't understand how you do not get this.
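The rumoured numbers as a quick calculation (rumours, not benchmarks or measurements):

Code:
# Rumoured figures from the post above: ~2x the 3090's performance
# at ~1.2x the power. Perf/W gain = performance ratio / power ratio.
perf_ratio = 2.0   # rumour, not a benchmark
power_ratio = 1.2  # rumour, not a measurement

print(f"Perf/W improvement: {perf_ratio / power_ratio:.2f}x")  # ~1.67x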
 