
Intel Core i9-12900K

Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
That's why I mentioned heat transfer efficiency.
But what is "heat transfer efficiency" if not that compound measurement of many different factors that I mentioned? And how do you define it in a way that accounts for variables like hotspot placement? In short: you can't. So you have to compromise in some way.
Seems like the average temperature of the whole IHS would be the best way to do that, while also stating the hotspot temp.
But then you have three numbers: power draw (W), tIHSavg and tIHSpeak. How do you balance the three when designing a cooler? And how do you measure the three at all? With a reference cooler? Without a cooler?
I would argue that they should do it anyway, as OEMs like Dell, HP or Acer have a really poor reputation for overheating machines. They cannot keep making crap forever; at some point it will hurt their sales.
You can argue that all you want, the most important priority for them is simplifying their production lines and system configurations to increase profit margins. You're not going to convince them to invest millions in complex thermal testing regimes. The only effective way of doing this is enforcing this on a component manufacturer level, ideally through either industry body or government standardization.
Watts aren't a problem. If you measure watts and then the chip-to-IHS heat transfer efficiency, what is then left unclear or misleading? That covers odd chips like Ryzens.
Again: "heat transfer efficiency" is an incredibly complex thing, and cannot be reduced to a simple number or measurement without opening the door for a lot of variance. And it doesn't cover odd chip placements unless that number has huge margins built in, in which case it then becomes misleading in the first place.
GN did a video about AMD's TDP and asked Cooler Master about this; they said that TDP doesn't mean much and that it's certainly not clear how capable a cooler they should design.
That's because all CPUs boost past TDP and people expect that performance to last forever. This is down to the idiotic mixed usage of TDP (cooler spec and marketing number), as well as the lack of any TDP-like measure for peak/boost power draws, despite all chips far exceeding TDP. If CM (or anyone else) built a cooler following the TDP spec for 105W, it would be fully capable of running a stock 5950X, but the CPU wouldn't be able to maintain its 141W boost spec over any significant amount of time, instead throttling back to whatever it can maintain within thermal limits at the thermal load the cooler can dissipate - i.e. base clock or a bit more. The issue here is that nobody would call that acceptable - you'd have a CPU running at near 90 degrees under all-core loads and losing significant performance. Most would call that thermal throttling, despite this being inaccurate (it would need to go below base clock for that to be correct), but either way it's easy to see that the cooler is insufficient despite fulfilling the spec. That is why TDP is currently useless for manufacturers, not that the measurement doesn't work but that it doesn't actually cover the desired use cases and behaviours of customers.
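To put rough numbers on that, here's a minimal back-of-the-envelope sketch: the die-to-ambient thermal resistance below is a made-up value for a hypothetical "105 W-class" cooler (not any real product's spec), and the 141 W figure is the boost/PPT budget mentioned above.

```python
# Back-of-the-envelope steady-state estimate: die temperature is roughly
# ambient + package power * total thermal resistance (die to ambient, in K/W).
# The 0.6 K/W figure is a made-up value for a hypothetical "105 W-class"
# cooler, not any real product's spec.
AMBIENT_C = 25.0
R_DIE_TO_AMBIENT = 0.6   # K/W, hypothetical cooler + TIM + IHS stack
T_THROTTLE_C = 90.0      # rough temperature limit before clocks drop

def die_temp_c(power_w: float) -> float:
    """Steady-state die temperature at a given sustained package power."""
    return AMBIENT_C + power_w * R_DIE_TO_AMBIENT

for power_w in (105.0, 141.0):   # TDP spec vs. the 141 W boost (PPT) budget
    temp = die_temp_c(power_w)
    verdict = "fits" if temp <= T_THROTTLE_C else "exceeds the limit, clocks drop"
    print(f"{power_w:.0f} W sustained -> ~{temp:.0f} °C ({verdict})")
```

With those illustrative numbers the cooler just about holds the 105 W spec at around 90 °C, but 141 W sustained simply can't be dissipated, so the chip has to pull power back toward TDP.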
 
Joined
Feb 1, 2019
Messages
3,667 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Just keep in mind the DDR5 used here is 6000.

Computerbase.de shows DDR4-3200 C12 and DDR4-3800 walking all over DDR5-4400. This is not surprising, but I doubt early adopters are going to be running DDR5-4400.

Tom's used DDR5-4800 C36 and DDR4-3200 on Alder Lake and the older platforms, which is a bit more realistic, but they didn't specify the DDR4 settings that I saw. The kit was DDR5-6000 but they set it to one of the more normal speeds.

So far my take is that 'good' DDR4 is faster than DDR5, at least the obtainable DDR5-5200. I think that is not necessarily true of the DDR5-6000+ when it is full speed, but you can't actually buy that stuff.

Yeah, I do usually consider latency more important than bandwidth; I remember my old 3000CL12 config outperforming 3200CL14. Some workloads do benefit a lot from bandwidth though, so as you said, it will depend.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Just keep in mind the DDR5 used here is 6000.

Computerbase.de shows DDR4-3200 C12 and DDR4-3800 walking all over DDR5-4400. This is not surprising, but I doubt early adopters are going to be running DDR5-4400.

Tom's used DDR5-4800 C36 and DDR4-3200 on Alder Lake and the older platforms, which is a bit more realistic, but they didn't specify the DDR4 settings that I saw. The kit was DDR5-6000 but they set it to one of the more normal speeds.

So far my take is that 'good' DDR4 is faster than DDR5, at least the obtainable DDR5-5200. I think that is not necessarily true of the DDR5-6000+ when it is full speed, but you can't actually buy that stuff.
3200c12 is pretty unrealistic though - yes, you can OC there, but is there even a single kit on the market with those timings? It's clear that available DDR5 is slower than available DDR4, simply because available DDR4 is highly mature and DDR5 is not. What becomes clear from more balanced testing like AnandTech's comprehensive testing at JEDEC speeds (3200c20 and 4800c40) is that the latency disadvantage expected from DDR5 is much smaller in practice than the numbers would seem to indicate (likely down to more channels and other differences in how data is transferred), and that at those settings - both of which are bad, but of which the DDR5 settings ought to be worse - DDR5 mostly outperforms DDR4 by a slim margin.

That still means that fast DDR4 will be faster until we get fast(er) DDR5 on the market, but it also means that we won't need DDR5-8000c36 to match the performance of DDR4-4000c18.
 
Joined
Jan 27, 2015
Messages
1,746 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
3200c12 is pretty unrealistic though - yes, you can OC there, but is there even a single kit on the market with those timings? It's clear that available DDR5 is slower than available DDR4, simply because available DDR4 is highly mature and DDR5 is not. What becomes clear from more balanced testing like AnandTech's comprehensive testing at JEDEC speeds (3200c20 and 4800c40) is that the latency disadvantage expected from DDR5 is much smaller in practice than the numbers would seem to indicate (likely down to more channels and other differences in how data is transferred), and that at those settings - both of which are bad, but of which the DDR5 settings ought to be worse - DDR5 mostly outperforms DDR4 by a slim margin.

That still means that fast DDR4 will be faster until we get fast(er) DDR5 on the market, but it also means that we won't need DDR5-8000c36 to match the performance of DDR4-4000c18.

Latency is just one factor; specifically, CL is how long (in clock cycles, not time) it takes for the first word of a read to be available on the output pins of the memory. After that, the first number (3200, 4400, 4800, etc.) is how fast the data is transferred.

I think in general, for 'normal' applications, high MT/s (like 5200) is better, while for games lower latency is better. There are plenty of exceptions, especially when you get into the 'scientific' side of 'applications', but for normal user apps I think high MT/s helps.

So just to note, here at TPU they used DDR5-6000 C36 in Gear 2 (1:2 ratio). This is some freaky fast DDR5 for now, probably more reflective of what will be available in 1H 2022. The DDR4 used on the older platforms is quite good too, though: DDR4-3600 C16-20-20-34 1T in Gear 1 and 1:1 IF for AMD is no slouch. I think these put the older platforms pretty close to their best footing, at settings that 90% of folks can get to run properly.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Latency is just one factor; specifically, CL is how long (in clock cycles, not time) it takes for the first word of a read to be available on the output pins of the memory. After that, the first number (3200, 4400, 4800, etc.) is how fast the data is transferred.

I think in general, for 'normal' applications, high MT/s (like 5200) is better, while for games lower latency is better. There are plenty of exceptions, especially when you get into the 'scientific' side of 'applications', but for normal user apps I think high MT/s helps.

So just to note, here at TPU they used DDR5-6000 C36 in Gear 2 (1:2 ratio). This is some freaky fast DDR5 for now, probably more reflective of what will be available in 1H 2022. The DDR4 used on the older platforms is quite good too, though: DDR4-3600 C16-20-20-34 1T in Gear 1 and 1:1 IF for AMD is no slouch. I think these put the older platforms pretty close to their best footing, at settings that 90% of folks can get to run properly.
Uhm ... what, exactly, in my post gave you the impression that you needed to (rather poorly, IMO) explain the difference between RAM transfer rates and timings to me? And even if this was necessary (which it really wasn't), how does this change anything I said?

Your assumption is also wrong: Most consumer applications are more memory latency sensitive than bandwidth sensitive, generally, though there are obviously exceptions. That's why something like 3200c12 can perform as well as much higher clocked memory with worse latencies. Games are more latency sensitive than most applications, but there are very few realistic consumer applications where memory bandwidth is more important than latency. (iGPU gaming is the one key use case where bandwidth is king outside of server applications, which generally love bandwidth - hence why this is the focus for DDR5, which is largely designed to align with server and datacenter owners' desires.)

And while DDR5-6000 C36 might be fast for now (it's 6 clock cycles faster than the JEDEC 6000A spec, though "freaky fast" is hardly suitable IMO), it is slow compared to the expected speeds of DDR5 in the coming years. That's why I was talking about mature vs. immature tech. DDR5 JEDEC specifications currently go to DDR5-6400, with standards for 8400 in the works. For reference, the absolute highest DDR4 JEDEC specification is 3200. That means we haven't even seen the tip of the iceberg yet of DDR5 speed. So, again, even DDR5-6000c36 is a poor comparison to something like DDR4-3600c16, as one is below even the highest current JEDEC spec (let alone future ones), while the other is faster than the highest JEDEC spec several years into its life cycle.

The comment you responded to was mainly pointing out that the comparison you were talking about from Computerbase.de is deeply flawed, as it compares one highly tuned DDR4 kit to a near-base-spec DDR5 kit. The DDR4 equivalent of DDR5-4400 would be something like DDR4-2133 or 2400. Also, the Computerbase DDR5-4400 timings are JEDEC 4400A timings, at c32. That is a theoretical minimum first-word latency of 14.55 ns compared to 7.37 ns for DDR4-3800c14. You see how that comparison is extremely skewed? Expecting anything but the DDR4 kits winning in those scenarios would be crazy. So, as I said, mature, low-latency, high-speed DDR4 will obviously be faster, especially in (mostly) latency-sensitive consumer workloads. What more nuanced reviews show, such as AnandTech's more equal comparison (both at JEDEC speed), is that the expected latency disadvantage of DDR5 is much smaller than has been speculated.
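For reference, the arithmetic behind those figures is simply CL cycles at a memory clock of half the transfer rate, i.e. 2000 × CL / (MT/s) nanoseconds. A quick sketch, applied to the kits discussed in this thread:

```python
def first_word_latency_ns(mt_per_s: int, cas_latency: int) -> float:
    """CAS latency in ns: CL cycles at a clock of (MT/s)/2, i.e. 2000*CL/MT/s."""
    return 2000.0 * cas_latency / mt_per_s

kits = [("DDR5-4400 C32", 4400, 32),   # Computerbase's near-JEDEC DDR5 kit
        ("DDR4-3800 C14", 3800, 14),   # the tuned DDR4 kit it was compared to
        ("DDR5-6000 C36", 6000, 36),   # TPU's Alder Lake test kit
        ("DDR4-3600 C16", 3600, 16)]   # TPU's DDR4 config on older platforms
for name, mt_s, cl in kits:
    print(f"{name}: {first_word_latency_ns(mt_s, cl):.2f} ns")
# DDR5-4400 C32: 14.55 ns, DDR4-3800 C14: 7.37 ns,
# DDR5-6000 C36: 12.00 ns, DDR4-3600 C16: 8.89 ns
```

This only covers the CAS portion, of course; real loaded latency also depends on the rest of the timings and the memory controller, which is part of why DDR5 fares better in practice than this one number suggests.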
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
But then you have three numbers: power draw (W), tIHSavg and tIHSpeak. How do you balance the three when designing a cooler? And how do you measure the three at all? With a reference cooler? Without a cooler?
Cooler makers should only specify what they can dissipate. You as a consumer would buy a chip and calculate what cooler you need from the TDP (fixed) and efficiency. That's all. You as a consumer are free to accommodate the peak or not.

You can argue that all you want, the most important priority for them is simplifying their production lines and system configurations to increase profit margins. You're not going to convince them to invest millions in complex thermal testing regimes. The only effective way of doing this is enforcing this on a component manufacturer level, ideally through either industry body or government standardization.
It's not that expensive to determine what coolers they would need, and the savings on metals will quickly outweigh modest R&D costs.

Again: "heat transfer efficiency" is an incredibly complex thing, and cannot be reduced to a simple number or measurement without opening the door for a lot of variance. And it doesn't cover odd chip placements unless that number has huge margins built in, in which case it then becomes misleading in the first place.
I don't see that happening tbh.

That's because all CPUs boost past TDP and people expect that performance to last forever. This is down to the idiotic mixed usage of TDP (cooler spec and marketing number), as well as the lack of any TDP-like measure for peak/boost power draws, despite all chips far exceeding TDP. If CM (or anyone else) built a cooler following the TDP spec for 105W, it would be fully capable of running a stock 5950X, but the CPU wouldn't be able to maintain its 141W boost spec over any significant amount of time, instead throttling back to whatever it can maintain within thermal limits at the thermal load the cooler can dissipate - i.e. base clock or a bit more. The issue here is that nobody would call that acceptable - you'd have a CPU running at near 90 degrees under all-core loads and losing significant performance. Most would call that thermal throttling, despite this being inaccurate (it would need to go below base clock for that to be correct), but either way it's easy to see that the cooler is insufficient despite fulfilling the spec. That is why TDP is currently useless for manufacturers, not that the measurement doesn't work but that it doesn't actually cover the desired use cases and behaviours of customers.
Might as well educate buyers that boost is not guaranteed, but Intel has been doing it for at least a decade and it ended up this way. Perhaps new Alder Lake measurements just make sense.
 
Joined
Apr 30, 2011
Messages
2,716 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
@W1zzard, did you enable SAM in the AM4 board's UEFI when testing? I am almost sure Intel doesn't support it as much.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
This will, no doubt, finish AMD not.

I'm not even convinced AMD needs to change pricing on any of its products.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
This will, no doubt, finish AMD not.

I'm not even convinced AMD needs to change pricing on any of its products.
That's kinda obvious, but techtube says otherwise and many people listen to them.
 
Joined
May 24, 2007
Messages
5,433 (0.85/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 Gb - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 Gb
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
Anyone know where Intel is fabbing these?
 
D

Deleted member 215115

Guest
Yeah because no other cpu has been worth upgrading to over that old cpu. lol get real. And a solid 144 is important to you yet you held on this long to a cpu that cant even maintain 60 fps in some games.
This is why you never bother with these low-spec 4K60 peasants. Their statements are so dumb that they could actually work as bait. As soon as you see their 4K monitor and their 10 y/o CPU paired with a 2080 you just know they live in fantasy land. They are so deluded that they lose basic understanding of how tech works. They actually believe that their CPU is still good enough and nothing you do or say will change their mind because keeping that CPU for so long is the only meaningful achievement in their life so they have to defend it. It's like arguing with women. Just don't do it. Total waste of time, especially since staff and other members will always defend them for some reason.

On topic: The 12900K is great & efficient, hail Intel, AMD sucks, yadda yadda yadda. Gonna go buy an i9 right now and keep it for 15 years.
 
Joined
Nov 4, 2005
Messages
12,014 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Joined
Apr 21, 2010
Messages
578 (0.11/day)
System Name Home PC
Processor Ryzen 5900X
Motherboard Asus Prime X370 Pro
Cooling Thermaltake Contac Silent 12
Memory 2x8gb F4-3200C16-8GVKB - 2x16gb F4-3200C16-16GVK
Video Card(s) XFX RX480 GTR
Storage Samsung SSD Evo 120GB -WD SN580 1TB - Toshiba 2TB HDWT720 - 1TB GIGABYTE GP-GSTFS31100TNTD
Display(s) Cooler Master GA271 and AoC 931wx (19in, 1680x1050)
Case Green Magnum Evo
Power Supply Green 650UK Plus
Mouse Green GM602-RGB ( copy of Aula F810 )
Keyboard Old 12 years FOCUS FK-8100
From a business view, looking at die size, Intel's new arch costs more to make than Zen 3; therefore Intel had to choose a path with less profit than Zen 3.
 
Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
At least this time they have the balls to admit that these chips can use nearly 300 watts of power. Not sure about you, but I treat TDP or base power as the maximum expected power usage at base clocks without any turbo. But I tested my own i5, and in Prime95 small FFTs it uses less than 65 watts (I think it was up to 50 watts) with turbo off, so I guess any power number that Intel or AMD releases doesn't mean anything.
So far, TDP on Intel has meant PL1, that is, the long-term power limit enforced by the motherboard by default. My 11700 can do 2.8 GHz (300 MHz above base clock) in Cinebench all-core while staying within the factory 65 W limit. I'm not sure about Alder Lake, though.
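Roughly speaking, PL1/PL2/Tau means the package can draw up to PL2 until a moving average of power climbs to PL1, after which sustained draw is clamped to PL1. A toy simulation of that behaviour (illustrative values, not Intel's actual firmware algorithm and not any specific SKU's spec):

```python
# Toy model of Intel's PL1/PL2/Tau power limiting (illustrative only): the chip
# may draw up to PL2 while an exponentially weighted moving average of package
# power is still below PL1; once the average reaches PL1, sustained draw is
# clamped to PL1.
import math

PL1_W, PL2_W, TAU_S = 65.0, 150.0, 28.0   # example values, not a real SKU's spec
avg_w = 0.0
alpha = 1.0 - math.exp(-1.0 / TAU_S)      # per-second EWMA weight

for t in range(0, 91):                    # 90 s of an all-core load
    draw_w = PL2_W if avg_w < PL1_W else PL1_W
    avg_w += alpha * (draw_w - avg_w)
    if t % 15 == 0:
        print(f"t={t:2d}s  draw={draw_w:6.1f} W  avg={avg_w:5.1f} W")
```

In steady state the chip runs whatever clocks fit inside PL1, which is how the 11700 above can sit a few hundred MHz over base while still honouring its 65 W limit.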

As for AMD, TDP is nothing more than a recommendation for cooler manufacturers (bull****). It has nothing to do with power.

In the FX 9590 era we called that desperate; in 2021 we call it excellent. "Editor's Choice" and "Highly Recommended".
"Highly recommended"... to slap a huge liquid cooler on it. :D
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,967 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
@W1zzard, did you enable SAM in the AM4 board's UEFI when testing? I am almost sure Intel doesn't support it as much.
Enabled on all platforms, it's supported just fine everywhere
 
Joined
Nov 21, 2010
Messages
2,355 (0.46/day)
Location
Right where I want to be
System Name Miami
Processor Ryzen 3800X
Motherboard Asus Crosshair VII Formula
Cooling Ek Velocity/ 2x 280mm Radiators/ Alphacool fullcover
Memory F4-3600C16Q-32GTZNC
Video Card(s) XFX 6900 XT Speedster 0
Storage 1TB WD M.2 SSD/ 2TB WD SN750/ 4TB WD Black HDD
Display(s) DELL AW3420DW / HP ZR24w
Case Lian Li O11 Dynamic XL
Audio Device(s) EVGA Nu Audio
Power Supply Seasonic Prime Gold 1000W+750W
Mouse Corsair Scimitar/Glorious Model O-
Keyboard Corsair K95 Platinum
Software Windows 10 Pro
Enabled on all platforms, it's supported just fine everywhere

By any chance is Alder Lake ever going to be benched using DDR4?
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
Alder Lake is still vulnerable to attack; what's the performance going to be when they fix it?

All CPUs featuring out-of-order speculative execution are vulnerable to Spectre-class attacks, no matter if they are Intel, AMD, ARM or MIPS.

Each review of new Intel CPUs has seen at least one person blaming Intel for not fixing HW vulnerabilities. It's a sort of tradition nowadays.

A nice overview of affected CPU architectures and their status is on Wikipedia.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
"Highly recommended"... to slap a huge liquid cooler on it. :D
I really wonder why it was given that award. It's more or less the same as recommending the FX 9590, except this time Intel at least has the performance edge; but unlike the FX 9590, the i9 is uncoolable. If a 280 mm AIO and a D15 fail to cool it, then what can? Is the minimum spec for it now a 360 mm AIO or a custom loop? Good one, Intel. I'll wait until their flagship chips need an LN2 pot as the minimum cooler.
 
Joined
Dec 26, 2006
Messages
3,862 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
"the new Socket AM5. An LGA package with 1,718 pins, AM5"

AM5 will have more pins therefore it must be faster ;)
 
Joined
Apr 29, 2018
Messages
129 (0.05/day)
Mind your attitude. :slap::slap:

@Valantar Thanks for the technical explanation, it does sound about right. I knew it was for reasons along these lines and said so in my post.

And yes, it's funny how some people get all personal over a friggin' CPU. :laugh:

And yeah, it's been wondrous for my wallet. Contrary to that immature child above, my CPU does well over 60 fps in all the games I play, even the latest, but it can't reach the magic 144 fps, or even 120 fps in many cases, although the experience is still surprisingly smooth. This thing probably has something like an 80-100% performance increase over my aged CPU, so it will have no trouble at all hitting those highs. Can't wait! :cool:
Can maintain well over 60 fps in the latest games with a 2700K? Thanks for proving just how delusional I thought you were with that asinine claim. You are so full of it that it is laughable. Even my OCed 4700K, which is quite a bit faster, could not maintain 60 fps in some games even 3 years ago, and most certainly had plenty of drops well below 60 fps. Knock yourself out with the last word, as there's no point in fooling with someone like you.
 
Last edited:
Joined
Jun 29, 2018
Messages
542 (0.23/day)
A nice overview of affected CPU architectures and their status is on Wikipedia.
Sadly that's incomplete, missing 7 CVEs from Intel guidance and a few recent microarchitectures.

Edit: Looking closer at the Intel site, it looks like Alder Lake is indeed vulnerable to CVE-2020-24511 and CVE-2020-8698, which Rocket Lake wasn't. Supposedly they're fixed in microcode and hardware respectively, so release BIOSes are most likely safe.
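For anyone wanting to check what their own kernel/microcode combination reports, Linux exposes this through sysfs; a small sketch (Linux-only, no special privileges needed):

```python
# Print the kernel's view of known CPU vulnerabilities and active mitigations
# by reading the standard Linux sysfs interface.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    print(f"{entry.name}: {entry.read_text().strip()}")
```

Each entry reports "Not affected", "Vulnerable", or the mitigation in use for the running CPU and microcode.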
 
Last edited:
Joined
Jan 29, 2021
Messages
1,876 (1.32/day)
Location
Alaska USA
Fair enough advice, I suppose (if you don't need/want a new PC right now, especially in light of ridiculous graphics card prices; though I suspect in this particular case he'll hold on to his 2080 for the time being anyway). However, the problem is that we don't know which SKUs will get the 3D treatment: some indications say only octa-cores and up, some even only the 12- and 16-core, and none mention the 6-core, and the latter is just mind-boggling. Not only have they already pretty much abandoned the sub-$300 class so far, but with the 5600X remaining all they will offer here, they'll lose it completely. Even if they drop its price to $200 (unlikely), there still won't be any competition for the 12600K, especially given that motherboard availability and pricing will only improve with time.
The 5600X looks to be dead in the water if AMD doesn't lower its price.

 
Joined
May 9, 2012
Messages
8,545 (1.85/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb DDR4 3600/8gb DDR3 1600/2gbLPDDR3/8gbLPDDR5x/16gb(10 sys)LPDDR5 6400
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/Radeon 780M 6gb LPDDR5
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/4tb SN850X
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/7" FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/Gorilla Glass Victus 2/front-stock back-JSAUX RGB transparent
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Moondrop Chu II + TRN BT20S
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/USAMS GAN PD 33w/USAMS GAN 100w
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75%/Lofree Edge/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 14/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
Well, reviews are out, although for the wrong contender ...
Nonetheless, after "reviewing the reviews" (errrr ... :laugh: :oops: ), a 5600/5600X will be more profitable for me if I want an upgrade (not that the 3600 is a slouch for my usage); the $300 12600K is, well ... ~$50 more than the cheapest 5600X I can find ...

Add to that the fact that I can keep my mobo ... and avoid Win 11 (for now) and the skittish big.INTEL, errr, I mean big.LITTLE scheduler, and all in all the cons I see in the reviews kinda outweigh the pros...
Kinda ironic: "improved efficiency" / "not as efficient as Zen 3" (not the only one, but the most striking for me).

Wait and see is the right thing to do at the moment (and keep some upgrade path, I guess ... without going full throttle on a new platform).


The 5600X looks to be dead in the water if AMD doesn't lower its price.

Well, that's quite true ... ahhh, whatever ... at least I could be the buyer of a dead-in-the-water CPU: found some 5600X listings at $249, and even cheaper second-hand ('round $149/199). I guess future early adopters are selling; not gonna complain (it will still be cheaper than a new mobo+CPU :ohwell: )

Mmhhh, Intel innovated, I reckon, although not really enough to make the switch again... later maybe, who knows ...
And no, the gap between the two is not abyssal; they retook the crown, but looking at the whole picture (regardless of AMD's next product) the advantage is not that clear (pros and cons taken into account).
The 12600K is above a 5600X, but its consumption is higher too; price-wise it's much the same (not factoring in a new mobo/RAM, of course; well, my previous 6600K was priced around that, as it was $289 at the time I got it).

Always take everything into account before forming a choice/opinion.
 
Joined
Feb 14, 2012
Messages
2,356 (0.50/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
That 4K gaming summary: everything at 95-100%. I'll be using my 9900KS for a long time, because it doesn't matter how fast my Excel is or isn't; it's fast enough.
 