
Intel Core i9-7960X 16-core/32-thread Processor Detailed, Benchmarked

btarunr

Editor & Senior Moderator
Intel is preparing to tackle AMD's first Ryzen Threadripper parts with two Core "Skylake-X" HEDT socket LGA2066 processor launches in quick succession over Q3-2017. The first to come out will be the 12-core/24-thread Core i9-7920X, which will be closely followed by the 16-core/32-thread Core i9-7960X. The company will ultimately end 2017 with the 18-core/36-thread Core i9-7980XE. The i9-7920X, detailed in our older article, could either command a $200 premium over the $999 10-core/20-thread i9-7900X, or displace it to a slightly lower price-point (say, $800). The i9-7960X, however, could retain a premium price-point owing to performance leadership over the Ryzen Threadripper 1950X, if early benchmarks are to be believed.

The Core i9-7960X is endowed with 16 cores, HyperThreading enabling 32 threads, 1 MB of L2 cache per core, and 22 MB of shared L3 cache. It features the chip's full 44-lane PCI-Express gen 3.0 root complex, and a quad-channel DDR4 memory interface. The chip is expected to be clocked even lower than its 12-core sibling, with a nominal clock of a mere 2.50 GHz, and an as-yet-unknown max Turbo Boost frequency. Put through Geekbench 4.1.0, the chip scored 33,672 points in the multi-threaded test, which is higher than the 27,000-ish scores we've been hearing of for the Threadripper 1950X; but its single-thread score of 5,238 pales in comparison to that of the i7-7740X, due to the lower clock speeds and a slightly older micro-architecture.
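To put the two leaked figures in perspective, here is a small back-of-the-envelope check (purely illustrative; the scores are the rumored Geekbench numbers quoted above, and the interpretation in the comments is only an assumption):

# Rough scaling check on the rumored Geekbench 4.1.0 scores (illustrative only)
single = 5238          # reported single-thread score
multi = 33672          # reported multi-thread score
cores, threads = 16, 32

ratio = multi / single
print(f"multi/single ratio: {ratio:.2f}x across {cores} cores / {threads} threads")
# ~6.4x is well below the core count, which is what you would expect when the
# all-core clock sits far below the single-core Turbo Boost clock and the test
# shares memory bandwidth across all threads.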



 
So, definitely fake/old Threadripper scores, and terrible but kind of expected 7960X scores and specs. It may score better after optimizations relative to other CPUs, but with a 2.5 GHz base clock these scores seem about right.

Unless it's powered by fairy dust, I'm buying a 1950X anyway, since a 2.5 GHz base clock, mediocre TIM, and a $1,700 price tag aren't very appealing, and I refuse to delid anything I'm still actively using!
 
One thing is for sure: it will be far more expensive than the 16-core AMD Threadripper. It's still questionable whether it will be faster or not, but even if we assume that it will be, the real differences will be marginal.
At the present time it's not the brightest idea to buy any Intel CPU, except the Pentium G4560 or G4600, which are excellent for their price tag.
 

I doubt Intel can compete with 12+ cores in the desktop and server market.

Intel's offerings seem to clock higher, so as long as the TDP is low enough they can leverage that. That's my reasoning for the conclusion, in short.

Intel's lower offerings such as the 6, 8, and 10-core parts will be the good ones there due to clock speed, and the 12-core+ parts will be a waste, it seems, due to TDP-limited clocks.

AMD just has so much more efficiency up to a certain point (~3.7 GHz).
 
The main problem for AMD is that most consumers are heavily influenced by marketing/propaganda rather than using common sense, which works exclusively in Intel's favor. Even at 4, 6, or 8 cores, AMD is a far better option because you get quite a lot for a reasonable amount of money. For now Intel's CPUs benefit slightly from higher clocks, but even these benefits are marginal, just as is the case with the current i5/i7 and similarly priced Ryzens. I'm sure that in the near future AMD will tune up its chip design, making it suitable for 4+ GHz speeds.

For gaming, Core i9 and Threadripper will be totally unnecessary, a waste of money, although some future games will likely use more than 4 cores. So far there's absolutely no point in buying anything more than a 4C/8T or, at most, 6C/12T CPU for gaming, and it will remain so for at least a couple of years.

The energy consumption argument is a nice thing but also total nonsense, because anyone who aims for an upper-middle-class or high-end CPU shouldn't be concerned about it. It's like someone who has the money to buy a BMW M3 worrying about high fuel consumption - yeah, right...
 
Intel can't compete with AMD on price/performance until they have something resembling Infinity Fabric at a similar production cost. Their present marginal performance advantage, which is already at the ceiling, is also going to get smaller as AMD refines their CPUs even more.
 
Soooooo, is Thermal Design Power factually equal to power consumption these days, or has everyone simply forgotten that the two are different things? I honestly have no idea.
 
Really sad day when people of the Internet and normies with little or no PC knowledge think that TDP = power consumption. That's not how TDP works...
 

My bad. :)

What I should have written instead was "With the system pulling more than 400W from the wall"

Better?
 
That might be the entire system though, but still.

No, this is the wattage tested on my 7900X:
- stock/auto ~1.06 V = ~200 W CPU / 280 W+ system (CPU load only)
- 1.15 V = ~300 W CPU / 380 W+ system (CPU load only)
- 1.20 V = ~340 W CPU / 420 W+ system (CPU load only)
- full CPU load at 1.2 V + single GTX 1080 Ti = 620 W (CPU+GPU load)
measured with a TP-Link Smart Plug
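As a rough sanity check on those readings, dynamic CPU power scales approximately with the square of voltage at a fixed clock. Here is a quick sketch using the figures above (illustrative only; it ignores frequency and leakage changes):

# CPU package power vs. vcore from the readings posted above (illustrative)
readings = {1.06: 200, 1.15: 300, 1.20: 340}   # vcore (V) -> reported CPU watts
base_v, base_w = 1.06, 200
for v, w in sorted(readings.items()):
    predicted = base_w * (v / base_v) ** 2     # V^2 scaling from the stock point
    print(f"{v:.2f} V: reported {w} W, V^2 alone predicts ~{predicted:.0f} W")
# The reported numbers climb faster than V^2 alone, consistent with the chip also
# sustaining higher clocks (and leaking more) at the raised voltage settings.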
 
Jeez..... And Ryzen tends to consume less power than its TDP. I suppose the 7900X's TDP is so low because the TIM, lid size, and die itself combined aren't meant to get rid of more than 140 W of heat! ;)
 
Ryzen in software (like HWiNFO64) shows higher wattage than TDP under full load, but I haven't tested how much it draws from the wall. On Ryzen there is an additional issue in that every motherboard sets slightly different voltages; for instance, the 1700X/1800X have a 1.35 V VID but the voltage can go up to 1.45 V. During low load (not idle) it can run at 1.25 V. Auto vcore on the 3 different boards I had was between 1.35 and 1.45 V (1700X under load).

TDP is supposed to be the maximum heat generated by the CPU in watts (not CPU power), but it seems like it's calculated from the average VID under a typical load. Modern processors don't have one specified voltage, so on each setup the heat and wattage can be different. All these power-saving features and some other things affect what you really get.
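To illustrate why that board-to-board vcore spread matters for heat, here is a rough sketch (the 95 W figure is the rated TDP of the 1700X/1800X; the math is purely illustrative and ignores leakage and clock differences):

# Rough effect of a board feeding 1.45 V instead of the 1.35 V VID (illustrative)
vid, high_vcore = 1.35, 1.45
rated_tdp = 95                         # W, rated TDP of the 1700X/1800X
penalty = (high_vcore / vid) ** 2      # dynamic power scales roughly with V^2
print(f"~{(penalty - 1) * 100:.0f}% more dynamic power at {high_vcore} V vs {vid} V")
print(f"a nominal {rated_tdp} W load would land closer to ~{rated_tdp * penalty:.0f} W of heat")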
 
AMD's and Intel's definitions of TDP are different. Neither describes max draw: Intel describes it as the amount of power used when running under a realistic full load, whereas AMD describes it as something like the average power while doing business tasks, or something like that. Either way, neither actually describes full-tilt power use. It's always power use with some kind of restriction or criteria attached. These are nominal values, not real values.
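A toy example of that nominal-versus-real distinction, using a completely made-up power trace just to show how far apart an "average under a defined load" rating and the worst-case draw can sit:

# Made-up per-second package power samples in watts; not real data for any CPU
trace_w = [95, 120, 135, 140, 150, 180, 210, 160, 130, 110]
average = sum(trace_w) / len(trace_w)
peak = max(trace_w)
print(f"average: {average:.0f} W   peak: {peak} W")
# A TDP-style rating tracks something like the average of a defined workload,
# while the peak is what the VRMs, cooler, and PSU actually have to handle.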
 
What's Intel's realistic load then? Vanilla Minecraft at the lowest settings on peaceful?
 

So TDP is actually power draw according to them? I just have to adjust my brain.
 

Where do you think all of that "drawn power" goes? There must be a reason why computers have fans and why more power draw means more heat!

Every watt that your computer pulls off the wall is being converted to thermal or kinetic energy.
 
Resistance.
 