Tuesday, July 25th 2017

Intel Core i9-7960X 16-core/32-thread Processor Detailed, Benchmarked

Intel is preparing to tackle AMD's first Ryzen Threadripper parts with two Core "Skylake-X" HEDT socket LGA2066 processor launches in quick succession over Q3 2017. The first to come out will be the 12-core/24-thread Core i9-7920X, which will be closely followed by the 16-core/32-thread Core i9-7960X. The company will ultimately end 2017 with the 18-core/36-thread Core i9-7980XE. The i9-7920X, detailed in our older article, could either command a $200 premium over the $999 10-core/20-thread i9-7900X, or displace it to a slightly lower price point (say, $800). The i9-7960X, however, could retain a premium price point owing to its performance leadership over the Ryzen Threadripper 1950X, if early benchmarks are to be believed.

The Core i9-7960X is endowed with 16 cores, HyperThreading enabling 32 threads, 1 MB of L2 cache per core, and 22 MB of shared L3 cache. It features the chip's full 44-lane PCI-Express gen 3.0 root complex and a quad-channel DDR4 memory interface. The chip is expected to be clocked even lower than its 12-core sibling, with a nominal clock of a mere 2.50 GHz and an as-yet-unknown maximum Turbo Boost frequency. Put through Geekbench 4.1.0, the chip scored 33,672 points in the multi-threaded test, which is higher than the 27,000-ish scores we've been hearing for the Threadripper 1950X; but its single-thread score of 5,238 pales in comparison to that of the i7-7740X, due to lower clock speeds and a slightly older micro-architecture.
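To put the two leaked scores in perspective, here is a back-of-envelope calculation (a sketch in Python; the two scores are the leaked figures quoted above, everything else is plain arithmetic and our own reading):

# Back-of-envelope reading of the leaked Geekbench 4.1.0 scores quoted above.
# Only MT_SCORE and ST_SCORE come from the leak; the rest is arithmetic.
CORES = 16
MT_SCORE = 33_672  # multi-threaded score reported for the i9-7960X
ST_SCORE = 5_238   # single-threaded score reported for the i9-7960X

mt_over_st = MT_SCORE / ST_SCORE   # effective "core multiplier", ~6.43x
scaling_eff = mt_over_st / CORES   # versus perfect 16x scaling, ~40%

print(f"MT/ST ratio: {mt_over_st:.2f}x")
print(f"Scaling vs. a perfect 16x: {scaling_eff:.0%}")
# The low apparent scaling mostly reflects the single-core Turbo clock sitting
# far above the 2.50 GHz all-core base, plus memory-bound subtests; it does
# not mean most of the 16 cores sit idle in the multi-threaded run.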
Source: Geekbench Database

26 Comments on Intel Core i9-7960X 16-core/32-thread Processor Detailed, Benchmarked

#1
Unregistered
So, definitely fake/old Threadripper scores, and terrible but kind of expected 7960X scores and specs. It may score better after optimizations relative to other CPUs, but with a 2.5 GHz base clock these scores seem about right.

Unless it's powered by fairy dust I'm buying a 1950X anyway, since a 2.5 GHz base clock, mediocre TIM, and a $1,700 price tag aren't very appealing, and I refuse to delid anything I'm still actively using!
#2
Komshija
One thing is for sure - it will be far more expensive than the 16-core AMD Threadripper. It's still questionable whether it will be faster, but even if we assume it will be, real-world differences will be marginal.
At present, it's not the brightest idea to buy any Intel CPU, except the Pentium G4560 or G4600, which are excellent for their price tag.
#3
Imsochobo
Komshija: One thing is for sure - it will be far more expensive than the 16-core AMD Threadripper. It's still questionable whether it will be faster, but even if we assume it will be, real-world differences will be marginal.
At present, it's not the brightest idea to buy any Intel CPU, except the Pentium G4560 or G4600, which are excellent for their price tag.
I doubt Intel can compete at 12 cores and up in the desktop and server markets.

Intel's offerings seem to clock higher, so as long as the TDP is low enough they can leverage that.
That's my reasoning, in short.

Intel's lower offerings, such as the 6-, 8-, and 10-core parts, will be the good ones there due to clock speed, while the 12-core-and-up parts seem like a waste due to TDP-limited clocks.

AMD just has so much more efficiency up to a certain point (~3.7 GHz).
#4
Noztra
So how many people think it will actually use 140 Watts TDP?
#5
Komshija
Imsochobo: I doubt Intel can compete at 12 cores and up in the desktop and server markets.

Intel's offerings seem to clock higher, so as long as the TDP is low enough they can leverage that.
That's my reasoning, in short.

Intel's lower offerings, such as the 6-, 8-, and 10-core parts, will be the good ones there due to clock speed, while the 12-core-and-up parts seem like a waste due to TDP-limited clocks.

AMD just has so much more efficiency up to a certain point (~3.7 GHz).
The main problem for AMD is that most consumers are heavily influenced by marketing/propaganda rather than using common sense, and that goes exclusively in Intel's favor. Even at 4, 6, or 8 cores AMD is a far better option, because you get quite a lot for a reasonable amount of money. For now, Intel's CPUs benefit slightly from higher clocks, but even those benefits are marginal, just as is the case with the current i5/i7 and similarly priced Ryzens. I'm sure that in the near future AMD will tune up their chip design, making it suitable for 4+ GHz speeds.

For gaming, Core i9 and Threadripper will be totally unnecessary and a waste of money, although some future games will likely use more than 4 cores. So far there's absolutely no point in buying anything more than a 4C/8T or, at most, a 6C/12T CPU for gaming, and it will remain so for at least a couple of years.

Energy consumption is a nice thing to consider but also total nonsense here, because anyone who aims for an upper-middle-class or high-end CPU shouldn't be concerned about it. It's like someone who has the money to buy a BMW M3 worrying about high fuel consumption - yeah, right...
#6
TheGuruStud
Noztra: So how many people think it will actually use 140 Watts TDP?
Doesn't the fine print say actual TDP is x2? :P
#7
chief-gunney
Intel can't compete with AMD on price/performance until they have something resembling Infinity Fabric at a similar production cost. Their present marginal performance advantage, which is already at the ceiling, is also going to shrink as AMD refines its CPUs even more.
#8
Frick
Fishfaced Nincompoop
Soooooo is Thermal Design Power factually equal to power consumption these days, or has everyone simply forgotten that the two are different things? I honestly have no idea.
#9
Unregistered
Frick: Soooooo is Thermal Design Power factually equal to power consumption these days, or has everyone simply forgotten that the two are different things? I honestly have no idea.
TDP is how much heat the CPU's cooling solution is designed to handle.
#10
Woomack
Noztra: So how many people think it will actually use 140 Watts TDP?
The 7900X draws ~200 W at auto/stock settings under full load, not to mention the 7960X...
#11
Unregistered
Woomack: The 7900X draws ~200 W at auto/stock settings under full load, not to mention the 7960X...
That might be the entire system though, but still.
#12
Frick
Fishfaced Nincompoop
Hugh Mungus: TDP is how much heat the CPU's cooling solution is designed to handle.
I know, but that is not what people talk about.
#13
Tsukiyomi91
Really sad day when people of the Internet & normies with little or no PC knowledge think that TDP = power consumption. That's not how TDP works...
#14
Noztra
Tsukiyomi91: Really sad day when people of the Internet & normies with little or no PC knowledge think that TDP = power consumption. That's not how TDP works...
My bad. :)

What I should have written instead was "With the system pulling more than 400 W from the wall".

Better?
#15
Woomack
Hugh Mungus: That might be the entire system though, but still.
No, this is the wattage tested on my 7900X:
- stock/auto ~1.06 V = ~200 W CPU / 280 W+ system (CPU load only)
- 1.15 V = ~300 W CPU / 380 W+ system (CPU load only)
- 1.20 V = ~340 W CPU / 420 W+ system (CPU load only)
- full CPU load at 1.2 V + a single GTX 1080 Ti = 620 W (CPU+GPU load)
Measured with a TP-Link Smart Plug.
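Those figures climb faster than voltage alone would suggest. A minimal sketch of the comparison, assuming a naive V²-only model of CPU power (the measured pairs are from the post above; the model deliberately ignores frequency and leakage):

# Compare the measured CPU-only wattage against a naive model where power
# scales purely with voltage squared (P ~ C * V^2 * f, with f held fixed).
# The measured pairs come from the post above; the model is an assumption
# for illustration only.
V_STOCK, P_STOCK = 1.06, 200.0  # ~stock/auto point: ~1.06 V, ~200 W

for volts, measured_w in [(1.15, 300), (1.20, 340)]:
    predicted_w = P_STOCK * (volts / V_STOCK) ** 2  # V^2-only prediction
    print(f"{volts:.2f} V: predicted {predicted_w:.0f} W, measured {measured_w} W")
# Measured draw outruns the V^2-only prediction (~235 W and ~256 W) because
# higher voltage also sustains higher clocks (the f term) and raises leakage.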
#16
EarthDog
Noztra: So how many people think it will actually use 140 Watts TDP?
At stock? Me. That's why the rating is there. ;)

TDP is close enough to power used...
#17
Unregistered
Woomack: No, this is the wattage tested on my 7900X:
- stock/auto ~1.06 V = ~200 W CPU / 280 W+ system (CPU load only)
- 1.15 V = ~300 W CPU / 380 W+ system (CPU load only)
- 1.20 V = ~340 W CPU / 420 W+ system (CPU load only)
- full CPU load at 1.2 V + a single GTX 1080 Ti = 620 W (CPU+GPU load)
Measured with a TP-Link Smart Plug.
Jeez..... And Ryzen tends to consume less power than its TDP. Suppose the 7900X's TDP is so low because the TIM, lid size, and die itself combined aren't meant to get rid of more than 140 W of heat! ;)
#18
Woomack
In software (like HWiNFO64), Ryzen shows higher wattage than its TDP under full load, but I haven't tested how much it pulls from the wall. On Ryzen there's an additional issue: every motherboard sets slightly different voltages. For instance, the 1700X/1800X have a 1.35 V VID, but voltage can go up to 1.45 V. During low load (not idle) it can run at 1.25 V. Auto vcore on the 3 different boards I've had was between 1.35 and 1.45 V (1700X under load).

TDP is supposed to be the maximum heat generated by the CPU in watts (not CPU power draw), but it seems to be calculated from the average VID under a typical load. Modern processors don't have one specified voltage, so heat and wattage can differ on each setup. All the power-saving features and a few other things affect what you really get.
#20
Aquinus
Resident Wat-man
AMD's and Intel's definitions of TDP are different. Neither describes max draw: Intel describes it as the amount of power used when running under a realistic full load, whereas AMD describes it as something like average power during business tasks, or something like that. Either way, neither actually describes full-tilt power use. It's always power use under some kind of restriction or criteria. These are nominal values, not real values.
#21
Unregistered
Aquinus: AMD's and Intel's definitions of TDP are different. Neither describes max draw: Intel describes it as the amount of power used when running under a realistic full load, whereas AMD describes it as something like average power during business tasks, or something like that. Either way, neither actually describes full-tilt power use. It's always power use under some kind of restriction or criteria. These are nominal values, not real values.
What's Intel's realistic load then? Vanilla Minecraft at the lowest settings on Peaceful?
#22
Frick
Fishfaced Nincompoop
Aquinus: AMD's and Intel's definitions of TDP are different. Neither describes max draw: Intel describes it as the amount of power used when running under a realistic full load, whereas AMD describes it as something like average power during business tasks, or something like that. Either way, neither actually describes full-tilt power use. It's always power use under some kind of restriction or criteria. These are nominal values, not real values.
So TDP is actually power draw according to them? I just have to adjust my brain.
#23
Aquinus
Resident Wat-man
Frick: So TDP is actually power draw according to them? I just have to adjust my brain.
Where do you think all of that "drawn power" goes? There must be a reason why computers have fans and why more power draw means more heat!

Every watt that your computer pulls off the wall is being converted to thermal or kinetic energy.
#24
Unregistered
Aquinus: Where do you think all of that "drawn power" goes? There must be a reason why computers have fans and why more power draw means more heat!

Every watt that your computer pulls off the wall is being converted to thermal or kinetic energy.
Resistance.
#25
Frick
Fishfaced Nincompoop
Aquinus: Where do you think all of that "drawn power" goes? There must be a reason why computers have fans and why more power draw means more heat!

Every watt that your computer pulls off the wall is being converted to thermal or kinetic energy.
Herp derp blerghhhh. What I mean is this: has the term Thermal Design Power officially replaced "CPU power dissipation" as the term to describe power draw? I doubt it, and as said elsewhere, a 140 W TDP from either Intel or AMD is a poor indicator of actual power draw.

Consider:
Noztra: So how many people think it will actually use 140 Watts TDP?
I know what he means, but it's wrong. It obviously doesn't matter, but still.