# More Core i7-4960X "Ivy Bridge-E" Benchmarks Surface



## btarunr (Jul 18, 2013)

More benchmarks of Intel's upcoming socket LGA2011 flagship client processor, the Core i7-4960X "Ivy Bridge-E," have surfaced on the web. Tom's Hardware scored an engineering sample of the chip and wasted no time comparing it with contemporaries across three previous Intel generations and AMD's current generation. These include chips such as the i7-3970X, i7-4770K, i7-3770K, i7-2700K, FX-8350, and A10-5800K.

In synthetic tests, the i7-4960X runs neck and neck with the i7-3970X, offering a 5 percent performance increment at best. It's significantly faster than the i7-3930K, Intel's $500-ish offering for over seven quarters running. Its six cores and twelve SMT threads give it a definite edge over quad-core Intel parts in multi-threaded synthetic tests; in single-threaded tests, the $350 i7-4770K is highly competitive with it. The only major surprise on offer is power draw. Despite its TDP being rated at 130W, on par with the i7-3960X, the i7-4960X "Ivy Bridge-E" offers significantly higher energy efficiency, which can be attributed to the 22 nm process on which it's built, compared to its predecessor's 32 nm process. Find the complete preview at the source.
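As a quick sanity check on efficiency claims like this, performance per watt can be compared directly. The sketch below uses made-up placeholder scores and wattages (a ~5% score gain alongside a lower package draw), not the preview's actual measurements:

```python
# Illustrative perf-per-watt comparison. The scores and wattages are
# hypothetical placeholders, not Tom's Hardware's measured results.
def perf_per_watt(score: float, package_watts: float) -> float:
    """Benchmark score delivered per watt of package power."""
    return score / package_watts

sb_e = perf_per_watt(score=1000.0, package_watts=150.0)   # "i7-3970X"-like
ivb_e = perf_per_watt(score=1050.0, package_watts=120.0)  # "i7-4960X"-like

improvement = (ivb_e / sb_e - 1) * 100
print(f"Efficiency gain: {improvement:.0f}%")  # prints "Efficiency gain: 31%"
```

Even a modest performance bump compounds into a large efficiency gain when load power drops, which is the 22 nm vs. 32 nm story in a nutshell.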






*View at TechPowerUp Main Site*


----------



## matar (Jul 18, 2013)

Big fail, Intel. Ivy Bridge is only a die shrink from 32 nm to 22 nm; Sandy and Ivy both have the same architecture...
What we want is Haswell-E


----------



## cdawall (Jul 18, 2013)

So it's not faster than a 9590?


----------



## drdeathx (Jul 18, 2013)

matar said:


> Big fail, Intel. Ivy Bridge is only a die shrink from 32 nm to 22 nm; Sandy and Ivy both have the same architecture...
> What we want is Haswell-E



There is a bit more to it than a die shrink.


----------



## Supercrit (Jul 18, 2013)

Intel does not want to give customers the newest and the fastest at the same time.


----------



## MxPhenom 216 (Jul 18, 2013)

matar said:


> Big fail, Intel. Ivy Bridge is only a die shrink from 32 nm to 22 nm; Sandy and Ivy both have the same architecture...
> What we want is Haswell-E



So I'm guessing you don't know about Intel's tick-tock roadmap... :shadedshu



Supercrit said:


> Intel does not want to give customers the newest and the fastest at the same time.



They don't have much of a reason to; what competition do they truly have right now to make them push the envelope?

All I wanted to see from LGA2011 was better power consumption, and it looks like Ivy Bridge-E will bring that.


----------



## Jstn7477 (Jul 18, 2013)

cdawall said:


> So it's not faster than a 9590?



Sorry, but being decent in roughly 7 out of 40 tests at Hardware Canucks doesn't bode well for a processor that is clocked 1200MHz above the competition, costs 1.5-3x more unless you are comparing to 3970X, gets poopy frame rates in games compared to a mainstream Intel CPU, and consumes a ridiculous amount of power for the rather lackluster performance you get, although you've already stated that you don't care if your chip takes almost 300w. 

Since I am entitled to an opinion just as you are, I'll stick with my relatively quiet computer, fairly cool room and a ~100w processor that provides great, predictable performance across the board because I don't want my HD 7970 and 120Hz monitor being under-utilized in non-DX11 games which already happens even with my processor. Have fun with your FX 9370 or whatever you ordered.


----------



## dj-electric (Jul 18, 2013)

And what exactly did people expect?
"Ohhh, the difference between the 2600K and 3770K is very, very small, I bet the 4960X will be much faster than the 3970X."


duhhhhh


----------



## HumanSmoke (Jul 18, 2013)

cdawall said:


> So it's not faster than a 9590?


You were expecting the 4960X to be clocked at 4.701 GHz+? Or just out fishing for the big one?


----------



## buggalugs (Jul 18, 2013)

And Intel expects us to put this $500-$1,000 CPU on an old, crippled X79 motherboard with two SATA 6 Gb/s ports and outdated components? lol. No thanks.


----------



## dj-electric (Jul 18, 2013)

buggalugs said:


> And Intel expects us to put this $500-$1,000 CPU on an old, crippled X79 motherboard with two SATA 6 Gb/s ports and outdated components? lol. No thanks.



Don't worry about that mate, wait for the end of the summer for some news


----------



## Fourstaff (Jul 18, 2013)

Definitely better than the SB-E it replaces in all metrics, but as usual not enough to force an upgrade from the previous gen. A story we are all familiar with, I'm sure.


----------



## fullinfusion (Jul 18, 2013)

drdeathx said:


> There is a bit more to it than a die shrink.


Ya, shrink? But ya know it all, huh.

I think Intel has just stalled in the water for the time being, till they make another breakthrough in architecture.
It looks good, but to upgrade? Nah, I don't think so. If anything I think AMD is going to knock our socks off soon and sway the blue team to move back to the red side, but that's just my opinion.

Time will tell the story, but for now we all just have to wait it out.


----------



## Aquinus (Jul 18, 2013)

Power consumption numbers look pretty nice to me.


----------



## cdawall (Jul 18, 2013)

Jstn7477 said:


> Sorry, but being decent in roughly 7 out of 40 tests at Hardware Canucks doesn't bode well for a processor that is clocked 1200MHz above the competition, costs 1.5-3x more unless you are comparing to 3970X, gets poopy frame rates in games compared to a mainstream Intel CPU, and consumes a ridiculous amount of power for the rather lackluster performance you get, although you've already stated that you don't care if your chip takes almost 300w.
> 
> Since I am entitled to an opinion just as you are, I'll stick with my relatively quiet computer, fairly cool room and a ~100w processor that provides great, predictable performance across the board because I don't want my HD 7970 and 120Hz monitor being under-utilized in non-DX11 games which already happens even with my processor. Have fun with your FX 9370 or whatever you ordered.



The one at Hardware Canucks was throttling; that's why it underperformed. Although you Intel folks don't seem to like any responses other than "Intel is god." This is maybe a 5% performance increase. What a waste of a release.


----------



## LAN_deRf_HA (Jul 18, 2013)

So performance is up almost solely from the 100 MHz speed boost? That's a bit weird. Ivy should have brought at least a tiny bit of IPC improvement.


----------



## Relayer (Jul 18, 2013)

I'm wondering how bad the CPU bottleneck is going to be, in games that are demanding of the CPU, with 20 nm GPUs next year. The efficiency increase is real nice, but the total lack of performance increase is disheartening.


----------



## radrok (Jul 18, 2013)

Aquinus said:


> Power consumption numbers look pretty nice to me.
> http://www.techpowerup.com/img/13-07-18/143e.jpg



Yeah, this is the biggest improvement over SB-E so far, which will be amazing on the Xeon side, because it's probably the reason they can up the core count while remaining within a decent TDP.

What keeps me interested is how this Ivy hexa will overclock, thanks to the 22 nm shrink and soldered IHS combo.

Also, lol at delusional AMD trolls.


----------



## repman244 (Jul 18, 2013)

Intel doesn't care about the 2011 socket for consumers.
The real deal is the Xeon-based IB (it should have more cores/cache than SB).

And I also don't get why a lot of people are shocked here; we all knew how IB on 2011 would perform.


----------



## ensabrenoir (Jul 18, 2013)

HumanSmoke said:


> You were expecting the 4960X to be clocked at 4.701 GHz+? Or just out fishing for the big one?
> http://www.captionthis.org/pics/052010/1274791640-briefcase-fishing.jpg





I can come up with no logical explanation for the situation in that photo.......

on topic:

I know we're enthusiasts and all, but seriously, people... times are changing; it's all about low power now. The high-end hardware is decades ahead of its software requirements. Sandy Bridge can pretty much chew through anything now and for the next 5 years coming... Intel's focus has shifted. Until the need for uber power returns, this is what we get. AMD has a lot of time to become current...


----------



## Jstn7477 (Jul 18, 2013)

cdawall said:


> The one at Hardware Canucks was throttling; that's why it underperformed. Although you Intel folks don't seem to like any responses other than "Intel is god." This is maybe a 5% performance increase. What a waste of a release.



Actually, I used AMD pretty much exclusively until late 2011 when I got my 2600K. I even had a bunch of Phenom II systems last year and still have some although I sold a few off as their PPD/watt sucked. My parents own my FX-8150 which ended up being my last AMD system because the CPU took so much power and ran hot at 4GHz while still getting crappy PPD. I did also buy an i7-870 system last year but sold it earlier this year because it was older 45nm tech like the PII stuff.

Why should I be forced to buy something "worse" for the sake of not being called a fanboy? I mainly use my computer for gaming, and not at 60Hz, which means the CPU better have every ounce of performance and feed my video card the best it can as most of the games I play regularly are equally or more CPU bound than GPU bound (TF2, Planetside 2, Minecraft, Skyrim...). This is why I've also avoided the LGA 2011 platform, as it's rather dated. 

Let's not forget that my 4770K is overclocked by 900MHz, and at stock it was dominating all but maybe two game tests where the 9590 has a ~2-5 FPS lead. How else am I supposed to convince you that I want to buy whatever is good at the time? But no, it's not AMD, so Jstn7477 is a "fanboy" just because.


----------



## lilhasselhoffer (Jul 18, 2013)

Statement 1: I've got a 3930k.  If you want to call me an AMD troll stuff it.

Statement 2: Where are the numbers for a 4930k or the equivalent?  The 4960x is interesting, but not double the price (assuming the same relationship of SB-e is maintained in IB-e) interesting.

Statement 3: Haswell-e is a rather pointless endeavor.  The same minor CPU performance increases mean that there won't be a lot of performance gains.  The big benefit to Haswell-e is going to be a better PCH and host of new features.  DDR4, full SATA III, and a few other features are what is going to make Haswell-e a viable platform.

Statement 4: Intel has no competition at this price point.  There is almost no market, and the market that does exist is a monopoly.  People will pay for these processors, but they don't have to make huge leaps to compare to their competition.  This means more development dollars are going to graphics and CPU component integration.  It's bad for those that want a calculation beast, but Intel isn't catering to this market any more.


My opinion: I'm not spending money on IB-E. The performance increases over SB-E are minor. The lack of an announced successor to X79 means there aren't going to be better board options, and the options that do exist are pretty depressing. A decrease in power consumption is great, but it'd take a decade to justify spending $500 on a new processor for it.

I'm really hoping that the 4 core IB-e is worth picking up.


----------



## MxPhenom 216 (Jul 18, 2013)

I would take an Ivy Bridge-E chip over SB-E for the power consumption improvements alone.


----------



## haswrong (Jul 18, 2013)

ensabrenoir said:


> I can come up with no logical explanation for the situation in that photo.......
> 
> on topic:
> 
> I know we're enthusiasts and all, but seriously, people... times are changing; it's all about low power now. The high-end hardware is decades ahead of its software requirements. Sandy Bridge can pretty much chew through anything now and for the next 5 years coming... Intel's focus has shifted. Until the need for uber power returns, this is what we get. AMD has a lot of time to become current...



Are you really sure no one can utilize more computational power? You are talking about an enthusiast platform. I don't care if Intel's focus shifted. If they shift it the wrong way, I'm not buying and Intel goes bankrupt. Intel clearly doesn't want new money from customers.


----------



## JThorpe (Jul 18, 2013)

Dj-ElectriC said:


> Don't worry about that mate, wait for the end of the summer for some news



What does that mean?


----------



## haswrong (Jul 18, 2013)

JThorpe said:


> What does that mean?



I think making new motherboards for IB-E is a losing trade for motherboard manufacturers. How many will upgrade to IB-E just to lower the consumption? Most people who fancy 2011 don't care about consumption so much that it would be their main priority for an upgrade. Why bother with an upgrade, when Haswell-E will have a new socket? It's a complete waste of money and material for no visible improvement. If Intel want me to upgrade, they'd better present me an offer I can't refuse.


----------



## drdeathx (Jul 18, 2013)

cdawall said:


> The one at Hardware Canucks was throttling; that's why it underperformed. Although you Intel folks don't seem to like any responses other than "Intel is god." This is maybe a 5% performance increase. What a waste of a release.



I agree. Intel is failing ATM, allowing AMD to race up their sphincter.


----------



## TheHunter (Jul 18, 2013)

They gained some extra space; might as well make it an 8-core. But nooo.


----------



## ensabrenoir (Jul 18, 2013)

haswrong said:


> Are you really sure no one can utilize more computational power? You are talking about an enthusiast platform. I don't care if Intel's focus shifted. If they shift it the wrong way, I'm not buying and Intel goes bankrupt. Intel clearly doesn't want new money from customers.



I actually have an ASRock X79 Extreme9 with a 3820 I got only because I wanted to wait for Ivy-E (PC atm). I plan on going to a six-core with Ivy-E. You're right, some do need more compute power, as some use their rigs for more than gaming. The Xeon users probably won't be complaining about anything, though. Intel wants more of the everydayer/upgrades-every-time-a-new-one's-out/Facebooker-tablet-tweeter money.

Haswell-E, if I remember correctly, will shift the enthusiast line to 6- and 8-core CPUs with the next chipset.


----------



## Am* (Jul 18, 2013)

Aquinus said:


> Power consumption numbers look pretty nice to me.
> http://www.techpowerup.com/img/13-07-18/143e.jpg



Agreed, power consumption numbers look pretty impressive, especially for an engineering sample. If these chips are done right and use fluxless solder instead of that shitty thermal paste, they may well have a killer enthusiast chip on their hands that will clock past the 4.5GHz mark.


----------



## Jstn7477 (Jul 18, 2013)

Am* said:


> Agreed, power consumption numbers look pretty impressive, especially for an engineering sample. If these chips are done right and use fluxless solder instead of that shitty thermal paste, they may well have a killer enthusiast chip on their hands that will clock past the 4.5GHz mark.



They're soldered. There was a report of one being delidded a couple of weeks back, and it destroyed the die. http://www.overclockers.com/forums/showthread.php?t=733766


----------



## xorbe (Jul 18, 2013)

that power consumption == where's my 8 core


----------



## Lionheart (Jul 18, 2013)

Jstn7477 said:


> Sorry, but being decent in roughly 7 out of 40 tests at Hardware Canucks doesn't bode well for a processor that is clocked 1200MHz above the competition, costs 1.5-3x more unless you are comparing to 3970X, gets poopy frame rates in games compared to a mainstream Intel CPU, and consumes a ridiculous amount of power for the rather lackluster performance you get, although you've already stated that you don't care if your chip takes almost 300w.
> 
> Since I am entitled to an opinion just as you are, I'll stick with my relatively quiet computer, fairly cool room and a ~100w processor that provides great, predictable performance across the board because I don't want my HD 7970 and 120Hz monitor being under-utilized in non-DX11 games which already happens even with my processor. Have fun with your FX 9370 or whatever you ordered.



A simple comment really brings out the Intel fanboy in you


----------



## cdawall (Jul 18, 2013)

drdeathx said:


> I agree. Intel is failing ATM, allowing AMD to race up their sphincter.



That's what happens when you get left behind in every other market. They still can't beat AMD in cores per cluster, and the server market is moving that way. Oh well, let the performance hounds have their fun. Games still play the same between this and the 9590 at the same price.


----------



## 1c3d0g (Jul 18, 2013)

xorbe said:


> that power consumption == where's my 8 core



Word on the street has it that Haswell-E will be the one to have 8 cores.


----------



## radrok (Jul 18, 2013)

cdawall said:


> Oh well, let the performance hounds have their fun. Games still play the same between this and the 9590 at the same price.



On CPU-heavy games? No way on earth; you are dreaming.

Higher IPC from Intel CPUs also helps a ton with minimum framerates; average doesn't tell the whole story.


----------



## cdawall (Jul 18, 2013)

radrok said:


> On CPU-heavy games? No way on earth; you are dreaming.
> 
> Higher IPC from Intel CPUs also helps a ton with minimum framerates; average doesn't tell the whole story.



Depends on the game. Higher IPC is cool until the game is heavily multithreaded, like most modern games, and then AMD starts to take the lead.


----------



## MxPhenom 216 (Jul 18, 2013)

radrok said:


> On CPU-heavy games? No way on earth; you are dreaming.
> 
> Higher IPC from Intel CPUs also helps a ton with minimum framerates; average doesn't tell the whole story.



Don't worry he defends anything AMD like it's his own child, it's okay.


----------



## radrok (Jul 18, 2013)

cdawall said:


> Depends on the game. Higher IPC is cool until the game is heavily multithreaded, like most modern games, and then AMD starts to take the lead.



You really are convinced of what you are saying.

Are you completely sure that an AMD octa can/starts to take the lead in a multithreaded game scenario against an Intel hexa?

If so, please show me proof, because every single overclocking review I've seen shows the AMD CPU way behind.


----------



## cdawall (Jul 18, 2013)

radrok said:


> You really are convinced of what you are saying.
> 
> Are you completely sure that an AMD octa can/starts to take the lead in a multithreaded game scenario against an Intel hexa?
> 
> If so, please show me proof, because every single overclocking review I've seen shows the AMD CPU way behind.



There are multiple times it actually leads in minimum FPS.


----------



## radrok (Jul 18, 2013)

Heavy GPU bound situations do not reflect CPU performance, imho.

I'm betting the CPUs here don't even get properly stressed.

Throw in one or a couple more GPUs and you'll see the difference.


----------



## cdawall (Jul 18, 2013)

radrok said:


> Heavy GPU bound situations do not reflect CPU performance, imho.
> 
> I'm betting the CPUs here don't even get properly stressed.
> 
> Throw in one or a couple more GPUs and you'll see the difference.



Are those not acceptable frame rates in multiple AAA titles at a resolution people play at?

Win, lose or draw, it shows the minimum FPS on the AMD side for multiple games. That means there is something within the AMD setup allowing better performance.

You can argue GPU-bound vs CPU-bound all you want, but in the real world those graphs show it all: there will be no difference in actual performance while gaming. Mind you, that is a basically stock-clocked 9590 as well.


----------



## radrok (Jul 18, 2013)

Actually those minimum framerates are within the margin of error of GPU rendering performance.

Wanna bet the average/maximum and minimum framerates with a 5 GHz 3930/3960 will always favour the Intel CPU?

Take a look here at how the game changes when you add more than one GPU:

http://www.anandtech.com/show/6985/choosing-a-gaming-cpu-at-1440p-adding-in-haswell-/7


----------



## cdawall (Jul 18, 2013)

radrok said:


> Actually those minimum framerates are within the margin of error of GPU rendering performance.
> 
> Wanna bet the average/maximum and minimum framerates with a 5 GHz 3930/3960 will always favour the Intel CPU?
> 
> ...



That game sucks.






Two GPUs in Dirt 3 and it is within 10%. I would honestly be curious how it did with one of the boards running PCI-E 3.0 vs 2.0; it appears to have made a huge difference with the Intel chips.


----------



## radrok (Jul 19, 2013)

That's better, for sure.

What I really don't like about the AMD CPU (for gaming) is the fact that it blows on poorly threaded games.

It's a really good multithreaded CPU, but you can feel it lacks in the IPC department, and that's a deal killer for me.

They need to increase that to get back into the game and not get laughed at.


----------



## MxPhenom 216 (Jul 19, 2013)

cdawall said:


> There are multiple times it actually leads in minimum FPS.
> 
> http://www.kitguru.net/wp-content/uploads/2013/07/AvP5.png http://www.kitguru.net/wp-content/uploads/2013/07/sleeping-dogs4.png
> 
> ...



Are you for real?


----------



## cdawall (Jul 19, 2013)

radrok said:


> That's better, for sure.
> 
> What I really don't like about the AMD CPU (for gaming) is the fact that it blows on poorly threaded games.
> 
> ...



Hopefully Steamroller will fix that issue. For now, AMD owners just have to deal with the added multithreading performance. This will be my first FX chip, so it will be interesting.



MxPhenom 216 said:


> Are you for real?
> 
> Using those benchmarks to determine CPU performance in gaming is wrong. Set the resolution and settings the lowest it will go, and then you have relevant benchmarks to determine overall CPU performance in games.



I personally couldn't give two shits how it performs at 640x480; I care about actual real-world performance. If it can play my games just as well in Eyefinity as an Intel, why swap?


----------



## radrok (Jul 19, 2013)

Anyway, let's get out of gaming for a second, where GPU is way more important than CPU.

From the same review






Really shows you AMD has some serious catching up to do if they want to price their products where they have.

It doesn't even begin to match a STOCK i7 hexa.


----------



## cdawall (Jul 19, 2013)

radrok said:


> Anyway, let's get out of gaming for a second, where GPU is way more important than CPU.
> 
> From the same review
> 
> ...



Use an encoder that utilizes AVX or FMA4 and the performance is very close.


----------



## radrok (Jul 19, 2013)

That's the issue: to get the best performance out of it, you need to use this and use that.

Why do you have to compromise when you can get a product that performs well everywhere?


----------



## cdawall (Jul 19, 2013)

radrok said:


> That's the issue: to get the best performance out of it, you need to use this and use that.
> 
> Why do you have to compromise when you can get a product that performs well everywhere?



Same reason those CPUs are taking over the server market pretty quickly: they do perform well. I don't use any single application where the performance of an FX CPU is so awful that I cannot happily/quickly complete my task, but there are applications I use where it works exceedingly well (3D CAD with proper rendering tools/the games I play).


----------



## radrok (Jul 19, 2013)

So basically Intel is secretly awful and no one recognizes that?

Please.


----------



## MxPhenom 216 (Jul 19, 2013)

cdawall said:


> Hopefully Steamroller will fix that issue. For now, AMD owners just have to deal with the added multithreading performance. This will be my first FX chip, so it will be interesting.
> 
> 
> 
> I personally couldn't give two shits how it performs at 640x480; I care about actual real-world performance. If it can play my games just as well in Eyefinity as an Intel, why swap?



Uh, yeah, disregard what I said. I was at work when I said that; then on my way home I thought about it and facepalmed.

What I find interesting, though, is the fact that CPU performance differences in gaming become far smaller when game resolutions are increased and all that. You are right that AMD and Intel are pretty close in gaming performance, but that's only close; Intel still has the performance crown, and they will continue to have it till AMD does a big overhaul on their sockets and chipsets.


----------



## cdawall (Jul 19, 2013)

radrok said:


> So basically Intel is secretly awful and no one recognizes that?
> 
> Please.



Well actually...link



MxPhenom 216 said:


> Uh, yeah, disregard what I said. I was at work when I said that; then on my way home I thought about it and facepalmed.
> 
> What I find interesting, though, is the fact that CPU performance differences in gaming become far smaller when game resolutions are increased and all that. You are right that AMD and Intel are pretty close in gaming performance, but that's only close; Intel still has the performance crown, and they will continue to have it till AMD does a big overhaul on their sockets and chipsets.



Technically, "only close" is with both reasonably overclocked... FX at 5 GHz and SB-E at 4.4 GHz; neither is really low wattage. I feel stock vs stock the 9590 and 3960X would be close in all ways as far as games go.

The Haswell mainstream chips are a nice toss into the mix, but the half-assed heatspreader and heat issue would make me mad, not to mention the extra cost of an entire system change for me makes it not worth it.


----------



## ViperXTR (Jul 19, 2013)

Well, them FX chips seem to be better in Crysis 3's grassy areas, where the grass is physically simulated by the CPU; the more threads the better.


----------



## ensabrenoir (Jul 19, 2013)

.....Don't know who you people are... or how you got into my computer, but stop all this arguing!!! You're giving my HDD the willies!!!!

The AMD-or-Intel debate is just like the chicken-or-the-egg debate... It don't matter... people eat 'em both.


----------



## Deadlyraver (Jul 19, 2013)

Oh, a super expensive chip for my gaming? Why not! I am sure I will be hijacking NORAD soon.

Plus it ain't Haswell, why bother?


----------



## HumanSmoke (Jul 19, 2013)

cdawall said:


> Same reason those CPU's are taking over the server market pretty quickly.


Define "pretty quickly". From the shipping estimates I've seen, AMD's clawing back of server/HPC market share is predicated upon ARM-based solutions from SeaMicro, and a fairly leisurely ramp for conventional x86 based Opterons.
From Mercury Research's own analysis (presumably "96%" is rounded up from 95.7%) :


> Intel controlled 96 percent of the market for servers that run on PC processors in the fourth quarter, according to Cave Creek, Arizona-based Mercury Research Inc. Advanced Micro Devices Inc. (AMD), Intel’s only rival in that market, had 4.3 percent.


As for the hoohah about gaming and CPUs... the bulk of games are of course graphics-constrained, so a user would need to have some pretty narrow focus to argue gaming as a fundamental feature of a $1K processor, IMO. Personally, if it's a gaming benchmark comparison, I'd be more inclined to check out games that actually keep a CPU gainfully employed - i.e. strong AI routines, comprehensive CPU physics etc. - usually the province of RTS games - e.g. Skyrim.





It's little wonder that games such as Tomb Raider and MLL are showing little differentiation between CPUs or clockspeeds.


----------



## cdawall (Jul 19, 2013)

I swear, if I see one more shitty Skyrim benchmark... It's poorly coded, AMD sucks with it, we fucking get it.


----------



## radrok (Jul 19, 2013)

cdawall said:


> I swear, if I see one more shitty Skyrim benchmark... It's poorly coded, AMD sucks with it, we fucking get it.



Have to agree with you here, the engine on which Skyrim runs is as old as the universe itself and it should have been dropped with Oblivion.



On a side note, I'd really want both AMD and Intel to increase core counts for the desktop platform.


----------



## HumanSmoke (Jul 20, 2013)

radrok said:


> Have to agree with you here, the engine on which Skyrim runs is old as the universe itself and it should have been dropped with Oblivion


Unfortunately, RTS titles used in gaming benchmark suites are a fairly small subset of a fairly small subset (games that are CPU intensive) now that Civ5 has largely passed from benchmarking suites. Likewise, large maps and competent AI aren't synonymous with most games used in benchmarking- which tend towards the corridor-shooter FPS variety...and even when you find a game that is taxing the CPU, it may be the product of stalls due to coding and/or memory usage...Borderlands 2 likely fitting some of those parameters.





Feel free to stone the messenger, but until sites utilize a CPU intensive game (preferably one that scales with core count adequately), you're stuck with what *is* benchmarked in CPU comparative game testing - and of course, the paucity of games that fit the bill says something in itself about the general requirement.


----------



## cdawall (Jul 20, 2013)

Skyrim is still and always will be a worthless benchmark. Doesn't scale with cores, doesn't multithread...POS benchmark and you know it.


----------



## MxPhenom 216 (Jul 20, 2013)

cdawall said:


> Skyrim is still and always will be a worthless benchmark. Doesn't scale with cores, doesn't multithread...POS benchmark and you know it.



Agreed. You could disable all but one core, overclock it to the sky, and get the same performance as with all cores enabled and overclocked to the same speed lol.

The reason AMD gets bad performance in Skyrim is just that, too: AMD FX chips are not known for their single-thread performance, whereas Intel's are.



radrok said:


> Have to agree with you here, the engine on which Skyrim runs is old as the universe itself and it should have been dropped with Oblivion.
> 
> 
> 
> On a side note I'd really want both AMD and Intel to increase core count for the Desktop platform.



There's not much of a reason to, though. Software that everyday consumers use has kind of hit a wall in terms of the number of cores it needs. Once software begins to demand more cores, then we will begin to see more cores in our hardware.
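That "wall" is essentially Amdahl's law: once the serial fraction of a workload dominates, extra cores stop paying off. A minimal sketch (the 60% parallel fraction is an illustrative assumption, not a measured figure for any real application):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when only part of a workload can run
    in parallel (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A workload that is 60% parallelizable barely benefits beyond 4 cores:
for n in (2, 4, 8, 16):
    print(n, round(amdahl_speedup(0.6, n), 2))  # 1.43, 1.82, 2.11, 2.29
```

Even with infinitely many cores the speedup here caps at 1/0.4 = 2.5x, which is why core count alone doesn't sell until software parallelizes better.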


----------



## radrok (Jul 20, 2013)

MxPhenom 216 said:


> Software that everyday consumers use has kind of hit a wall in terms of the number of cores it needs.



This is also the reason why Intel is more appealing than AMD right now.

AMD seems to have kinda forgotten that single-threaded performance is also important.


----------



## cdmikelis (Aug 15, 2013)

*Do not scorn a truck if you need a car*



Jstn7477 said:


> Sorry, but being decent in roughly 7 out of 40 tests at Hardware Canucks doesn't bode well for a processor that is clocked 1200MHz above the competition, costs 1.5-3x more unless you are comparing to 3970X, gets poopy frame rates in games compared to a mainstream Intel CPU, and consumes a ridiculous amount of power for the rather lackluster performance you get, although you've already stated that you don't care if your chip takes almost 300w.
> 
> Since I am entitled to an opinion just as you are, I'll stick with my relatively quiet computer, fairly cool room and a ~100w processor that provides great, predictable performance across the board because I don't want my HD 7970 and 120Hz monitor being under-utilized in non-DX11 games which already happens even with my processor. Have fun with your FX 9370 or whatever you ordered.



--
Why people gets so upset about SB/IB-E if all you need is Gaming oriented i5-K chip? Intel give "E's" to us who need it, not for thoose who don't. It's like a truck: It does consume x-times more fuel than a personal vehicle, it's not good on cornering, is noisy and heats a lot. But try to convince some cargo company, that using a family car would be better for them. I do a lot of video rendering: Secods faster are multiplyed many times and at the end of day I save tens of minutes, at the end of month I save hours at the end of year I save days of my valuable working time. My ROI for "only" 10% faster chip or next generation of GPU is 3-6 months. But for office computer, for making invoices, I still have E6800 with 4GB ram. (and will have it for a while). 

2nd: Intel CANNOT give us Haswell-E until the server platform makes that move. Server owners don't care about computer games, but they do care about TCO and ROI, so changing the platform every year would instantly penalize Intel and reward the competition. That's it.
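The back-of-the-envelope ROI argument above can be sketched in a few lines; all of the figures here (chip price, render counts, hourly rate) are made-up illustrative assumptions, not cdmikelis's actual numbers:

```python
def payback_months(chip_cost, renders_per_day, secs_saved_per_render,
                   workdays_per_month, hourly_rate):
    """Months until the working time saved covers the chip's price."""
    hours_saved_per_month = (renders_per_day * secs_saved_per_render
                             * workdays_per_month) / 3600
    monthly_value = hours_saved_per_month * hourly_rate
    return chip_cost / monthly_value

# Hypothetical figures: a $1000 chip, 50 renders/day, 20 s saved per
# render, 22 workdays/month, working time valued at $30/h.
months = payback_months(1000, 50, 20, 22, 30)
print(round(months, 1))  # ~5.5 months with these numbers
```

With these made-up inputs the payback lands in the same ballpark as the 3-6 month ROI claimed in the post; the point is that small per-render savings compound across a heavy workload.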


----------



## TheHunter (Aug 15, 2013)

cdawall said:


> Well actually...link
> 
> 
> 
> ...



What heat issue? I have a 4770K OC'ed to 4.6 GHz @ 1.238 V and I've never seen a per-core temperature over 60C; it's usually 47-55C in any game so far.


In something like Cinebench 11.5, or Any Video Converter using the 8-thread x264 codec, it hits 62-74C. It will be better in winter though, and I'm only on an H90; ATM it's 40C outside and 29C inside, which is a factor too xD


----------



## fullinfusion (Aug 16, 2013)

TheHunter said:


> What heat issue


The heat issue that all the reviews report on IB and Haswell CPUs.

That H90 you have is one hell of a cooler at the claimed 29C indoor temperature.

:shadedshu

Try loading it up under OCCT or, better yet, Prime95!
I betcha you'll stop the program once you see the temps shoot to the moon


----------



## Aquinus (Aug 16, 2013)

TheHunter said:


> In something like Cinebench11.5 or Any video converter using 8threads x264 codec it hits 62-74C



Sounds a lot like my 3820; it consumes a lot of power, and I only have a Zalman CNPS 9900 cooling it, getting similar temperatures with a similar OC. I wouldn't call that optimal. I read that IVB-E has the heatspreader soldered to the die, but I'm not sure whether that's true.


----------



## brandonwh64 (Aug 16, 2013)

Not that big of a performance increase to justify the price.


----------



## TheHunter (Aug 16, 2013)

fullinfusion said:


> The heat issue all reviews report on the IB and Hasswell cpu's
> 
> That H90 you have is one hell of a cooler at the claimed 29c inside temperature.
> 
> ...



Yep, that's how it performs in a real-world scenario.


I ran IBT, LinX and Prime95, and yeah, it hit 80-90C. So? I will never see that in any app or game, so those tools don't bother me. Even Intel said they're useless and not a real stability indicator. And it's true: I passed all the torture tests and yet failed in BF3 lol



> Sounds a lot like my 3820; it consumes a lot of power and I only have a Zalman CNPS 9900 cooling it to get similar temperatures with a similar OC. I wouldn't call that optimal. I read that IVB-E has the heatspreader soldered to the die, but I'm not sure if it's true or not though.



I never said it's optimal, and that's the worst-case scenario in summer. When I get higher static-pressure fans (2.7 mm H2O vs. the current 1.5 mm H2O) it will change a lot.


About the soldering: someone leaked an IVB-E delidding and it showed proper solder under the heatspreader.


----------



## radrok (Aug 16, 2013)

TheHunter said:


> Yep its how it performs in real world scenario.
> 
> 
> I ran IBT, linx, Prime95 it and yeah it was 80-90C, so? I will never ever see that in any app. or game so im not bothered with those tools.. Even intel said its useless and not a real stability indicator.. And its true, I passed all torture tests and yet failed in Bf3 lol
> ...



If you manage to pass 1 hr of LinX with AVX enabled, I'm pretty sure you won't find anything that makes your PC crash.

I use that as my stability test and I've never had my PC hang after passing it.

Beware, it shoots your temps to the moon (unlike non-AVX LinX); it takes my 3930K up to 80C on a gigantic custom water loop.


----------



## Mydog (Aug 31, 2013)

Not that big a difference from the 3960X, and it needs a lot more vcore; my experience matches this guy's:

http://m.hardocp.com/article/2013/08/30/intel_ivy_bridgee_core_i74960x_ipc_ocing_review/6#.UiHP7NJkNyx


----------



## dj-electric (Aug 31, 2013)

Ditto here


----------

