# Core i7-4960X "Ivy Bridge-E" Roughly 10% Faster than i7-3970X: Early Tests



## btarunr (Apr 25, 2013)

PC enthusiast "Toppc" of Coolaler.com, with access to a Core i7 "Ivy Bridge-E" sample clocked to match the specifications of the Core i7-4960X, wasted no time in comparing the chip to a Core i7-3970X "Sandy Bridge-E." The two chips share the common LGA2011 socket and run on motherboards based on the Intel X79 Express chipset. An MSI X79A-GD45 Plus with the V17.1 BIOS was used to run both chips. Among the tests Toppc put the chip through are overclocker favorites SuperPi mod 1.6, CPU Mark '99, wPrime 1.63, Cinebench 11.5, 3DMark Vantage (CPU score), and 3DMark 06 (CPU score).

The Ivy Bridge-E chip outperformed its predecessor by roughly 5-10 percent across the board. In Cinebench, the i7-4960X scored 10.94 points to the i7-3970X's 10.16; SuperPi 32M was crunched by the i7-4960X in 9m 22.6s compared to the i7-3970X's 9m 55.4s; CPU Mark scores between the two are 561 vs. 533; the 3DMark Vantage CPU score is 38,644 points vs. 35,804; and 3DMark 06 scores are 8,586 points vs. 8,099 points. In wPrime, the i7-4960X crunched 32M in 4.601s, compared to its predecessor's 5.01s. Below are the test screenshots; note that they're high-resolution images, so please open each in a new tab.
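For reference, the reported scores work out to the following per-test gains; a quick sketch using only the numbers quoted above:

```python
# Percentage gains of the i7-4960X over the i7-3970X, recomputed from the
# scores quoted above. Values are (4960X, 3970X) pairs.
higher_is_better = {
    "Cinebench 11.5": (10.94, 10.16),
    "CPU Mark '99": (561, 533),
    "3DMark Vantage CPU": (38644, 35804),
    "3DMark 06 CPU": (8586, 8099),
}
lower_is_better = {
    "SuperPi 32M (s)": (9 * 60 + 22.6, 9 * 60 + 55.4),
    "wPrime 32M (s)": (4.601, 5.01),
}

for name, (new, old) in higher_is_better.items():
    print(f"{name}: +{(new / old - 1) * 100:.1f}%")
for name, (new, old) in lower_is_better.items():
    # For timed runs, the speedup is how much longer the old chip took.
    print(f"{name}: +{(old / new - 1) * 100:.1f}%")
```

Every test lands between roughly 5 and 9 percent, which is where the "5-10 percent across the board" figure comes from.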



*Cinebench 11.5*




*SuperPi and CPU Mark*




*3DMark Vantage CPU score*




*3DMark 06 CPU score and WPrime 1.63*




*View at TechPowerUp Main Site*


----------



## Jorge (Apr 25, 2013)

NOT very impressive but we knew this already.


----------



## Nordic (Apr 25, 2013)

Agreed. Not really a surprise.


----------



## CameronBanna (Apr 25, 2013)

*overclocking?*

How's the overclocking on this chip?


----------



## LAN_deRf_HA (Apr 25, 2013)

Are these soldered? That would be just about the only interesting data point, seeing if a soldered Ivy runs cooler than a soldered Sandy, but beyond that we knew the performance of this chip before it even had a name.


----------



## HumanSmoke (Apr 25, 2013)

DDR3-1600 at 11-11-11-28 ? 
Really pushing the envelope.


----------



## RejZoR (Apr 25, 2013)

If Intel will keep on doing these idiotic 10% bumps for every series at full price every time, they can have them. Lazy money milking bastards.


----------



## Inceptor (Apr 25, 2013)

RejZoR said:


> If Intel will keep on doing these idiotic 10% bumps for every series at full price every time, they can have them. Lazy money milking bastards.



Sandy Bridge, in early 2011, was a huge jump in performance, and I think it has skewed everyone's expectations.
Intel has been squeezing optimizations and modifications of the same basic architecture for quite a few years now.  It should not be surprising that the performance bump is now in the 5-10% range for each new iteration.  Especially when taking into consideration the problems encountered when shrinking the design to smaller and smaller process nodes.


----------



## HumanSmoke (Apr 25, 2013)

RejZoR said:


> If Intel will keep on doing these idiotic 10% bumps for every series at full price every time, they can have them. Lazy money milking bastards.


Well, it's not as if you *HAVE* to buy it.
Judging by your system specs, you've already proven that you can readily do without Gulftown, Sandy Bridge, Ivy Bridge, and Sandy Bridge-E (not to mention Bulldozer and Piledriver).

Just as a point of interest:
Core i7 920...$284 four+ years ago
Core i7 3820...$300 now

Seems like a reasonable progression in performance given the fiercely competitive x86 market


----------



## Prima.Vera (Apr 25, 2013)

fiercely competitive ?? )


----------



## RejZoR (Apr 25, 2013)

HumanSmoke said:


> Well, it's not as if you *HAVE* to buy it.
> Judging by your system specs, you've already proven that you can readily do without Gulftown, Sandy Bridge, Ivy Bridge, and Sandy Bridge-E (not to mention Bulldozer and Piledriver).
> 
> Just as a point of interest:
> ...



The difference is that the Core i7 (Nehalem) shook up the entire CPU industry at the time and is still going strong after all these years (mainly because of 4 cores + 4 HT cores). Can't say the same for anything released after Nehalem...


----------



## birdie (Apr 25, 2013)

Seems like I'm the only person who's noticed that the 4960X has a higher base frequency, and supposedly a higher turbo frequency.

Which kinda negates the enthusiasm over the new CPUs since the older ones can be OC'ed.


----------



## arterius2 (Apr 25, 2013)

RejZoR said:


> The difference is that the Core i7 (Nehalem) shook up the entire CPU industry at the time and is still going strong after all these years (mainly because of 4 cores + 4 HT cores). Can't say the same for anything released after Nehalem...



You can't expect Intel to "shake up the entire CPU industry" with every release, especially with AMD out of the picture; they would need to come out with an entirely new architecture. Come to your senses.


----------



## HumanSmoke (Apr 25, 2013)

Prima.Vera said:


> fiercely competitive ?? )



I was toying with  rather than , but that seemed akin to waving a red rag at a bull (dozer?).


----------



## NeoXF (Apr 25, 2013)

Here's hoping AMD has a 5 or 6 module Steamroller up the pipe for us by the end of the year, that should make short work of this... hopefully for less than half the price. Otherwise there are no upsides to these almost pathetic speed-bumps...


----------



## Frick (Apr 25, 2013)

RejZoR said:


> The difference is that the Core i7 (Nehalem) shook up the entire CPU industry at the time and is still going strong after all these years (mainly because of 4 cores + 4 HT cores). Can't say the same for anything released after Nehalem...



The same story as with Core 2 then. As that other guy said, come to your senses.

Anyway, helllooo my new desktop! If only.


----------



## NeoXF (Apr 25, 2013)

I think we can sum it up as... Core 2 was great, Nehalem/the first generation of "i" CPUs were great, and Sandy Bridge (the second generation of "i" CPUs) was great... but now Ivy Bridge is mediocre, and by the looks of it Ivy Bridge-E and Haswell are mediocre as well.

Hell, all in all, ignoring the fact that they're behind, I'd say AMD has made bigger advancements (not necessarily only performance-wise) than Intel, in these last 2 years or so...


----------



## boogerlad (Apr 25, 2013)

A 10% IPC boost is better than the pathetic Sandy Bridge → Ivy Bridge IPC boost on LGA1155. This should be equivalent to Haswell in IPC, I think.


----------



## Aquinus (Apr 25, 2013)

HumanSmoke said:


> Well, it's not as if you *HAVE* to buy it.
> Judging by your system specs, you've already proven that you can readily do without Gulftown, Sandy Bridge, Ivy Bridge, and Sandy Bridge-E (not to mention Bulldozer and Piledriver).
> 
> Just as a point of interest:
> ...



You need to check Micro Center, because both the 3820 and the 3770K are going for 230 USD if you can make your way into a store.


----------



## xenocide (Apr 25, 2013)

NeoXF said:


> I think we can sum it up as... Core 2 was great, Nehalem/the first generation of "i" CPUs were great, and Sandy Bridge (the second generation of "i" CPUs) was great... but now Ivy Bridge is mediocre, and by the looks of it Ivy Bridge-E and Haswell are mediocre as well.
> 
> Hell, all in all, ignoring the fact that they're behind, I'd say AMD has made bigger advancements (not necessarily only performance-wise) than Intel in these last 2 years or so...



You're comparing apples and oranges.

Core 2 was great; it gave Intel a huge lead over AMD.  But you're forgetting tick-tock.  Conroe was a game changer, but was Wolfdale (E6xxx vs. E8xxx) a massive game changer?  No, it wasn't.  What about the difference between Kentsfield (Q6xxx) and Yorkfield (Q7/8/9xxx)?  Also not too massive.  Then with Nehalem they got a little wonky with it.  The fact is, Ivy Bridge was never intended to be a massive improvement on Sandy Bridge; Sandy Bridge _was_ the massive improvement.  If you remember, Sandy Bridge matched Nehalem's top dog for a fraction of the price; there's a reason the things flew off the shelves.


----------



## Jacez (Apr 25, 2013)

xenocide said:


> You're comparing apples and oranges.
> 
> Core 2 was great; it gave Intel a huge lead over AMD.  But you're forgetting tick-tock.  Conroe was a game changer, but was Wolfdale (E6xxx vs. E8xxx) a massive game changer?  No, it wasn't.  What about the difference between Kentsfield (Q6xxx) and Yorkfield (Q7/8/9xxx)?  Also not too massive.  Then with Nehalem they got a little wonky with it.  The fact is, Ivy Bridge was never intended to be a massive improvement on Sandy Bridge; Sandy Bridge _was_ the massive improvement.  If you remember, Sandy Bridge matched Nehalem's top dog for a fraction of the price; there's a reason the things flew off the shelves.



Right, but if Haswell, which is supposed to be the big improvement, is only 10% better, then Intel fails.


----------



## HumanSmoke (Apr 25, 2013)

Aquinus said:


> You need to check Micro Center, because both the 3820 and the 3770K are going for 230 USD if you can make your way into a store.


Too long a walk I'm afraid - I live in New Zealand...although at those prices it does make me acutely aware of the cost of living at the last stop before Antarctica.

At $230, RejZor's argument is taking on more water than a cardboard submarine.


----------



## Fourstaff (Apr 25, 2013)

Jacez said:


> Right, but if Haswell, which is supposed to be the big improvement, is only 10% better, then Intel fails.



The graphics department, which is the main focus of Haswell (among other things), will most definitely be more than 10% better.


----------



## BiggieShady (Apr 25, 2013)

Fourstaff said:


> The graphics department, which is the main focus of Haswell (among other things), will most definitely be more than 10% better.



In that case it's clear that the tick-tock cycle is no longer relevant for per-core performance expectations, since Intel can choose the focus of each microarchitecture improvement: CPU cores or the iGPU. Essentially, Haswell is a "tock" only for the iGPU.


----------



## badtaylorx (Apr 25, 2013)

I think the problem here is just that Sandy Bridge was _THAT_ good!!!


----------



## RejZoR (Apr 25, 2013)

HumanSmoke said:


> Too long a walk I'm afraid - I live in New Zealand...although at those prices it does make me acutely aware of the cost of living at the last stop before Antarctica.
> 
> At $230, RejZor's argument is taking on more water than a cardboard submarine.



The 3770K is 309 EUR here, and that's from the cheapest importer for my country, where I buy most of my components. Still taking on water like a cardboard submarine? And if you do this every time for a mere 10% boost... do the math...

I'm placing big hopes on the Skylake architecture when it arrives sometime next year (probably), but who knows.


----------



## Fourstaff (Apr 25, 2013)

RejZoR said:


> The 3770K is 309 EUR here, and that's from the cheapest importer for my country, where I buy most of my components. Still taking on water like a cardboard submarine? And if you do this every time for a mere 10% boost... do the math...
> 
> I'm placing big hopes on the Skylake architecture when it arrives sometime next year (probably), but who knows.



How much was the 920 in your country back when it was the most popular chip? Without knowing that, there is no comparison. Regardless of what it costs, even if the 920 was cheaper than €309, the problem lies with your country's importers, not Intel. On top of that, no one forces you to shell out every time Intel releases a new chip. New chip releases work wonders only for people either looking for more performance regardless of cost, or those with older chips (still rocking their E8xxx chips etc.).


----------



## RejZoR (Apr 25, 2013)

Dug up the order email from June 2009. The Core i7 920 D0 that I bought was 262 EUR at the time (same store). And I know I paid a tiny bit extra just to get the D0 stepping specifically. It'll soon be exactly 4 years since I bought it, and I still think it's superb. I paired it with an Antec H2O 920 cooler, overclocked it, and it's up to every gaming task that I have and all my compression/encoding needs. 8 threads at 4+ GHz is not that weak despite the CPU's age.

Granted, no one forces me to shell out that amount of money, but with such tiny boosts it makes even less sense. I was looking at the Q6600 back then and decided to shell out some more for the newer Core i7 920. And I made a great decision. I don't think any of the current CPUs would last as long as this one has.


----------



## AsRock (Apr 25, 2013)

HumanSmoke said:


> Too long a walk I'm afraid - I live in New Zealand...although at those prices it does make me acutely aware of the cost of living at the last stop before Antarctica.
> 
> At $230, RejZor's argument is taking on more water than a cardboard submarine.



And even living in America doesn't mean you'll be able to get to one; our closest is 315 miles away LMAO


----------



## Fourstaff (Apr 25, 2013)

RejZoR said:


> Dug up the order email from June 2009. The Core i7 920 D0 that I bought was 262 EUR at the time (same store). And I know I paid a tiny bit extra just to get the D0 stepping specifically. It'll soon be exactly 4 years since I bought it, and I still think it's superb. I paired it with an Antec H2O 920 cooler, overclocked it, and it's up to every gaming task that I have and all my compression/encoding needs. 8 threads at 4+ GHz is not that weak despite the CPU's age.
> 
> Granted, no one forces me to shell out that amount of money, but with such tiny boosts it makes even less sense. I was looking at the Q6600 back then and decided to shell out some more for the newer Core i7 920. And I made a great decision. I don't think any of the current CPUs would last as long as this one has.



In trays of 1,000, the i7 920 sold for USD 284 in 2009, which is about €200 at the June 2009 exchange rate. You paid €262, which is about a 30% markup.
In trays of 1,000, the 3770K is selling for $332, which is about €255. The shop is selling it for €309, roughly a 20% markup.

The performance difference between the two is much more than 20% in most cases (with or without an overclock), so I would say that you are not getting an inferior product in any way (from 1,000-unit tray prices, to exchange rates, to markup). Inflation has not been taken into account yet. On top of that, it's cheaper to assemble a system with a 3770K than a 920 system; IIRC it cost about $1,500 for a fully functioning 920 system (with graphics card etc.), while you will need to shell out about $1,000 for an equivalent system (equivalent used extremely loosely here).

Have you considered the 3570K instead? It's much cheaper than the 3770K, and performance-wise not too far behind. Granted, the upgrade from the 920 to the 3570K is not as significant, but that is largely due to the strength and performance of the 920 more than anything else.

http://www.anandtech.com/bench/Product/47?vs=551
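The markup arithmetic above can be sanity-checked in a few lines. A sketch; the USD-per-EUR rates (~1.41 for June 2009, ~1.30 for April 2013) are my approximations, not figures from the post:

```python
# Retail markup over Intel's 1000-unit tray price, in percent.
def markup_pct(shop_eur: float, tray_usd: float, usd_per_eur: float) -> float:
    tray_eur = tray_usd / usd_per_eur  # convert the tray price to euros
    return (shop_eur / tray_eur - 1) * 100

print(f"i7 920:   {markup_pct(262, 284, 1.41):.0f}% markup")  # ~30%
print(f"i7 3770K: {markup_pct(309, 332, 1.30):.0f}% markup")  # ~21%
```

Both come out close to the 30% and 20% figures quoted, with the residual difference down to the exchange rate assumed.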


----------



## mlee49 (Apr 25, 2013)

RejZoR said:


> If Intel will keep on doing these idiotic 10% bumps for every series at full price every time, they can have them. Lazy money milking bastards.



Good news is the 3K-series chips will come down in price some.

edit: I'm still on X58; my 970 hex-core has just been great for the last 3 years.


----------



## ensabrenoir (Apr 25, 2013)

NeoXF said:


> Here's hoping AMD has a 5 or 6 module Steamroller up the pipe for us by the end of the year, that should make short work of this... hopefully for less than half the price. Otherwise there are no upsides to these almost pathetic speed-bumps...






Seriously though, the entire CPU landscape is changing... raw power is not the dominant factor anymore; 10% is more than enough for anything out there. Better temps and power consumption are king right now. Software is years behind, and it'll probably take a 20-core AMD to match it. Most Intel users are fine with this, and with paying for a new board. ...There is not a lot of price haggling on Porsche lots; this is high-end desktop. Mainstream is for the value-minded... and mainstream Intel usually beats high-end AMD.


----------



## midnightoil (Apr 25, 2013)

~2 year wait for a ~10% performance increase.  That's pretty dire.

Shame that the desktop/enthusiast 'E' line only exists as a dumping ground for defective or very low bin Xeon-E parts.  If this weren't the case, Intel might have actually skipped straight to Haswell-E ... but if they'd done that, nowhere to dump.

Really hope Steamroller and Excavator are on time and up to expectations.  It's become patently clear that Intel won't make any push unless absolutely forced to.


----------



## midnightoil (Apr 25, 2013)

LAN_deRf_HA said:


> Are these soldered? That would be just about the only interesting data point, seeing if a soldered Ivy runs cooler than a soldered Sandy, but beyond that we knew the performance of this chip before it even had a name.



They are soldered, yes.  'E' enthusiast / desktop parts are rebranded low-binned / defective Xeon-E parts.


----------



## arterius2 (Apr 25, 2013)

meh, 10% is good enough for me to shell out


----------



## Octavean (Apr 25, 2013)

I think the reality is that Intel's priorities have changed somewhat.  

AMD x86/x64 chips are no longer much of a threat, but ARM processors are selling like mad and the market for ARM-based hardware is still growing, whereas the PC industry is seeing negative growth. Intel knows they need to answer the ARM initiative, and a big way of doing that is with decent performance and power efficiency, not raw processing power. So this is likely where Intel is focusing its efforts.

Intel should be able to engineer monster-performance chips for desktops, workstations, servers and so on while engineering power-efficient chips for tablets, phones and so on... but that doesn't mean they won't cut corners by focusing on power efficiency on platforms that would benefit more from raw power...

***edit***

Also, the upcoming Intel Core i7 4960X Ivy Bridge-E processor is likely to cost just as much as its predecessor, in the ~$1,000+ USD range. I'm not willing to spend that kind of money on a processor, and judging from the system specs in this thread, not many (if any) people here would either. It's all very academic to argue about the finer points of something you'll likely never buy...


----------



## midnightoil (Apr 25, 2013)

Octavean said:


> I think the reality is that Intel's priorities have changed somewhat.
> 
> AMD x86 / x64 chips are no longer much of a threat but ARM processors are selling like mad and the market for ARM based hardware is still growing. Whereas the PC industry is seeing negative growth.  Intel knows they need to answer the ARM initiative. A big way of doing that is with decent performance and power efficiency not raw processing power.  So this is likely where Intel is focusing their efforts.
> 
> Intel should be able to engineer monster performance chips for desktop, workstations, servers and so on while engineering power efficient chips for tablets, phones and so on,.... but that doesn't mean that they wont cut corners by focusing on power efficiency on the platforms that would better benefit from raw power,..



There's no possible way for Intel to match ARM though, and they know that.  The only way they can even come close is using vastly more expensive, complicated and smaller processes ... but they're still a long way behind in performance per watt and massively behind in performance per $.

There's no way that Intel can make any inroads into the phone, tablet or embedded devices market, and ARM will continue to chew through vast swathes of the server (and soon workstation) market. The rate of market-share loss is likely to increase exponentially for Intel once 64-bit ARM chips start making their way into systems later this year.

It's going to get to the point soon where they either admit complete defeat or they start making more use of their existing ARM licenses (Intel are heavy licensees of ARM IP, contrary to popular belief).

The main reason Intel aren't pushing the envelope in desktop or E Xeons is because whilst they have a reasonable lead over AMD in absolute performance and performance per watt, they need the absolute maximum return possible on the minimum investment.  Outside of the SSD business, their margins and marketshares are dropping like a stone everywhere else.


----------



## Octavean (Apr 25, 2013)

Intel may make no inroads into the ARM segment of the market, but that doesn't mean they won't try. The same goes for Microsoft and their efforts with Windows 8 and Windows RT with respect to the mobile market.


----------



## midnightoil (Apr 25, 2013)

Fourstaff said:


> In trays of 1,000, the i7 920 sold for USD 284 in 2009, which is about €200 at the June 2009 exchange rate. You paid €262, which is about a 30% markup.
> In trays of 1,000, the 3770K is selling for $332, which is about €255. The shop is selling it for €309, roughly a 20% markup.
> 
> The performance difference between the two is much more than 20% in most cases (with or without an overclock), so I would say that you are not getting an inferior product in any way (from 1,000-unit tray prices, to exchange rates, to markup). Inflation has not been taken into account yet. On top of that, it's cheaper to assemble a system with a 3770K than a 920 system; IIRC it cost about $1,500 for a fully functioning 920 system (with graphics card etc.), while you will need to shell out about $1,000 for an equivalent system (equivalent used extremely loosely here).
> ...



The i7-920 was a much, much more expensive chip to produce, at the time, than the i7-3770K is now; there is a much, much higher margin for Intel.


----------



## RejZoR (Apr 25, 2013)

I don't care what the price was for 1,000 pieces. I don't buy a thousand of them; I only need one. And the price of one is never this low...


----------



## ensabrenoir (Apr 25, 2013)

midnightoil said:


> There's no possible way for Intel to match ARM though, and they know that.  The only way they can even come close is using vastly more expensive, complicated and smaller processes ... but they're still a long way behind in performance per watt and massively behind in performance per $.
> 
> There's no way that Intel can make any inroads into the phone, tablet or embedded devices market, and ARM will continue to chew through vast swathes of the server (and soon workstation) market. The rate of market-share loss is likely to increase exponentially for Intel once 64-bit ARM chips start making their way into systems later this year.
> 
> ...



True... the landscape is changing, hence why Intel's greater focus and resources will be elsewhere. But they do have the resources, so I wouldn't count them out just yet. ARM... was a larger pivot point than many realize... it's evolve or die, and I'm sure Intel knows this.
My new rig is an X79 in a Cosmos II (only posted in the "your PC ATM" thread); I got a 3820 (on an ASRock X79 Extreme9) to hold me over until the 6-core Ivy-E comes out.


----------



## EarthDog (Apr 25, 2013)

rejzor said:


> I don't think any of the current cpu's would last as long as this one did.


b o l o g n a.


----------



## HD64G (Apr 25, 2013)

I cannot understand why everyone thinks there is a 10% bump in IPC! There isn't! The max gain is almost 10%; in the majority of the tests the gain is 5-7%. And that is sad, since the clocks are the same as IB. The total gain isn't enough to move someone to upgrade. I hope AMD's SR is what's expected; only then will Intel bring faster CPUs or lower the prices.


----------



## tastegw (Apr 25, 2013)

Stock tests on these chips don't mean a whole lot; show us the overclocking comparison between the two on non-engineering samples. That would make for a better article.


----------



## Mindweaver (Apr 25, 2013)

10 percent over the old chip will be good if it overclocks as well. I mean, a 10% increase, plus being able to bump it up another 10% with an overclock, would be a great upgrade; but if there isn't any overclocking headroom, then ho hum... I'll wait to upgrade my i7 2600K.


----------



## buggalugs (Apr 25, 2013)

Intel killed the highend for me with sandy-e and X79. Ivy-e doesn't look much better. Haswell looks a much better option.


----------



## Octavean (Apr 25, 2013)

HD64G said:


> I cannot understand why everyone thinks there is a 10% bump in IPC! There isn't! The max gain is almost 10%; in the majority of the tests the gain is 5-7%. And that is sad, since the clocks are the same as IB. The total gain isn't enough to move someone to upgrade. I hope AMD's SR is what's expected; only then will Intel bring faster CPUs or lower the prices.



This is an Extreme class processor on an enthusiast level platform (currently X79). The prices are not going to go down.


----------



## ensabrenoir (Apr 25, 2013)

HD64G said:


> I cannot understand why everyone thinks there is a 10% bump in IPC! There isn't! The max gain is almost 10%; in the majority of the tests the gain is 5-7%. And that is sad, since the clocks are the same as IB. The total gain isn't enough to move someone to upgrade. *I hope AMD's SR is what's expected; only then will Intel bring faster CPUs or lower the prices.*



...oh geeesh, let's not start that again


----------



## Sasqui (Apr 25, 2013)

birdie said:


> Seems like I'm the only person who's noticed that the 4960X has a higher base frequency, and supposedly a higher turbo frequency.
> 
> Which kinda negates the enthusiasm over the new CPUs since the older ones can be OC'ed.



You're not the only one who noticed.

3.5 GHz (3970X) vs. 3.6 GHz (4960X) base clock, and not being able to see the motherboard for the 4960X, makes the whole article from coolaler.com WORTHLESS.


----------



## radrok (Apr 25, 2013)

If Intel wants us X79 hexa-core users to upgrade, they need to give us either an unlocked 8-core CPU or an unlocked-QPI 6-core to play with on enthusiast 2P mobos.

Not worth draining my loop for a 5% increase in IPC.

Might as well bump my 3930K to 5.3 GHz and call it a day until Haswell-E comes.


----------



## 15th Warlock (Apr 25, 2013)

IB-E doesn't excite me in the least; I hate how the "extreme" processors are almost a year behind the mainstream processors in terms of architecture releases...

Haswell, on the other hand, would've been a better upgrade option, but I read Haswell-E will support DDR4, so there's simply no chance it'll work on X79 boards; by the looks of it, we won't be seeing it until mid-2014 anyway...


----------



## Octavean (Apr 25, 2013)

radrok said:


> If Intel wants us X79 hexa-core users to upgrade, they need to give us either an unlocked 8-core CPU or an unlocked-QPI 6-core to play with on enthusiast 2P mobos.
> 
> Not worth draining my loop for a 5% increase in IPC.
> 
> Might as well bump my 3930K to 5.3 GHz and call it a day until Haswell-E comes.



If you look here:

http://www.techpowerup.com/182238/Intel-Core-i7-quot-Ivy-Bridge-E-quot-HEDT-Lineup-Detailed.html

You'll see that the entry-level Ivy Bridge-E processor is listed as a Core i7 4820*K*, which, if correct, could mean all three processors in the lineup are fully unlocked.


That might mean something... or not...


----------



## qwerty_lesh (Apr 25, 2013)

I'd be quite happy still on my old 920 if I didn't have the opportunity to shift up to a 3930K free of cost.

The way I see it, Ivy-E was never going to be appealing enough to upgrade to.

It's just like Gulftown was on the Tylersburg platform: nice to know it's out there, but not something to actually go out and spend on.

The great thing now is that if you have practically anything from the last several generations with good processing power, you can sit on it for many years and be happy.

Conroe/Wolfdale, Nehalem and Sandy Bridge were all big steps forward, and all quite sufficient for almost everybody.

To me, this is not really bad news at all; it doesn't faze me that it's not a leap forward.
The same goes for Haswell: I can see why many will want to see a big performance jump when it's out, but if it doesn't happen, personally I'm not really concerned.

Oh, I need to update my specs on here. lawl


----------



## Hayder_Master (Apr 25, 2013)

Seems no different at all. Thanks for the extra 100 MHz, which makes all the difference.


----------



## Fourstaff (Apr 25, 2013)

radrok said:


> Might as well bump my 3930K to 5.3 GHz and call it a day until Haswell-E comes.



Best decision for almost all users with a chip more powerful than the 920. After all, Intel's focus is no longer brute power at all costs; they are taking a breather to balance other factors like the iGPU, power consumption, etc.


----------



## radrok (Apr 25, 2013)

Fourstaff said:


> Best decision for almost all users with a chip more powerful than the 920. After all, Intel's focus is no longer brute power at all costs; they are taking a breather to balance other factors like the iGPU, power consumption, etc.



If my RE2 hadn't crapped out I'd still be using my Gulftown chip.

I literally kept my 3930k inside its box for 6 months before upgrading.


----------



## Octavean (Apr 25, 2013)

Anyone who had an LGA2011-based Sandy Bridge-E processor probably didn't expect much of a performance bump from Ivy Bridge-E.

Given the nomenclature it makes sense, too, considering Sandy Bridge to Ivy Bridge on the same LGA1155 socket wasn't earth-shattering either; but at least there was an upgrade for the socket before moving on...

That's no small point either, because think about the bitching and complaining that occurs when there are no upgrades before moving on to a new socket...

At least the option is there.

Also note that not everyone is upgrading. Sometimes people and businesses find a need for an additional system or systems. That's additive, so what would you buy if you're in need of another system: the same old Sandy Bridge-E or the new Ivy Bridge-E? In that case I would probably buy the new Ivy Bridge-E... but I still have to see retail product / reviews first.

Keep in mind that we have heard Haswell will ship with the USB bug / older-stepping chipset initially...


----------



## Dent1 (Apr 25, 2013)

HD64G said:


> I cannot understand why everyone thinks there is a 10% bump in IPC! There isn't! The max gain is almost 10%; in the majority of the tests the gain is 5-7%. And that is sad, since the clocks are the same as IB. The total gain isn't enough to move someone to upgrade.



Ivy Bridge-E isn't meant to be an upgrade from Sandy Bridge or Ivy Bridge builds. It's for new builds, for people on the Core 2 Duo or Phenom II generation or older wanting to jump on the fastest available. You don't need to jump on the latest architecture every round!




HD64G said:


> I hope AMD's SR is what's expected; only then will Intel bring faster CPUs or lower the prices.



It doesn't work like that. Intel could release the fastest CPU on the planet, but the gains will always be disappointing without the proper software optimisation. We saw this with Bulldozer: on paper it should have outperformed Sandy Bridge, but without software support the results didn't materialise.

Steamroller won't change much. Intel can have a slower CPU priced higher and it'll still generate just as many sales. We've seen this in history with the Athlon/Athlon XP/Duron/Sempron vs. the P3/Celeron/P4/Pentium D. Lower performance doesn't always mean fewer sales or lower prices for Intel.


----------



## Aquinus (Apr 25, 2013)

Dent1 said:


> You don't need to jump on the latest architecture every round!



Just to prove your point a bit more, even that statement is incorrect: IVB-E is not a new architecture, it is a die shrink. SB-E and IVB-E will, for the most part, work exactly the same (like SB and IVB), but there are a couple of different features and smaller circuitry inside the CPU. That's it. Nothing earth-shattering, nothing special, just a die shrink.


----------



## Nordic (Apr 25, 2013)

midnightoil said:


> There's no way that Intel can make any inroads into the phone, tablet or embedded devices market



I don't know about that. AMD has some really promising low-power x86 chips coming that look really good on paper. It has yet to be shown whether they can compete with ARM, but if AMD can make them, so can Intel.


----------



## Octavean (Apr 25, 2013)

james888 said:


> I don't know about that. AMD has some really promising low-power x86 chips coming that look really good on paper. It has yet to be shown whether they can compete with ARM, but if AMD can make them, so can Intel.



But AMD is also just as likely to slap some really good graphics on an ARM SoC, since they have much less to lose from the success of ARM than Intel does.


----------



## Jstn7477 (Apr 25, 2013)

Hey guys, have you considered that there is probably a nice power consumption decrease with these chips vs. SB-E? Apparently nobody seems to understand that what Ivy Bridge does with 80W is comparable to what Sandy Bridge does at around 130W (my 3770K @ 4.3GHz/1.18V vs. my 2600K @ 4.3GHz/1.32V). Just saying...


----------



## EarthDog (Apr 25, 2013)

SB = 95W while IB = 77W. Perhaps IB-E will fit into a sub-100W package. On the surface a ~19% decrease is nice; however, unless you are running the thing 24/7... it doesn't translate to much dollar-wise.
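EarthDog's dollar point is easy to check with back-of-envelope arithmetic. The sketch below uses the TDP figures from the post, a worst-case 24/7 full load, and an assumed (hypothetical) electricity price of $0.12 per kWh; real duty cycles and rates will differ:

```python
# Annual electricity cost difference between a 95W and a 77W TDP chip,
# assuming constant full load 24/7 and an assumed price of $0.12/kWh.
TDP_SB, TDP_IB = 95, 77          # watts (TDP figures from the post)
PRICE_PER_KWH = 0.12             # USD -- assumption, varies by region
HOURS_PER_YEAR = 24 * 365

def annual_cost(watts):
    """Cost of running a constant load of `watts` for one year."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * PRICE_PER_KWH

saving = annual_cost(TDP_SB) - annual_cost(TDP_IB)
print(f"Annual saving at 24/7 load: ${saving:.2f}")  # prints $18.92
```

Even in this worst case the gap is under $20 a year; at a typical desktop duty cycle it would be far smaller, which is the point being made.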


----------



## radrok (Apr 25, 2013)

Still not worth upgrading based on power consumption.

I bet these chips will draw a ton of power too if kept clocked at what I run my 3930K at (5GHz); we are talking upwards of 300W.


----------



## EarthDog (Apr 25, 2013)

I bet it will draw less than your 3930K...


----------



## radrok (Apr 25, 2013)

I agree on that, but will the difference be worth it?


----------



## buggalugs (Apr 25, 2013)

Aquinus said:


> Just to prove your point a bit more, even this statement is incorrect. IVB-E is not a new architecture. It is a die shrink. SB-E and IVB-E for the most part will work exactly the same (like SB and IVB,) but there are a couple different features and smaller circuitry inside the CPU. That's it. Nothing earth shattering, nothing special, just simply a die shrink.



Well, it's not just a die shrink; Ivy Bridge uses tri-gate transistors, and that's a pretty major design change.

The biggest improvements are likely memory performance/latency, power consumption, and overclocking.

The biggest letdown for me is Intel still relying on X79 boards for this new chip: an unfinished platform with no Intel USB 3.0, only 2 Intel SATA 6Gb/s ports, etc. It's weird when the mainstream platform has better motherboard features than the high-end that costs twice as much.


----------



## Aquinus (Apr 26, 2013)

buggalugs said:


> Well its not just a die shrink, ivy bridge uses tri-gate transistors, that's a pretty major design change.



Which has no impact on performance. This is a heat/power optimization more than anything else; also, I think Intel would be hard-pressed not to use a multi-gate transistor at 22nm, considering the physical limitations of circuits that small.



buggalugs said:


> The biggest improvement is likely memory performance/latency, power consumption, and overclocking.


Power consumption is down thanks to the multi-gate transistors and the die shrink.
Overclocking isn't any better than SB. IVB just has better IPC, so each MHz goes a bit further (like 10% further); so even though you might not get clocks as high as on a SB chip, you're getting more work done, because it's doing 10% more in the same amount of time at the same frequency.
You can thank the die shrink for the better memory latencies too.

So yeah, most of the performance benefits came from the die shrink. The power consumption improvements come from both the shrink and the multi-gate transistors.
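The IPC-versus-clock trade-off described above reduces to simple arithmetic: effective throughput is roughly IPC × clock. A toy sketch with illustrative (not measured) numbers:

```python
# Sketch of the point above: throughput ~ IPC * clock.
# The IPC values and clocks below are illustrative assumptions,
# not benchmark results.
def throughput(ipc, ghz):
    # billions of instructions retired per second
    return ipc * ghz

sb  = throughput(1.00, 4.8)  # hypothetical SB chip at a higher overclock
ivb = throughput(1.10, 4.4)  # hypothetical IVB chip: +10% IPC, lower clock

# Despite the 400MHz clock deficit, the higher-IPC chip gets more done.
print(ivb > sb)  # prints True (4.84 vs 4.80)
```

The same relation explains why a ~10% IPC gain at identical clocks shows up directly as the ~5-10% benchmark deltas in the article.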



buggalugs said:


> The biggest letdown for me is Intel still relying on X79 boards for this new chip. an unfinished platform with no Intel USB 3.0 only 2 Intel sata 6Gb/s etc. Its weird when the mainstream platform has better motherboard features than the highend that costs twice as much.



Don't call it a letdown unless you own one and have legitimately been let down by it. I'm perfectly happy with my X79 board, and I think most people who insult skt2011 don't really know what they're talking about. I find it astonishing that people complain about really stupid things like X79 not having enough SATA 6Gb/s ports or not many USB 3.0 ports (mine has 6 on the back, plus headers for another 4, so that's a matter of opinion) when the CPU has 40 PCI-E lanes. You need more ports? Get a RAID card. They didn't load the CPU full of PCI-E lanes for nothing.

I would also like to see your 3770K use VT-d and run 64GB of RAM like my 3820 can. What about features again?

People complain about X79 when the real powerhouse is SB-E. The PCH does so little now that it hardly matters if you really need more than what it offers. The PCH does enough, and if you need more, you really should get dedicated hardware. Remember, the PCH is on DMI, not QPI or PCI-E; it can only do so much.


----------



## jihadjoe (Apr 26, 2013)

I don't see why anybody is too surprised or disappointed by this considering we've already seen what the transition from Sandy to Ivy did on LGA1155.


----------



## LAN_deRf_HA (Apr 26, 2013)

EarthDog said:


> SB = 95W while IB = 77W. Perhaps IB-E will fit into a sub-100W package. On the surface a ~19% decrease is nice; however, unless you are running the thing 24/7... it doesn't translate to much dollar-wise.



I think power consumption tests have shown that the power difference is much smaller, like 4 watts. The TDPs aren't that relevant these days.


----------



## Patriot (Apr 26, 2013)

HumanSmoke said:


> DDR3-1600 at 11-11-11-28 ?
> Really pushing the envelope.



Nice catch... Cinebench loves high frequency and low latency; CAS 11 DDR3-1600 is no doubt hampering the performance.


----------



## GamerGuy (Apr 26, 2013)

I hopped on the LGA2011 platform at launch, and from what I had read then, IB-E could be a consideration for an upgrade even once LGA1155 is replaced by LGA1150. But looking at what IB-E has to offer (still a hexa-core; IF it had been octo-core, I might be mentally masturbating over this), or the lack thereof, I'd be perfectly happy with the setup I have for a while more. My board has 4x USB 3.0 ports at the back, and I have 2x USB 3.0 ports used on my case. I use these ports mainly for USB 3.0 external HDDs; all other peripheral devices that use USB can make do with the USB 2.0 ports. A couple of the reasons I'd gone LGA2011 are the PCIe lanes on this platform, native 40 PCIe 3.0 lanes from the CPU (my cards doing PCIe 3.0 x16/x8/x16), as well as a host of other features the platform supports.


----------



## AsRock (Apr 26, 2013)

EarthDog said:


> SB = 95W while IB = 77W. Perhaps IB-E will fit into a sub-100W package. On the surface a ~19% decrease is nice; however, unless you are running the thing 24/7... it doesn't translate to much dollar-wise.



Well, I have seen both my i5 and i7 idle around 72-77W, but nowhere near 100W for either of them at idle.

You've got to be careful about how this is tested too, as some motherboards take a load more power even at idle. My Asus Maximus (X38) used to run 190W idle, whereas another board was around 100W with the same chip.



LAN_deRf_HA said:


> I think power consumption tests have shown that the power difference is much smaller, like 4 watts. The TDPs aren't that relevant these days.



It is about 4W when idle, and that's whether HT is disabled or not, comparing my 2 chips.


----------



## jihadjoe (Apr 26, 2013)

RejZoR said:


> I don't think any of the current CPU's would last as long as this one did.



My opinion is totally different. ANY modern CPU will probably last as long, if not longer, because applications aren't getting much more demanding.

I mean, how much can a game demand before any extra CPU is totally irrelevant? If you can map the game world at 60fps, maybe throw in a few cycles for AI and stuff that's in motion, that's about all it's ever going to need. Everything else goes to the GPU.

Office apps, even more so. I can only imagine how many trillions of clock cycles are wasted while Word waits for your next keystroke. The fact of the matter is CPUs now are more than good enough for what we need, and Intel's direction in optimizing toward greater integration and lower power (as opposed to more outright computing power) is totally justified.


----------



## Aquinus (Apr 26, 2013)

AsRock said:


> Well, I have seen both my i5 and i7 idle around 72-77W, but nowhere near 100W for either of them at idle.
> 
> You've got to be careful about how this is tested too, as some motherboards take a load more power even at idle. My Asus Maximus (X38) used to run 190W idle, whereas another board was around 100W with the same chip.



Were you measuring the 8-pin EPS connector to get those numbers? My rig idles at 200-watts but that doesn't mean the CPU is idling at that. According to Cadaveca's review of my board, the VRMs use very little power when the CPU is idle, so the majority of that must be my video cards and hard drives.

This is a good read: http://www.linuxfordevices.com/c/a/...PU-power-consumption-a-challenge-authors-say/


----------



## EarthDog (Apr 26, 2013)

AsRock said:


> Well, I have seen both my i5 and i7 idle around 72-77W, but nowhere near 100W for either of them at idle.
> 
> You've got to be careful about how this is tested too, as some motherboards take a load more power even at idle. My Asus Maximus (X38) used to run 190W idle, whereas another board was around 100W with the same chip.
> 
> ...


I'm more than certain that is your entire SYSTEM idling at that wattage. It's what I idle at with a 3770K at stock with power-saving features on. 



Aquinus said:


> Were you measuring the 8-pin EPS connector to get those numbers? My rig idles at 200-watts but that doesn't mean the CPU is idling at that. According to Cadaveca's review of my board, the VRMs use very little power when the CPU is idle, so the majority of that must be my video cards and hard drives.
> 
> This is a good read: http://www.linuxfordevices.com/c/a/...PU-power-consumption-a-challenge-authors-say/


HDDs are NOTHING at idle (or when spun up, for that matter: several watts). Your GPUs, however, unlike the 7 series, don't drop to a 3W idle state, so I would imagine it's those, the mobo itself, and the CPU.


----------



## LAN_deRf_HA (Apr 27, 2013)

AsRock said:


> It is about 4W when idle, and that's whether HT is disabled or not, comparing my 2 chips.



I meant load. Ivy didn't do much for power savings, certainly less than what the TDP implies. http://media.bestofmicro.com/B/A/334774/original/average%20power.png


----------



## Delta6326 (Apr 27, 2013)

jihadjoe said:


> My opinion is totally different. ANY modern cpu will probably last as long, if not even longer because applications aren't getting much more demanding.
> 
> I mean how much can a game demand before any extra CPU is totally irrelevant? If you can map the game world at 60fps, maybe throw in a few cycles for AI and stuff that's in-motion, that's about all it's ever going to need. Everything else goes to the GPU.
> 
> Office apps, even more so. I can only imagine how many trillions of clock cycles are wasted while Word waits for your next keystroke. The fact of the matter is CPUs now are more than good enough for what we need, and Intel's direction in optimizing toward greater integration and lower power (as opposed to more outright computing power) is totally justified.



I agree with you; any current or next-gen CPU can realistically last for ages (4-8 years). My Q6600 is still rocking. I'm a very light gamer, and people need to realize that Intel is thinking about the other 90% of its sales; energy is a key factor now.

Side note, jihadjoe: what volts are you running on your Q6600? I've been thinking of OC'ing mine. It should last me to the end of this year; then I will go Haswell and wait another 5 years+.


----------



## NeoXF (Apr 27, 2013)

Delta6326 said:


> I agree with you; any current or next-gen CPU can realistically last for ages (4-8 years). My Q6600 is still rocking. I'm a very light gamer, and people need to realize that Intel is thinking about the other 90% of its sales; energy is a key factor now.
> 
> Side note, jihadjoe: what volts are you running on your Q6600? I've been thinking of OC'ing mine. It should last me to the end of this year; then I will go Haswell and wait another 5 years+.



If you've held out this "long", you might as well wait for mainstream Intel hexa-cores (or AMD 5-module+), or at least DDR4... or see how the A12 Kaveri turns out...

Otherwise I'd shrug, thinking nothing has physically changed with such an upgrade: just IPC, new instructions and added Hyper-Threading.

Then again, Q6600 to a Haswell i5/i7 is a 5-6 generation jump... which, AGAIN, makes it even sadder that so little has changed.


----------



## Nordic (Apr 27, 2013)

NeoXF said:


> sad that so little has changed.



Change for change's sake is pointless. What do you think needs changing?


----------



## de.das.dude (Apr 27, 2013)

If I take a guess: it will be 10% faster but cost 50% more?


----------



## Aquinus (Apr 27, 2013)

EarthDog said:


> HDD's are NOTHING at idle (or when spun up for that matter, several watts). Your GPUs however, compared to the 7 series, dont drop to a 3W idle state, so I would imagine its that, the mobo itself, and the CPU



Last I checked, the 6950 wasn't a 7-series card either. We're talking about AsRock's computer, which has a 6000-series card in it, not yours with its power-sipping 7-series card. 

The point is that despite the numbers being low, that's not all the CPU; there are other components that use power, and even if those numbers are low, low numbers add up pretty quickly when your usage isn't a whole lot to begin with.

A hard drive consuming 7 watts on a machine that draws 200 like mine is nothing, but on a computer that draws 70 watts at idle (I'm assuming the drives aren't spinning down), that 7 watts just went from being well under 4% of your power usage to 10% of your consumed power. So those smaller usages impact you more, because the total is already so low.
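The ratio argument works out like this (the wattages are the ones from the post, taken at face value):

```python
# The same 7W drive is a much bigger slice of a low-idle system's draw.
def share(component_w, system_w):
    """Percentage of total system draw attributable to one component."""
    return component_w / system_w * 100

print(f"7W drive in a 200W rig: {share(7, 200):.1f}% of idle draw")  # 3.5%
print(f"7W drive in a  70W rig: {share(7, 70):.1f}% of idle draw")   # 10.0%
```

So the absolute cost of a component is fixed, but its relative impact roughly triples as the system's baseline drops from 200W to 70W.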

Also, spinning drives up and down a lot on a regular basis actually puts more stress on the drive's motor. I've had the best luck leaving drives spun up, because I'm perfectly willing to use the extra 25 watts to do it. (I'm rounding; I suspect 7200 RPM drives use more than the 5400 RPM ones I got the numbers from.)

The only real point I'm trying to make is that the lower the draw, the more other components can impact that usage, such as adding some hard drives or a PCI-E expansion card.

We already know how low it idles and that isn't in dispute, it's just the method.


----------



## NeoXF (Apr 28, 2013)

james888 said:


> Change for change's sake is pointless. What do you think needs changing?



What sake of change? We've been pushing quad-cores since 2006, within the same or a slightly higher GHz range, with almost the same tired instruction sets and minimal IPC increase (if any) from generation to generation.

Meanwhile, things like ARM are catching up to x86 like fungus on a 3rd-world gym ceiling.

Look at AMD's APU/heterogeneous initiative; at least on paper it sounds like a huge f-ing breakthrough, way bigger than the gigahertz wars and their diminishing returns, or the many-core push from a while ago.


----------



## Octavean (Apr 30, 2013)

NeoXF said:


> What sake of change? We've been pushing quad-cores since 2006, within the same or a slightly higher GHz range, with almost the same tired instruction sets and minimal IPC increase (if any) from generation to generation.
> 
> Meanwhile, things like ARM are catching up to x86 like fungus on a 3rd-world gym ceiling.
> 
> Look at AMD's APU/heterogeneous initiative; at least on paper it sounds like a huge f-ing breakthrough, way bigger than the gigahertz wars and their diminishing returns, or the many-core push from a while ago.



It's a phenomenon of convergence: Intel is attempting to increase power efficiency in order to move into the same segment of the market as ARM, while ARM is attempting to improve its performance in order to move into the segment traditionally dominated by more powerful x86/x64 Intel and AMD solutions.


----------



## TheoneandonlyMrK (Apr 30, 2013)

Once you're into this chip and decent motherboard territory ($$££€€), it's more about how high it will OC with crazy cooling, IMHO. That, and the max-is-min crowd.


----------



## Hilux SSRG (May 7, 2013)

Intel is not catering to people like me who don't give a fig about power efficiency. As if I and other gaming/tech users really care about spending $20 more on electricity costs per year. 

I really want an eight [8] core, sixteen [16] thread hyper-threaded beast of an x86 CPU running at 8-10GHz stock speed.  

Forget the 1-10% power efficiency gains per "upgrade" cycle. These are not mobile chips, Intel!!

I've been waiting forever to upgrade from my i7-920, and Intel has not moved the bar much in 5+ years.


----------



## EarthDog (May 7, 2013)

Your bar... is high. LOL!


----------



## Fourstaff (May 7, 2013)

Hilux SSRG said:


> I've been waiting forever to upgrade from my i7-920, and Intel has not moved the bar much in 5+ years.



3930K is quite a lot more powerful than 920


----------



## Aquinus (May 7, 2013)

Fourstaff said:


> 3930K is quite a lot more powerful than 920



+1: Even the 3820 isn't a bad step up from the 920 in terms of performance. It's not mind-boggling, but there is an IPC improvement, and it also runs at higher clocks and has a more powerful IMC. In general I would say SB-E was a decent step up from skt1366, because whichever CPU you get is faster across the board compared to the 920, but it's not necessarily a reason to upgrade. It's a good platform if you're upgrading anyway, IMHO, but that depends on what you're using it for.


----------



## Tatty_One (May 7, 2013)

Inceptor said:


> Sandy Bridge, in 2010, was a huge jump in performance, and I think it has skewed everyone's expectations.
> Intel has been squeezing optimizations and modifications of the same basic architecture for quite a few years now.  It should not be surprising that the performance bump is now in the 5-10% range for each new iteration.  Especially when taking into consideration the problems encountered when shrinking the design to smaller and smaller process nodes.



No it wasn't... look at clock-for-clock performance, as opposed to stock clocks, which will give you an accurate picture of the architectural advancement; I think you will find across the board we would be talking 10-12%.


----------



## Hilux SSRG (May 7, 2013)

Fourstaff said:


> 3930K is quite a lot more powerful than 920



Now I know I should "try" to find a comparison between the 920 and the 3930K [if one exists!], but let's say it's 20-30% faster overall. I'm wondering why anyone should shell out $500+ for a new mobo and chip for just a 20-30% improvement in 5+ YEARS.  

I'm not looking to start an argument, but I am willing to say Intel has coasted for a few years now. I'd rather spend the money on a better GPU from AMD or NVIDIA.


----------



## HumanSmoke (May 7, 2013)

Hilux SSRG said:


> Now I know I should "try" to find a comparison between the 920 and the 3930K [if one exists!], but let's say it's 20-30% faster overall.


Then again, you could fire up the calculator...
45% better in Mental Ray, 44% better in V-Ray, 39% better in Visual Studio......

Depending on the users intended workload, it may (or may not) look advantageous - and that's the whole platform I mean (X79 vs X58)


----------



## Tatty_One (May 8, 2013)

HumanSmoke said:


> Then again, you could fire up the calculator...
> 45% better in Mental Ray, 44% better in V-Ray, 39% better in Visual Studio......
> 
> Depending on the users intended workload, it may (or may not) look advantageous - and that's the whole platform I mean (X79 vs X58)



But as I said earlier, those differences are not a reflection of huge architectural improvements. When you actually look at the two offerings and see that the 920 stocked at 2.66GHz with 4 cores/8 threads while the 3930K stocks at 3.2GHz with 6 cores/12 threads, it leaves me with the feeling that there is a scary amount of hype and very little real substance to the architecture. But then again, I am a 930 owner, so I perhaps would say that!


----------



## Ikaruga (May 12, 2013)

There is a test on a Chinese page which claims almost no performance gain compared to Ivy Bridge. I hope it's wrong.


----------



## radrok (May 12, 2013)

Tatty_One said:


> But as I said earlier, those differences are not a reflection of huge architectural improvements. When you actually look at the two offerings and see that the 920 stocked at 2.66GHz with 4 cores/8 threads while the 3930K stocks at 3.2GHz with 6 cores/12 threads, it leaves me with the feeling that there is a scary amount of hype and very little real substance to the architecture. But then again, I am a 930 owner, so I perhaps would say that!



You've got to factor in that the 3930K overclocks way more than a Nehalem chip.

My old i7-920 couldn't do more than 4.2GHz.

My 3930K does 5.1-5.2GHz.


----------



## Aquinus (May 12, 2013)

radrok said:


> My 3930K does 5.1-5.2GHz.



That's all your motherboard. My P9X79 Deluxe doesn't like 4.5GHz or higher without some serious bumps in voltage (>1.425V); 4.92GHz @ 1.51V was the highest I was ever able to achieve with this board, and I've seen people use the RIVE to hit >5GHz with the same CPU and similar voltages.

Not to say that I'm not happy with my 4.37GHz under 1.4V, but it really depends on the motherboard. Unless I'm missing some important settings on my motherboard, but I don't think that is the case.

You forget that SB-E has better IPC than its skt1366 predecessors. Not by a lot, but it counts more and more the higher the CPU clock goes, because the IPC advantage scales linearly with clock speed.



Tatty_One said:


> But as I said earlier, those differences are not a reflection of huge architectural improvements. When you actually look at the two offerings and see that the 920 stocked at 2.66GHz with 4 cores/8 threads while the 3930K stocks at 3.2GHz with 6 cores/12 threads, it leaves me with the feeling that there is a scary amount of hype and very little real substance to the architecture. But then again, I am a 930 owner, so I perhaps would say that!



It was my impression that even the 3820 is a sizable improvement over a 920, let alone a 3930K.


----------



## HumanSmoke (May 12, 2013)

Ikaruga said:


> There is a test on a Chinese page which claims almost no performance gain compared to Ivy Bridge. I hope it's wrong.


They appear to be graphics (and GPU-dependent) benchmarks, so it's hardly surprising.
Of more interest would be 5GHz at 0.9V.


----------



## radrok (May 12, 2013)

Aquinus said:


> That's all your motherboard. My P9X79 Deluxe doesn't like 4.5GHz or higher without some serious bumps in voltage (>1.425V); 4.92GHz @ 1.51V was the highest I was ever able to achieve with this board, and I've seen people use the RIVE to hit >5GHz with the same CPU and similar voltages.
> 
> Not to say that I'm not happy with my 4.37GHz under 1.4V, but it really depends on the motherboard. Unless I'm missing some important settings on my motherboard, but I don't think that is the case.
> 
> ...



I had to tweak quite a lot to reach those clocks stable.

I think your motherboard has the same settings hidden somewhere, but I won't deny it's easier to overclock on a RIVE than on another kind of motherboard.

Off the top of my head, I had to set LLC for both VCCSA and CPU to Ultra, and had to fiddle with CPU current capability, CPU switching frequency and some strange settings in the BIOS that I'd never heard of... 

The voltage slope from 4.7/4.8GHz to 5.1/5.2GHz is insane though; we are talking from a comfy 1.35V to 1.52-1.55V.

I wouldn't be surprised to see my motherboard's VRMs toast someday. Even though they are watercooled, the backside is not, and you can't touch that backplate without almost getting burned.


Speaking of the HEDT platform, I wish Intel would BCLK-unlock the upcoming Haswell-E Xeons to bring back some of the glory the X58 Westmeres had.


----------



## TheoneandonlyMrK (May 12, 2013)

HumanSmoke said:


> They appear to be graphics (and GPU dependant) benchmarks, so hardly surprising.
> Of more interest would be 5GHz at 0.9v



Er, didn't Intel focus mostly on GPU improvements this time round, claiming a 50% improvement in that area? Given that Intel's CPU IPC is bordering on optimal already, I think they are doing well; clearly they are focused, as they should be, on power efficiency and GPU grunt.


----------



## D007 (May 12, 2013)

I'm glad I got my 960 when it came out.
It has been a little monster for me and a hell of a boost from a dual-core.
This, however, doesn't make me want to upgrade at all... lol


----------



## Tatty_One (May 12, 2013)

radrok said:


> You've got to factor in that the 3930K overclocks way more than a Nehalem chip.
> 
> My old i7-920 couldn't do more than 4.2GHz.
> 
> My 3930K does 5.1-5.2GHz.



Good point, but people are talking about advancement in architecture. Higher overclocks could be argued to be an advancement, but to be honest, seeing as 90+% of CPU owners don't overclock, it's a moot point.

@Aquinus..... a decent improvement, yes, when the higher stock clocks are factored in, of course. But if we talk about the architecture, run them at the same speed and that improvement shrinks vastly, which is all my point is. If everyone is happy with each generation just stocking at higher clocks, that's one thing, but don't we want REAL architectural improvements that give us what our hard-earned $$$ is really paying for (or not, as the case may be)? I mean, with the advances in silicon, die size etc., CPUs "should" cost less, especially if all manufacturers are doing is applying a few "tweaks" and raising stock clocks by 200MHz..... but funnily enough they are not really any cheaper.


----------



## Aquinus (May 12, 2013)

radrok said:


> The voltage slope from 4.7/4.8GHz to 5.1/5.2GHz is insane though; we are talking from a comfy 1.35V to 1.52-1.55V.
> 
> I wouldn't be surprised to see my motherboard's VRMs toast someday. Even though they are watercooled, the backside is not, and you can't touch that backplate without almost getting burned.



Yeah, you see, my board gives me about ~300MHz less at the same voltages, but even under heavy load my VRMs stay relatively cool: warm to the touch, never scalding hot. That might have something to do with the 16+2+2 phase power. I also haven't gone too gung-ho with the DIGI+ settings as far as VRM frequency and such; I think CPU and VCCSA LLC are set to High for me. I've been a bit lazier than normal about pushing it higher, so I haven't, and there hasn't been a need to either. I would rather not fry my CPU unless I can get something better without paying too much. An IVB-E could be fun; I would love to see some numbers though. I personally would like to see 166MHz-strap-capable CPUs and motherboards become more common for skt2011, or maybe even some cherry-picked CPUs that can do the 250MHz strap. That could be pretty cool, but I'm just dreaming at this point. 



Tatty_One said:


> Good point, but people are talking about advancement in architecture. Higher overclocks could be argued to be an advancement, but to be honest, seeing as 90+% of CPU owners don't overclock, it's a moot point.
> 
> @Aquinus..... a decent improvement, yes, when the higher stock clocks are factored in, of course. But if we talk about the architecture, run them at the same speed and that improvement shrinks vastly, which is all my point is. If everyone is happy with each generation just stocking at higher clocks, that's one thing, but don't we want REAL architectural improvements that give us what our hard-earned $$$ is really paying for (or not, as the case may be)? I mean, with the advances in silicon, die size etc., CPUs "should" cost less, especially if all manufacturers are doing is applying a few "tweaks" and raising stock clocks by 200MHz..... but funnily enough they are not really any cheaper.



I guess it could be argued that as technology progresses, it gets increasingly difficult to design faster CPUs without dedicating more money and time to developing new techniques. I would imagine that Intel is milking the current architecture as much as it can until there is really a reason to push forward. So they're in the power position and they're going to take advantage of it. I seriously doubt that they don't have a backup plan in case AMD or another company pulls a rabbit out of a hat, which, I hate to say, doesn't look likely.


----------



## HumanSmoke (May 12, 2013)

theoneandonlymrk said:


> HumanSmoke said:
> 
> 
> > They appear to be graphics (and GPU dependant) benchmarks, so hardly surprising.
> ...


The results posted aren't using integrated graphics; they were obviously run with the graphics card listed in the screenshot, an MSI GTX 660 Hawk. A quick look at the HD 4000's actual 3DMark Fire Strike capabilities should be a pretty big red flag.

So, as far as I'm concerned, I still don't see a selection of GPU based benchmarks being overly relevant when a third-party GPU is being used to render the result.

EDIT: Digging around on the site where the graphics comparison cropped up (note to Ikaruga: a link to the graph you posted would've been good) also turns up some CPU benchmarks which are probably more relevant, although I'm wondering how mature the board BIOS is, given the memory bandwidth numbers:


----------



## Fourstaff (May 12, 2013)

Aquinus said:


> So they're in the power position and they're going to take advantage of it.



No, they are more worried about the midgets chewing at them from below (read: ARM). Our need for CPU advances has stalled a bit of late (for the past 3 years or more, really), and the real advances are made in efficiency (which makes sense, since the bulk of the cost of a datacenter is the power bill). The current climate dictates the direction Intel takes, not the other way round. Remember, they have to keep running forward and release incrementally better products even when there is no competition: they have to compete against their past.


----------



## ShockG (May 13, 2013)

For so called tech enthusiasts most of us have unreasonable expectations on INTEL, AMD or any other company in this business. More so than Joe average. 
If there was a way for INTEL to magically deliver a 30% improvement over Sandy-Bridge-E, that's what we would have today. Sadly this is not possible to achieve for any outfit; ARM, AMD, NVIDIA, TI, SAMSUNG etc. Nobody could deliver these kinds of gains within the constraints that INTEL has. The advancements are incremental and at no point did INTEL ever promise anyone massive gains when moving from one generation to another. That assertion came from us and it has always been incorrect. When investing billions of dollars into R&D and thousands of man hours, lazy isn't the word I'd use to describe the efforts of INTEL, AMD or any outfit for that matter. That word would be better reserved for my own profound and limited understanding of the technology and it's evolution. I'm an enthusiast because of my appreciation for the technology not my expectations of it. 

We also tend to forget that, as far as expertise at processors and process manufacturing is concerned, no other company on this planet has dedicated as many resources to this as INTEL. I doubt if any one of us can find a single company that has commercial products on a fin-fet process at this node let alone with such massive cores containing so many logic gates. Whatever system you may own that you find to be "good enough"  for all your needs, it's precursors had similar miniscule advances between each generation, which culminated in your "good enough" system. 

As I am no engineer, I find it hard to be disappointed by Ivy Bridge-E; whatever disappointment there may be stems from my own ignorant expectations rather than INTEL's inability to produce a compelling CPU.
I will be buying the 4960X, along with a suitable motherboard, when it's released.


----------



## NeoXF (May 13, 2013)

ShockG said:


> For so-called tech enthusiasts, most of us have unreasonable expectations of INTEL, AMD, or any other company in this business. More so than Joe Average.
> If there were a way for INTEL to magically deliver a 30% improvement over Sandy Bridge-E, that's what we would have today. Sadly, gains like that are not possible for any outfit; ARM, AMD, NVIDIA, TI, Samsung, etc.: nobody could deliver them within the constraints INTEL operates under. The advancements are incremental, and at no point did INTEL ever promise anyone massive gains when moving from one generation to another. That assertion came from us, and it has always been incorrect. When investing billions of dollars into R&D and thousands of man-hours, "lazy" isn't the word I'd use to describe the efforts of INTEL, AMD, or any outfit for that matter. That word would be better reserved for my own profoundly limited understanding of the technology and its evolution. I'm an enthusiast because of my appreciation for the technology, not my expectations of it.
> 
> We also tend to forget that, as far as expertise in processor design and process manufacturing is concerned, no other company on this planet has dedicated as many resources to this as INTEL. I doubt any of us can find a single company that has commercial products on a FinFET process at this node, let alone with such massive cores containing so many logic gates. Whatever system you may own that you find "good enough" for all your needs, its precursors had similarly minuscule advances between each generation, which culminated in your "good enough" system.
> ...



Spoken like a true mindless consumer of this day and age... no offence.
Sorry, but that was an utter load of rubbish to me and I can't find any other way to say it (other than not saying it at all, but since everyone around me keeps saying I should speak out more, I'm starting to not hold back any more).

This kind of thinking borders on Stockholm syndrome... but with regard to consumers as opposed to a full-on assailant. And it's kind of sad if the trend is shifting that way... As long as it's consumerism we're talking about, we, the customers, the paying customers, should have the first say in anything, not the other way around. I'm not buying into Intel's shitty performance increments, and I'm certainly not defending their name.


----------



## HumanSmoke (May 13, 2013)

NeoXF said:


> but since everyone around me keeps saying I should speak out more


As an appreciator of unintentional comedy, I fully support everyone around you.


----------



## ShockG (May 14, 2013)

NeoXF said:


> Spoken like a true mindless consumer of this day and age... no offence.
> Sorry, but that was an utter load of rubbish to me and I can't find any other way to say it (other than not saying it at all, but since everyone around me keeps saying I should speak out more, I'm starting to not hold back any more).
> 
> This kind of thinking borders on Stockholm syndrome... but with regard to consumers as opposed to a full-on assailant. And it's kind of sad if the trend is shifting that way... As long as it's consumerism we're talking about, we, the customers, the paying customers, should have the first say in anything, not the other way around. I'm not buying into Intel's shitty performance increments, and I'm certainly not defending their name.


You're entitled to your opinion, for sure; that doesn't mean it's a valid one, though. 
Intel's ***** performance increments? What has led you to believe they could or were supposed to be better? What information do you have, that nobody else apparently has, that leads you to make this statement with such zeal? In your expertise, what could INTEL have done from a technical POV to ensure the double-digit percentage gains you so desire? 
"You, the paying customers" aren't entitled to anything. INTEL above all else is a self-serving business; they owe you nothing other than what they provided when you purchased your most recent CPU from them. You are not entitled to anything past that, hence there's no discourse needed between INTEL and you on what they release and how they go about it. 
I'm not an engineer or anything of the kind (and I'll certainly not attempt to come off as one from behind my keyboard), but from what I gather it is truly difficult to design a CPU given the many limitations in TDP and such. Your annoyance stems from your ignorance, not from an understanding of what it is INTEL was trying to achieve here. 
The question is, where have you been done wrong by Ivy Bridge-E for the CPU to warrant such disdain from you? I reiterate: you're disappointed entirely because of your expectations. When did INTEL promise you or anyone else massive performance gains with Ivy Bridge-E?


----------



## NeoXF (May 14, 2013)

ShockG said:


> [snip]



I'm entitled to not paying a dime for anything they make, for starters, unless this so-called "business" moves in a direction I'm more inclined to pay for... And I don't know how you see businesses, but the way I see it, if the customers aren't impressed, and therefore willing to spend on your products, your so-called business won't stand for much longer.

But in a sense, I guess what you said is right: it's not Intel's fault, it's the consumers' fault for buying into nonsense every damn tick and tock they put out, and to an extent, their competition's. But I won't go into how wrong it is to think that (external) competition is the only thing pushing anything forward in this world. Very few people or organizations strive to better themselves simply because they can these days; for the most part, they just stagnate slightly above the lowest common denominator.




HumanSmoke said:


> As an appreciator of unintentional comedy, I fully support everyone around you.



I challenge you to find me an NVIDIA GPU-based laptop with an AMD processor. Even more so, specifically with the setup I listed.

There might be a batch of laptops with that GPU (or GPUs) and something like an i7-3630QM, but seeing as ix-4xxx parts will be priced pretty much the same, I see no point in paying for older hardware, whether or not there are any real improvements.


----------



## Hilux SSRG (May 14, 2013)

ShockG said:


> We also tend to forget that, as far as expertise at processors and process manufacturing is concerned, no other company on this planet has dedicated as many resources to this as INTEL.



Intel is not pushing the boundary enough, and it hasn't for years now. As an enthusiast I don't care about manufacturing concerns; I just want to give money to a company [AMD, ARM, etc.] for a quality x86 chip that has improved *significantly*.

As it stands, Intel is subpar, and that is why I and others haven't upgraded as often in the last few years.


----------



## Aquinus (May 14, 2013)

Hilux SSRG said:


> Intel is not pushing the boundary enough, and it hasn't for years now. As an enthusiast I don't care about manufacturing concerns; I just want to give money to a company [AMD, ARM, etc.] for a quality x86 chip that has improved *significantly*.
> 
> As it stands, Intel is subpar, and that is why I and others haven't upgraded as often in the last few years.



I don't know what planet you live on, but the 3820 is a pretty significant upgrade from a 920 or a 930. It clocks higher, supports more memory, runs faster memory, and not relying on QPI for PCI-E through the IOH is a nice perk. In general the platform has gotten better, quite apart from raw performance, which is also better. So all in all, I think Intel achieved what it wanted with skt2011, because it's just skt1155 on steroids.

More often than not, if you're getting skt2011 you want multi-threaded performance, support for more memory, the ability to have VT-d and overclock at the same time, or a lot of expansion opportunities with the 40 PCI-E lanes on the CPU, which is certainly no slouch. Clocks will only go so high, but clocks alone are not a reason to go skt2011, and anyone who thinks they are has no idea what they're doing.


----------



## Tatty_One (May 14, 2013)

Aquinus said:


> I don't know what planet you live on, but the 3820 is a pretty significant upgrade from a 920 or a 930.



Your argument is flawed, only because the 3820 is effectively two generations ahead of Bloomfield. Even then, once you do the research, you find clock-for-clock around an 18-20% increase in performance across the board; in some things more, yes, and in a few things even less. Spanning a further two generations, that is not too great personally, which is why I still have a 930. See, the thing is, and this is exactly my earlier point: because 95% of CPU owners don't overclock, my CPU is faster than 95% of all later-generation Intel CPUs, certainly the 4-core or 4-core/8-thread ones anyway.


----------



## radrok (May 14, 2013)

To be honest, the biggest upgrade Intel could gift to enthusiasts would be QPI-enabled i7 X editions...

Dunno about you guys, but I'd drool over a dual 3960X/3930K setup.


----------



## Aquinus (May 15, 2013)

Tatty_One said:


> Your argument is flawed, only because the 3820 is effectively 2 generations ahead of Bloomfield



Two? Okay: one for the initial skt1366 release and one for a die shrink. A die shrink isn't going to yield amazing results compared to architecture changes.



Tatty_One said:


> See, the thing is, and this is exactly my earlier point: because 95% of CPU owners don't overclock, my CPU is faster than 95% of all later-generation Intel CPUs, certainly the 4-core or 4-core/8-thread ones anyway



...and my belief that this point is flawed comes from the fact that most users investing in a skt1366 or skt2011 machine are highly unlikely to be running it at stock clocks, unless they're using a Xeon on skt2011. So you and I both like overclocking, which still leaves my 3820 faster than your 930. Not to say that your 930 isn't capable of doing everything my 3820 can do, but between the IPC gains and the higher overclocks, the performance improvement is significant for some applications.
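As a back-of-the-envelope sketch of that point: clock-for-clock (IPC) gains and overclocking headroom multiply rather than add. The numbers below are purely illustrative assumptions, not measured figures from either chip:

```python
# Combined speedup = IPC ratio x clock ratio (illustrative numbers only).
ipc_gain = 1.15        # assume ~15% better IPC for the newer architecture
old_clock_ghz = 4.0    # hypothetical 930 overclock
new_clock_ghz = 4.8    # hypothetical 3820 overclock

speedup = ipc_gain * (new_clock_ghz / old_clock_ghz)
print(f"{speedup:.2f}x")  # 1.38x: two modest gains multiply into a larger one
```

Swap in your own clocks and an IPC estimate from clock-for-clock benchmarks to see how the two factors combine.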

All in all, most processors do everything they need to nowadays, and more often than not overclocking is merely for fun, but the fact still stands that performance alone is not a reason to invest in skt2011. Also, the changes from skt1366 to skt2011 weren't primarily about performance. The big change was eliminating the IOH and putting PCI-E on the CPU (40 whole lanes of it), practically making QPI useless on most X79 boards.

Simplifying the motherboard design and adding more functionality to the CPU would be considered an improvement, despite not being directly performance-related, wouldn't you say?



Tatty_One said:


> 95% of CPU owners don't overclock



95% of CPU owners don't own a skt2011 or skt1366 rig either.


----------



## Tatty_One (May 15, 2013)

Sorry? You are talking platforms there, but if you want to... what about S1155 and S1156 prior to S2011? But let's look at CPUs: 920/930/940/950/960 Bloomfield... i7 860/870... 2500/2600 Sandy Bridge... so two REAL generations, I think, or three if you wanna include the 860/870 on S1156.

Bloomfield was launched in Q4 2008; the 3820 was launched in Q1 2012, over three years later.


----------



## Aquinus (May 15, 2013)

Tatty_One said:


> Sorry? You are talking platforms there, but if you want to... what about S1155 and S1156 prior to S2011? But let's look at CPUs: 920/930/940/950/960 Bloomfield... i7 860/870... 2500/2600 Sandy Bridge... so two REAL generations, I think, or three if you wanna include the 860/870 on S1156.
> 
> Bloomfield was launched in Q4 2008; the 3820 was launched in Q1 2012, over three years later.



Tatty, what's your point? Any hardware that is released later and has had more time for development is going to be better. The point is that you said Intel has made very few changes, which is bullshit unless you're talking strictly about the compute cores.

Okay Tatty, let's forget performance and age of the CPU for a moment.

- skt1366 has an IOH using QPI for PCI-E and DMI; skt2011 does not, it's all built into the CPU.
- skt1366 has a triple-channel memory controller; skt2011 has quad-channel.
- skt1366 kept the uncore and core frequencies segregated so the uncore could run slower than the core; skt2011 does not do this, the uncore is directly tied to the core frequency, IIRC.
- skt1366 quad cores had 8 MB of L3; skt2011 so far has had at least 10 MB for a quad.

This all becomes moot if you have something like a 970 or a 980, but as far as quad-cores are concerned, the 3820 and skt2011 are contenders, and I'm not sure how you can say they aren't. I'm not saying every 930 and 920 owner should go out and invest in a 3820 or better, though. I'm just saying that significant changes have been made to the platform since skt1366, regardless of whether they addressed performance or something else.


----------



## Tatty_One (May 15, 2013)

Aquinus said:


> Tatty, what's your point? Any hardware that is released later and has had more time for development is going to be better. The point is that you said Intel has made very few changes, which is bullshit unless you're talking strictly about the compute cores.



I have never said Intel has made very few changes. I did say there has been little performance improvement over almost 3.5 years, spanning at least three generations of architecture. I said this in response to some people (not exclusively you, but you included) who used terminology such as BIG or, in some other instances, HUGE (not you) improvement gains, often relating to just one generation. This simply is NOT the case once you look at like-for-like, clock-for-clock performance, which is specifically why I referred to "architectural improvements".

I am not suggesting anyone is getting ripped off or that Intel are very naughty boys, simply that some statements can be misleading; unless, of course, 8% across one generation, or 22% across three generations, really is HUGE. Big and huge are somewhat subjective, and my interpretation is perhaps skewed, so I suppose my point is that the additional time you refer to is not providing enough improvements, IMO.
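To put those percentages side by side: per-generation gains compound multiplicatively, so modest numbers stack somewhat faster than they look. The 8% and 22% figures are the ones quoted above; the rest is just arithmetic:

```python
# Per-generation clock-for-clock gains compound multiplicatively.
per_gen = 0.08            # ~8% gain per generation (figure quoted above)
generations = 3

# If every generation really delivered 8%, three of them would stack to:
stacked = (1 + per_gen) ** generations - 1
print(f"{stacked:.1%}")   # ~26% cumulative

# Conversely, 22% over three generations implies a smaller per-gen average:
total = 0.22
implied_per_gen = (1 + total) ** (1 / generations) - 1
print(f"{implied_per_gen:.1%}")  # ~6.9% per generation
```

So "22% across three generations" works out to under 7% per generation; whether that counts as big is, as the post says, subjective.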


----------



## Aquinus (May 15, 2013)

Tatty_One said:


> so I suppose my point is....... that additional time you refer to is not providing enough improvements IMO.



When you have a CPU with over a billion transistors crammed into such a small space, it's not exactly easy to make huge changes without investing a lot of time in testing and development. I think you're overstating Intel's ability, despite the resources they have available. You make it sound like they could completely redesign a CPU and get it right in three years. It's taken Intel years of revisions, changes, and optimizations to get the Core architecture where it is now. Right now everyone sees what AMD is going through with BD and PD, but as time goes on AMD will optimize them and get it right. I don't think it's reasonable to expect Intel to break a proven platform without any reason to. There isn't the drive in the market, and developers don't tend to need it unless it's in a server, as cloud computing is really becoming the thing to do now.

So I think Intel has done a good amount in three years. Could they have done more? Absolutely, I won't deny that, but would it be the same quality as past revisions if done in the same amount of time? I doubt it. Less care would be put into each change, so I have a strange feeling it would negatively impact QA.


----------



## radrok (May 15, 2013)

Aquinus said:


> I think you're overstating the ability of Intel, despite the resources that they have available.



While I tend to agree with this idea, I have to add a bit.

While raw IPC is hard to improve, I think they are just sandbagging on core count; I mean, bloody Westmere-EX (LGA 1567) had 10 cores.

What on earth is blocking them from making a bloody 12-core CPU available for socket 2011, other than themselves?


----------



## Aquinus (May 15, 2013)

radrok said:


> What on earth is blocking them from making a bloody 12-core CPU available for socket 2011, other than themselves?



They would have to put a $4,000 price tag on it, or some data centers or fledgling businesses would just buy consumer products because they're cheaper, instead of investing in a Xeon server. Also, in most cases a 10-core CPU at 2.4 GHz isn't going to do most people any good. The market just isn't there; it's so tiny that it isn't worth making yet another platform to accommodate a 10-core design.

Just because you can cram 10 or 12 cores onto a CPU doesn't mean you can get them to run fast, and even if you can, power consumption and heat dissipation are huge issues.
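A rough sketch of why core count trades against clocks under a fixed TDP: dynamic power scales roughly with cores × frequency × voltage², and voltage itself has to rise with frequency. Assuming (very crudely) that voltage scales linearly with frequency gives power ∝ cores × f³. This is an illustrative model, not real silicon behavior:

```python
# Crude dynamic-power model: P ~ cores * f * V^2, with V assumed ~ f,
# so P ~ cores * f**3.  Purely illustrative, not a real silicon model.
def relative_power(cores, freq):
    return cores * freq ** 3

budget = relative_power(cores=6, freq=1.0)   # a 6-core at baseline clocks

# Doubling to 12 cores within the same power budget forces clocks down:
f = (budget / 12) ** (1 / 3)
print(f"{f:.0%} of the original clock")  # ~79%
```

Even under this crude model, a same-TDP 12-core would give up roughly a fifth of its clock speed, which is exactly the trade-off that makes such a part unattractive for desktop workloads.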


----------



## Tatty_One (May 15, 2013)

Aquinus said:


> When you have a CPU with over a billion transistors crammed into such a small place, it's not exactly easy to make huge changes to it without investing a lot of time into testing and development. I think you're overstating the ability of Intel, despite the resources that they have available. You make it sound like they can completely redesign a CPU and get it right in 3 years. It's taken Intel years to get the Core architecture where it is now after years of revisions, changes, and optimizations. Right now everyone sees what AMD is going through with BD and PD, but as time goes on AMD will optimize it and get it right. I don't think it's reasonable to assume Intel is going to break a proven platform without any reason for doing it. There isn't the drive in the market and developers don't tend to need it unless it's in a server as cloud computing is really becoming the thing to do now.
> 
> So I think Intel has done a good amount in 3 years. Could they have done more? Absolutely, I won't deny that they could have done more, but would it be the same quality as revisions in the past if they used the same amount of time? I doubt it. Less care would be put into each change so I have a strange feeling that it will negatively impact QA.



You have hit the nail on the head and summed up the last three or so years of architectural progression very accurately, and that's what disappoints me just a little. Why? Because the move from Yorkfield to Bloomfield realised around a 20% performance improvement across the board and, to be honest, was a revelation at the time; we have had few revelations since! Again, personally I would have preferred Intel to skip one of the last three changes/generations/platforms and spend the additional development time introducing something with greater gains, but as I said, that's just my opinion and may not have been cost-effective or viable in the current climate.


----------



## radrok (May 15, 2013)

Tatty_One said:


> that's just my opinion and may not have been cost-effective or viable in the current climate.



I just think it wouldn't have been profitable for them.


----------



## Aquinus (May 15, 2013)

radrok said:


> I just think it wouldn't have been profitable for them.



In the end that's really what drives business, isn't it?


----------

