# Intel Sandy Bridge to Introduce New Sockets, Chipsets, Reorganize Platform Further



## btarunr (Apr 21, 2010)

Intel plans a pair of new sockets to launch with the new processor architecture that succeeds Westmere, codenamed "Sandy Bridge", due in 2011. As part of its "tick-tock" product strategy, the company is currently transitioning from the 45 nm "tock" (Nehalem architecture) to the 32 nm "tick" (Westmere architecture). In 2011, it will transition from the 32 nm "tick" (Westmere) to the 32 nm "tock" (Sandy Bridge). Under the "tick-tock" model of process development, each processor architecture is built on two successive manufacturing processes, while each process builds two successive architectures. It now seems clear that with Sandy Bridge, Intel will also switch to new socket designs, making existing motherboards obsolete. Architecturally, Sandy Bridge will introduce new feature-sets that make the CPU more powerful clock-for-clock, such as AVX (Advanced Vector Extensions), an evolution of the SSE instruction set, and the native AES engine introduced with Westmere.
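The cadence described above can be summed up in a small sketch; the generation names come from the article, and the code itself is purely illustrative:

```python
# A toy sketch of Intel's "tick-tock" cadence. The (phase, process, arch)
# entries are the generations named in the article.
cadence = [
    {"phase": "tock", "process": "45 nm", "arch": "Nehalem"},       # new architecture
    {"phase": "tick", "process": "32 nm", "arch": "Westmere"},      # process shrink
    {"phase": "tock", "process": "32 nm", "arch": "Sandy Bridge"},  # new architecture
]

# A "tick" changes the manufacturing process; a "tock" keeps the process
# and changes the architecture - so each process builds two architectures.
for prev, cur in zip(cadence, cadence[1:]):
    if cur["phase"] == "tick":
        assert cur["process"] != prev["process"]  # shrink: new process
    else:
        assert cur["process"] == prev["process"]  # new arch: same process

print([f'{c["arch"]} ({c["process"]} {c["phase"]})' for c in cadence])
```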

The present LGA-1156 package, on which Intel builds its value-through-performance processors, including the bulk of its mainstream processors, will be succeeded by the LGA-1155 package. Though similar, LGA-1155 and LGA-1156 are not cross-compatible: LGA-1155 processors will not work on existing LGA-1156 motherboards, and LGA-1156 processors will not work on LGA-1155 motherboards, either. The arrangement of vital components on these processors is similar to that of the LGA-1156 package, except that every LGA-1155 processor - dual-core or quad-core - will feature an on-die display controller.






The die itself will be monolithic, with the northbridge component completely integrated into the processor, leaving only the southbridge outside the package, on the motherboard. Currently, the "Clarkdale" and "Arrandale" dual-core processors have the processor and northbridge components on separate dies, but on the same package. LGA-1155 is also designated socket H2 (LGA-1156 is H1); the package is identical in size to LGA-1156, but has a different pin layout and orientation notch. 





Chipsets that drive the LGA-1155 platform include the P67, H67, H61, and Q67. These will support features that were conceived prior to the Ibex Peak platform's launch but were shelved, such as ONFI NAND flash "Braidwood". USB 3.0 still isn't part of the feature-set, though native SATA 6 Gb/s support is on the cards. 

The next big platform, succeeding LGA-1366, which caters to processors in the performance-through-enthusiast segments, is "Patsburg", which replaces the existing "Tylersburg"-based Intel X58 and 5000 series chipsets. Here, Intel will introduce a massive new socket, LGA-2011. The pin count increases drastically for two reasons: the processor will have a 256-bit wide memory interface (quad-channel DDR3), and the northbridge component (currently the X58 IOH) will be integrated completely into the processor package, adding PCI-Express and DMI pins. The on-die PCI-Express 2.0 root complex will provide 32 lanes for graphics (unlike 16 lanes on LGA-1155), and a DMI link to the so-called "Intel X68" chipset, which is relegated to being a Platform Controller Hub, just like the P55 or P67. The X68 could have a feature-set similar to the P67's.
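The arithmetic behind those interface widths can be sketched as follows; each DDR3 channel is 64 bits wide, so four channels give the 256-bit interface described above. The DDR3-1333 transfer rate used below is an illustrative assumption, not a figure from this article:

```python
# Back-of-envelope arithmetic for the memory interfaces mentioned above.
CHANNEL_WIDTH_BITS = 64  # width of one DDR3 channel

def bus_width(channels: int) -> int:
    """Total memory interface width in bits."""
    return channels * CHANNEL_WIDTH_BITS

def peak_bandwidth_gbs(channels: int, megatransfers: float) -> float:
    """Theoretical peak bandwidth in GB/s: (bits / 8) bytes per transfer."""
    return bus_width(channels) / 8 * megatransfers * 1e6 / 1e9

print(bus_width(4))   # quad-channel: 256-bit (LGA-2011)
print(bus_width(2))   # dual-channel: 128-bit (LGA-1155)
print(peak_bandwidth_gbs(4, 1333))  # theoretical peak at an assumed DDR3-1333 rate
```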

*View at TechPowerUp Main Site*


----------



## H82LUZ73 (Apr 21, 2010)

New sockets, chipsets... which is why I stuck with AMD..... Intel = performance, sure, but also every 2 years = $$$$$$$$


----------



## Imsochobo (Apr 21, 2010)

Way to ruin value for those who have upgraded. I'll still put a 6-core in a 2005 motherboard just to piss off a friend.


----------



## gumpty (Apr 21, 2010)

Do you reckon they threw a few extra pins into that 'LGA-2011' just so that it could have the same name as the year it comes out?


----------



## LAN_deRf_HA (Apr 21, 2010)

Guessing the original discussion will die off... so I'll re-post myself where it's relevant.



LAN_deRf_HA said:


> It occurs to me now why Intel has been avoiding USB 3.0. They really wanted to push Light Peak and are hoping USB 3.0 will just remain obscure until Light Peak is ready.


----------



## Phxprovost (Apr 21, 2010)

Why is it that Intel seems to come out with more sockets than CPUs these days?


----------



## Frick (Apr 21, 2010)

btarunr said:


> The present LGA-1156 package, on which Intel builds its value-through-performance processors, including the bulk of its mainstream processors, will be succeeded by the LGA-1155 package. Though similar, LGA-1155 and LGA-1156 are not cross-compatible: LGA-1155 processors will not work on existing LGA-1156 motherboards, and LGA-1156 processors will not work on LGA-1155 motherboards, either. The arrangement of vital components on these processors is similar to that of the LGA-1156 package, except that every LGA-1155 processor - dual-core or quad-core - will feature an on-die display controller.



This sentence is funny. ^^

Anyway, how old is the 1156 socket? This is just stupid imo.


----------



## Relayer (Apr 21, 2010)

Well, if the market is basically capping your processor sales at the $300.00 retail price point, you need to do something to get more money out of consumers. One solution is to make them buy a new Intel-made chipset as well if they want to upgrade their processor. I'm sure the motherboard manufacturers don't mind either.

This is a very shortsighted way of conducting business, assuming AMD manages to stick around. All bets are off if Intel can drive their only competitor out of business. Short of that though, and assuming both companies continue on with their processor road maps, I think this will bite Intel. "You can fool all of the people some of the time and some of the people all of the time, but you can't fool all of the people all of the time." This basic truism will catch up to them, given enough time.


----------



## H82LUZ73 (Apr 21, 2010)

Frick said:


> This sentence is funny. ^^
> 
> Anyway, how old is the 1156 socket? This is just stupid imo.



i7 came out Q3 '08 or early '09, I think


----------



## Imsochobo (Apr 21, 2010)

H82LUZ73 said:


> i7 came out Q3 '08 or early '09, I think



LGA1156 came out almost a year later than i7.

They came out last summer! They ain't a year old, I think! :|


----------



## Fourstaff (Apr 21, 2010)

So all these future-proof arguments have amounted to naught. Curse you, Intel!


----------



## PCpraiser100 (Apr 21, 2010)

Well, Intel has screwed me over again, next time I should consider going AMD no matter what. Intel you suck!


----------



## toyo (Apr 21, 2010)

I feel Intel's hands trying to deeply reach into my pockets... This socket-job is a highway heist. Intel deserves to be sabotaged for making people buy upgrades every 2 years... this cycle used to be much longer. 

I hope the time comes soon for AMD to become AMD again and give Intel some good old fashioned (true) competition.


----------



## inferKNOX (Apr 21, 2010)

LOL, new socket for every year:
2009 - 1366
2010 - 1156
2011 - 1155

Phew, I'm glad to be sticking with AMD!
From what I hear, the X6 will even be compatible with the AM2+ socket.


----------



## Tatty_One (Apr 21, 2010)

The CPU socket/support issue is a kick in de ballz for most mainstream users IMO, and they should feel let down by Intel, especially S1156 owners. It don't affect me too much, purely cause I have never owned a CPU or motherboard for more than a year! That does not excuse this though, I appreciate.


----------



## Tatty_One (Apr 21, 2010)

inferKNOX said:


> LOL, new socket for every year:
> 2009 - 1366
> 2010 - 1156
> 2011 - 1155
> ...



Slightly out there.... 1366 = 2008, 1156 = 2009, 1155 = 2011/12


----------



## btarunr (Apr 21, 2010)

I don't get the socket hue and cry. LGA-1366 = Q3-2008, LGA-2011 = Q3-2011 (3 years) ; LGA-1156 = Q3-2009, LGA-1155 = 2012(?) (3 years). 

A socket every 3 years is kosher.


----------



## Assimilator (Apr 21, 2010)

Right now I'm glad I'm still on S775. My PC is fast enough for what I need/want to do now; if my E8400 gets too slow within the next year I'll just grab a second-hand Q9650 and OC the hell out of it.

Next upgrade: Sandy Bridge S2011, I hope the socket will last for longer than 3 years. :/


----------



## gumpty (Apr 21, 2010)

btarunr said:


> I don't get the socket hue and cry. LGA-1366 = Q3-2008, LGA-2011 = Q3-2011 (3 years) ; LGA-1156 = Q3-2009, LGA-1155 = 2012(?) (3 years).
> 
> A socket every 3 years is kosher.



+1

It might piss off the people that upgrade their PC one piece at a time, but for people that just do a major build every few years, replacing most components, it isn't much of an issue.


----------



## Hugis (Apr 21, 2010)

Assimilator said:


> Right now I'm glad I'm still on S775. My PC is fast enough for what I need/want to do now; if my E8400 gets too slow within the next year I'll just grab a second-hand Q9650 and OC the hell out of it.:/




I'm in the same boat, but I'll prolly go AM3 next time I upgrade; really like the look of these future Phenom II X4 940T/960Ts


----------



## WarEagleAU (Apr 21, 2010)

At one point in time, I thought AMD was going to do a different Chipset for the X6 procs and intro Quad Channel Memory. Perhaps that is for their newest architecture, not what is being brought in currently.


----------



## mdm-adph (Apr 21, 2010)

Well, Intel fans -- that's just the way it goes.  Whoever's on top in the CPU war plays the socket game -- AMD did it back during the Pentium IV days, and Intel's doing it now.


----------



## DrPepper (Apr 21, 2010)

mmmm I've got my sights fixed on X68.


----------



## Altered (Apr 21, 2010)

It's ALL about the $$$$. It's business, and business is to make $$$$. Time will tell as the customers buy. If peeps keep buying while getting jerked off by any company, guess what, they will keep getting jerked off.


----------



## newtekie1 (Apr 21, 2010)

H82LUZ73 said:


> New sockets, chipsets... which is why I stuck with AMD..... Intel = performance, sure, but also every 2 years = $$$$$$$$



Yes, but with Intel, you can stick with your old hardware and still outperform AMD's offerings, so there is no need to upgrade every time a new processor is released...


----------



## kid41212003 (Apr 21, 2010)

I'm gonna upgrade!

QUAD CHANNELS OMGZ! Even though I don't even need it .

This is called innovation.


----------



## ERazer (Apr 21, 2010)

I just upgraded to 1156, and this news woulda prolly made me upset, but with TPU B/S/T I feel just fine   but next upgrade I'm definitely sticking with AMD


----------



## NeSeNVi (Apr 21, 2010)

This is what I want after my current P4 on s.478. No doubt


----------



## Kitkat (Apr 21, 2010)

again lol


----------



## pr0n Inspector (Apr 21, 2010)

I'm going to build a new i5 machine before summer and yet I do not find this piece of news distressing. Strange. Might have something to do with me not being interested in the dumb concept of "upgrade path".


----------



## Trigger911 (Apr 21, 2010)

pr0n Inspector said:


> I'm going to build a new i5 machine before summer and yet I do not find this piece of news distressing. Strange. Might have something to do with me not being interested in the dumb concept of "upgrade path".




I buy what I think will do me the best.. but AMD is the way I always go, being a poor guy lol, but AMD seems good enough for me, and some day they will be pwning haha


----------



## btarunr (Apr 21, 2010)

pr0n Inspector said:


> I'm going to build a new i5 machine before summer and yet I do not find this piece of news distressing. Strange. Might have something to do with me not being interested in the dumb concept of "upgrade path".



Go ahead and build it. LGA-1155 is two years away.


----------



## 1c3d0g (Apr 21, 2010)

PCpraiser100 said:


> Well, Intel has screwed me over again, next time I should consider going AMD no matter what. Intel you suck!



It was known for a long time (before they even came out) that S1156 was not going to be a long-term socket design. Moving/fusing the IGP into the CPU is not a trivial task, so it's to be expected that certain parameters such as pin counts can and will change. Anyway, this is technology; if you don't like it, don't upgrade. It's as simple as that.


----------



## DanishDevil (Apr 21, 2010)

With their marketing strategy, it really seems like late adopters are the ones who get bitten the worst. Early adopters get to use their sockets for about 2 years, but it's those who buy a while after something comes out that end up feeling obsolete.

I agree that it was bound to happen. Once 32nm CPUs were released, they were only released on 1156, and their integrated GPUs only worked on H55/H57 boards, of which there were very few.


----------



## jagd (Apr 21, 2010)

No offence, but you are the news editor and wrote Sandy Bridge is 2011; this makes socket 1156 ---> 1155 nearly 18 months (IIRC Sandy Bridge is expected in Q1 2011)



btarunr said:


> I don't get the socket hue and cry. LGA-1366 = Q3-2008, LGA-2011 = Q3-2011 (3 years) ; LGA-1156 = Q3-2009, LGA-1155 = 2012(?) (3 years).
> 
> A socket every 3 years is kosher.


----------



## Static~Charge (Apr 21, 2010)

Phxprovost said:


> Why is it that Intel seems to come out with more sockets than CPUs these days?



Because Intel charges royalties for the use of their socket designs. By changing them constantly, Intel gets to milk that cash cow three times: the CPU, the CPU socket, and the chipset.

Never mind that they end up pissing off their buyers, especially the enthusiasts.


----------



## btarunr (Apr 21, 2010)

jagd said:


> No offence, but you are the news editor and wrote Sandy Bridge is 2011; this makes socket 1156 ---> 1155 nearly 18 months (IIRC Sandy Bridge is expected in Q1 2011)



Sandy Bridge architecture is arriving in 2011 -> Yes, with LGA-2011. LGA-1155 in 2011 -> No, never said that.


----------



## HalfAHertz (Apr 21, 2010)

Well, I understand the need for a new socket because of the move to a 256-bit memory bus (I hope we get quad-pumped memory too  ), but the 1156->1155 transition is a little retarded. Couldn't they have played around with the layout a bit and kept the old socket?


----------



## lukesky (Apr 21, 2010)

If Intel had kept the 1156 socket you would still need a new chipset, just like the 945/965 to X38/P35 transition. You basically need a new chipset for every architectural change. AMD has basically kept the same architecture since the K8 days; that's why AM2 is compatible with AM3, etc. So, essentially, Intel is just more aggressive with architectural changes.


----------



## fullinfusion (Apr 21, 2010)

Way to go Intel keep changing sockets


----------



## MikeMurphy (Apr 21, 2010)

Why is everyone so fixated on sockets?

Aside from techies, how many people actually upgrade the CPU in their computer??  I'd venture a guess of much less than 1%.

So why would Intel not change their sockets to accommodate their latest in technology?

You should be used to it by now.  Otherwise, switch to AMD.


----------



## Wile E (Apr 21, 2010)

It's a bummer, since I am a late 1366 adopter, but honestly, how many of us enthusiasts that buy top-end hardware actually end up keeping the same mobo for years and years? Even if they kept the same socket and just released a new chipset, we were still likely to buy a new mobo with our 8-core CPU anyway.

I hate to admit it, and it leaves me feeling a little burnt, but the truth is, it doesn't really affect me that much.


----------



## mk_ln (Apr 21, 2010)

Unless I'm mistaken, wasn't Sandy Bridge supposed to be released in 2010?


----------



## Binge (Apr 21, 2010)

I think when X68 becomes available I'll still have a lot of power in the previous 1366 platform.  Whatever, let them change.  Let's see how much more powerful the procs are going to be on these new sockets.


----------



## Trigger911 (Apr 21, 2010)

It's got a fatter bus, so it should move data faster.


----------



## DrPepper (Apr 21, 2010)

Trigger911 said:


> It's got a fatter bus, so it should move data faster.



No, just because it's bigger doesn't mean it's faster. It just means you can move more at any one time.
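A toy model makes the distinction concrete (the numbers here are made up for illustration, not real bus specs):

```python
# Throughput vs latency: a wider bus raises throughput, but the fixed
# latency stays the same. All figures are invented for illustration.

def transfer_time_ns(bytes_moved: float, latency_ns: float, bytes_per_ns: float) -> float:
    # fixed access latency plus time proportional to the amount of data
    return latency_ns + bytes_moved / bytes_per_ns

# Large transfer: doubling the bus width nearly halves the time.
print(transfer_time_ns(4096, latency_ns=50, bytes_per_ns=16))  # narrow bus
print(transfer_time_ns(4096, latency_ns=50, bytes_per_ns=32))  # wide bus

# Tiny transfer: latency dominates, so the wider bus barely helps.
print(transfer_time_ns(8, latency_ns=50, bytes_per_ns=16))
print(transfer_time_ns(8, latency_ns=50, bytes_per_ns=32))
```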


----------



## Hayder_Master (Apr 21, 2010)

Removing one pin ("leg") from the CPU to make a new socket, that sucks


----------



## LittleLizard (Apr 21, 2010)

Now that I'm ACTUALLY thinking: a 256-bit wide memory controller means most mainboards will need to have 8 memory slots. Then, the fact that it has 2011 pins means it will need to be way bigger than the current LGA 1366, which is already enormous. So, the problems are: A - the ATX format is getting too short on width
      B - the MATX format is too small

Maybe they can sort it, cause X68 is just one chip, but it's just too much.


----------



## fochkoph (Apr 21, 2010)

pr0n Inspector said:


> I'm going to build a new i5 machine before summer and yet I do not find this piece of news distressing. Strange. Might have something to do with me not being interested in the dumb concept of "upgrade path".



Same here, with 1155 coming in about two years I'll probably end up upgrading about a year after that socket is introduced anyway. If the i5 750 vs i7 920 gaming performance is anything to go by for 1155 vs 2011, I have zero interest in X68 and LGA-2011. Three year upgrade paths are just about right for me.


----------



## eidairaman1 (Apr 22, 2010)

Intel = Nvidia: 2 companies that force you to upgrade and can't stand any form of competition


----------



## dr emulator (madmax) (Apr 22, 2010)

Wile E said:


> It's a bummer, since I am a late 1366 adopter, but honestly, how many of us enthusiasts that buy top-end hardware actually end up keeping the same mobo for years and years? Even if they kept the same socket and just released a new chipset, we were still likely to buy a new mobo with our 8-core CPU anyway.
> 
> I hate to admit it, and it leaves me feeling a little burnt, but the truth is, it doesn't really affect me that much.



It shouldn't, as you've got a thousand-buck processor which should (if ya don't kill it ) last for years 

I'm still going to get an i7 processor and mobo sometime this year. Why?
Well, I haven't upgraded in about 7 years :shadedshu so hopefully the new i7 
will last just as long as my current system has, and by then there might be a 24-core processor out 
that will  my eyes, brain, and wallet


----------



## DOM (Apr 22, 2010)

I'm sad, but this is going to stop me from spending for a while lol


----------



## Scrizz (Apr 22, 2010)

sweet

s775 --> s2011 4 me


----------



## qwerty_lesh (Apr 22, 2010)

1366 -> 2011 for me ! ^_^

3 years is great. Sucks for those who jumped on the Tylersburg/Bloomfield wagon much, much later, tho they shouldn't complain; they didn't pay through the nose like us early adopters did!!

The good news is, I will not be going Gulftown from my 920 haha. I mean, what's the point? If you know it's going to be superseded in a year, you'd be crazy to spend such money on a CPU and find out it's not future-proof.

I am surprised that the X68 & CPUs will have DMI; the X58 (AFAIK) are all QPI, so... Intel's dropping QuickPath Interconnect as the exclusive for their top performance platforms


----------



## TIGR (Apr 22, 2010)

The more frequently Intel releases new [incompatible] sockets/platforms, the more AMD systems I will be building for my customers, and for myself. A good upgrade path can be a strong selling point for informed builders/buyers.


----------



## MN12BIRD (Apr 22, 2010)

2 new sockets? 

Fuck this, I'm outta here.


----------



## a_ump (Apr 22, 2010)

Yea, it does seem really soon. I mean, how long did LGA775 last? Like 5 yrs? 2004 is the earliest I remember it, could be wrong, and Intel has still released CPUs for it recently. So, seeing it live this long, people got used to that and then expect the next socket to do the same. Which IMO it will; I think this is just a heads-up on what's to come. These socket changes won't happen Q1 2011 or anything.


----------



## Scrizz (Apr 22, 2010)

It's the same thing with Windows..
XP lasted too long and people had forgotten how regularly OSes changed..
Same story, different product.


----------



## [I.R.A]_FBi (Apr 22, 2010)

I love my Q6600 

LGA2011, my savings starts with my first paycheck


----------



## AsphyxiA (Apr 22, 2010)

my girlfriend just happened to be listening to "Tik Tok" by Kesha when I read this


----------



## DarthCyclonis (Apr 22, 2010)

Intel is getting carried away.  But can you blame them?  They are in business to make money.

However, the i7 1366 platform is more than sufficient for the next couple of years.   Until Windows, games, and the programs I use start utilizing multiple cores and threads a little more efficiently, there is really no need for an upgrade. 

Just a waste of money at this point.


----------



## pr0n Inspector (Apr 23, 2010)

DarthCyclonis said:


> Intel is getting carried away.  But can you blame them?  They are in business to make money.
> 
> However, the i7 1366 platform is more than sufficient for the next couple of years.   Until Windows, games, and the programs I use start utilizing multiple cores and threads a little more efficiently, there is really no need for an upgrade.
> 
> Just a waste of money at this point.



This is only a waste of money if the user is suffering from Upgrading OCD.


----------



## DrPepper (Apr 23, 2010)

pr0n Inspector said:


> This is only a waste of money if the user is suffering from Upgrading OCD.



Well imagine all the people on P4's who might want to upgrade. The way I see it is if you didn't get core i7 you should get Sandy Bridge and if you got core i7 it would be sensible to get the next iteration. Not sure if that makes sense.


----------



## TIGR (Apr 23, 2010)

DrPepper said:


> Well imagine all the people on P4's who might want to upgrade. The way I see it is if you didn't get core i7 you should get Sandy Bridge and if you got core i7 it would be sensible to get the next iteration. Not sure if that makes sense.



Makes perfect sense.

Just would be nice if Sandy Bridge chips were on a compatible platform, considering the simpler offerings of Intel's competitor.


----------



## newtekie1 (Apr 23, 2010)

eidairaman1 said:


> Intel = Nvidia: 2 companies that force you to upgrade and can't stand any form of competition



I don't see how nVidia forces anyone to upgrade.  Can you elaborate on that some more please?


----------



## DrPepper (Apr 23, 2010)

TIGR said:


> Makes perfect sense.
> 
> Just would be nice if Sandy Bridge chips were on a compatible platform, considering the more simple offerings of Intel's competitor.



I think if Intel could, they would, but they had to reorganise the whole socket in order for Sandy Bridge to work.


----------



## a_ump (Apr 23, 2010)

[I.R.A]_FBi said:


> I love my Q6600
> 
> LGA2011, my savings starts with my first paycheck



Hear, hear. I too love my Q6600. At 3.4GHz, it still runs games fine for me. I'll upgrade my graphics card before my CPU. 

It's ridiculous how long multi-core chips have been out - dual, tri, quad, now hexa - and applications still aren't optimized for multiple cores/threads. Programmers need to step up and at least optimize CPU usage. Look at Valve; they did a great job on multi-core rendering with TF2 and L4D.


----------



## theubersmurf (Apr 23, 2010)

Phxprovost said:


> why is it that Intel seems to come out with more sockets then cpu's these days?


Because they hate you and love your money? I'm not sure.


----------



## Deleted member 67555 (Apr 23, 2010)

I gave up Intel with the 775 switch to 1366...

AMD has made some switches in that time too.. But they have mostly been compatible..
So when Bulldozer comes out I'll switch to that.. and never go Intel again..
Too many sockets in so little time... It is BS.. Some are willing to accept it; I'm not


----------



## LAN_deRf_HA (Apr 23, 2010)

Apparently the 1155 dual-cores will have a TDP of 35 W, and no processor will have a TDP over 45 W (probably the quads). So I'm guessing 6+ core procs won't be available for it.


----------



## FordGT90Concept (Apr 23, 2010)

That's what Intel gets for moving the North Bridge to the CPU--constant socket upgrades.


----------






## TIGR (Apr 24, 2010)

Is there no way for Intel to create a modular platform that can provide buses, links, sockets, etc. that can span many generations? And what do they gain by releasing two sockets at a time, and so close to the last two? Product line/platform differentiation? Is it that valuable?


----------



## FordGT90Concept (Apr 24, 2010)

The main reasons are voltage, DIMM slots, and the south bridge.  If you put a processor into a motherboard that doesn't have the correct pinout for those things, something is going to get damaged.  They change the sockets to make sure those mistakes don't have a chance of happening.

LGA-2011 = quad-channel, two-way
LGA-1366 = tri-channel, two-way
LGA-1156 = dual-channel, one-way

I'm not sure why they are changing 1156 to 1155, though; it could be voltages.


----------



## Wile E (Apr 24, 2010)

dr emulator (madmax) said:


> It shouldn't, as you've got a thousand-buck processor which should *(if ya don't kill it* ) last for years
> 
> I'm still going to get an i7 processor and mobo sometime this year. Why?
> Well, I haven't upgraded in about 7 years :shadedshu so hopefully the new i7
> ...



That's a big, BIG if. lol


----------



## erocker (Apr 24, 2010)

If we can get more USB 3.0 and SATA III with more PCI-E bandwidth, it's a win. It's unfortunate that they can't do this with a compatible socket.


----------



## Wile E (Apr 24, 2010)

jmcslob said:


> I gave up Intel with the 775 switch to 1366...
> 
> AMD has made some switches in that time too.. But they have mostly been compatible..
> So when Bulldozer comes out I'll switch to that.. and never go Intel again..
> Too many sockets in so little time... It is BS.. Some are willing to accept it; I'm not



You realize that AMD has done the same exact thing, don't you? Remember 939 and 754, and their short lives? It happens in both camps. 1366 will have been out around 3 years when this new socket releases. That's actually reasonable. Sucks from a late adopter's point of view, but it's really not that bad. I'll move to the new socket on the refresh. This 980X will be plenty powerful enough until then.


----------



## Deleted member 67555 (Apr 24, 2010)

Yeah... I do LOL..
It's just not been nearly as often.. and when people got upset and cried about it, AMD got the message (to a point), and yet they will be doing a socket change pretty soon as well.. But AM2/AM2+/AM3 will have lasted about 5 years..

Hey, don't get me wrong, I like that Intel does this, as it keeps the average Joe confused as to what works with what, and gives people jobs; it's just not for me..


----------



## kid41212003 (Apr 24, 2010)

Sure, they do last 5 years, but with no significant improvement between CPUs, or you would end up with a slowed-down CPU on the old mobo.


----------



## TheMailMan78 (Apr 24, 2010)

I really don't see what the big deal is. People get mad when companies don't progress in technology (AMD), and they get mad when they progress too fast (Intel). The bottom line is that Intel or AMD changing sockets so fast affects a very small group (enthusiasts) but makes no difference to anyone else.

I applaud Intel for advancing the industry, and for us it shouldn't make any difference. Why? Because we change our rigs every other year anyway!

Also, if you get down to brass tacks, what takes advantage of an i7 currently? I mean, what really makes them scream for mercy?...........yeah, that's what I thought.


----------



## Deleted member 67555 (Apr 24, 2010)

LOL I know I change mine about every 9 months...(if that)
I use the Tick-Tock as well but for me it's CPU-tick-hard drives-tock Video Card-tick Memory-tock


----------



## Trigger911 (Apr 24, 2010)

TheMailMan78 said:


> I really don't see what the big deal is. People get mad when companies don't progress in technology (AMD), and they get mad when they progress too fast (Intel). The bottom line is that Intel or AMD changing sockets so fast affects a very small group (enthusiasts) but makes no difference to anyone else.
> 
> I applaud Intel for advancing the industry, and for us it shouldn't make any difference. Why? Because we change our rigs every other year anyway!
> 
> Also, if you get down to brass tacks, what takes advantage of an i7 currently? I mean, what really makes them scream for mercy?...........yeah, that's what I thought.



I totally agree; that's the world of technology, it's always moving, changing, and merging


----------



## DrPepper (Apr 24, 2010)

Again like everyone is saying, big deal it's not like my i7 will be any less good. Hell a Q6600 is still a decent processor these days.


----------



## HillBeast (Apr 25, 2010)

All these people going 'I'm glad I'm sticking with AMD' need to realise that, sure, AMD don't bring out heaps of sockets all the time, but think about when K8 came out: they had 754, 939, then AM2, and none were compatible with each other, and this was happening very close together.

The other thing is, AMD is most likely (like a 99% chance) going to bring out a new socket for Fusion, because I personally see no way they can bring out that many features without changing the socket. And now that Intel has moved the IOH/MCH into the CPU, this will happen a lot: features that would once have been a simple motherboard update now require a full-on socket change.

Don't get me wrong, I think it's ridiculous they had to change so damned quickly, but when you really think about it, Intel has never been a cheap processor manufacturer. They have always been just the best. It's as simple as that.


----------



## HillBeast (Apr 25, 2010)

TheMailMan78 said:


> Also, if you get down to brass tacks, what takes advantage of an i7 currently? I mean, what really makes them scream for mercy?...........yeah, that's what I thought.



Prime95 is the only thing that comes to mind, and that's not a real world app.


----------



## eidairaman1 (Apr 25, 2010)

Ok Remember this Athlon 64, 754 was Single Channel DDR MC, 940 was Dual Channel DDR, 939 came about to replace 754, AM2 was for DDR2, AM3 was to prevent AM2 cpus from being inserted into AM3 boards due to the Memory Controller being DDR2 only and AM3 being DDR3. AM2, AM2+, AM3 are all technologically the same just improvements etc of PhII and faster. I suspect Bulldozer to be on a different Socket in 2011. TBH Im not holding my breath as I will get a PHII X6 1090T BE with a 890FX board, 5870 or 5890, HT Omega Sound card, 8 Gigs DDR3, 



HillBeast said:


> All these people going 'I'm glad I'm sticking with AMD' need to realise that, sure, AMD don't bring out heaps of sockets all the time, but think about when K8 came out: they had 754, 939, then AM2, and none were compatible with each other, and this was happening very close together.
> 
> The other thing is, AMD is most likely (like a 99% chance) going to bring out a new socket for Fusion, because I personally see no way they can bring out that many features without changing the socket. And now that Intel has moved the IOH/MCH into the CPU, this will happen a lot: features that would once have been a simple motherboard update now require a full-on socket change.
> 
> Don't get me wrong, I think it's ridiculous they had to change so damned quickly, but when you really think about it, Intel has never been a cheap processor manufacturer. They have always been just the best. It's as simple as that.


----------



## kid41212003 (Apr 25, 2010)

That's a Phenom II with two extra cores (45 nm), so obviously it won't OC as well. If you already have a Phenom II quad, there's no reason to upgrade. Besides: dual-channel 8 GB < triple-channel 6 GB < quad-channel Socket 2011.


----------



## FordGT90Concept (Apr 25, 2010)

TheMailMan78 said:


> Also, if you get down to brass tacks, what takes advantage of an i7 currently? I mean, what really makes them scream for mercy? ...Yeah, that's what I thought.


Like any other processor, an app that loads the cores to 100%.  I've coded many of those.  They make Pidgin run delayed (between message sent and confirmation sound) and IE8 take forever to open. XD

The app attached to this post will make any processor (or multiple processors) beg for mercy for five minutes.


----------



## Wile E (Apr 25, 2010)

eidairaman1 said:


> OK, remember this: on Athlon 64, Socket 754 had a single-channel DDR memory controller, Socket 940 was dual-channel DDR, and 939 came along to replace 754. AM2 was for DDR2, and AM3's keying prevents AM2 CPUs from being inserted into AM3 boards, since their memory controllers are DDR2-only while AM3 is DDR3. AM2, AM2+, and AM3 are all technologically the same platform, just incremental improvements. I suspect Bulldozer will be on a different socket in 2011. TBH I'm not holding my breath; I'll get a Phenom II X6 1090T BE with an 890FX board, a 5870 or 5890, an HT Omega sound card, and 8 GB of DDR3.


Wrong. 754 was single-channel, but 939 was not its dual-channel replacement. It was just like what Intel is doing now: a mainstream socket and an enthusiast socket running simultaneously. 940 was an Opteron server socket. Then came the rest. They have been able to keep the same socket because they have not added any major new features to the CPU, unlike Intel moving its chipset functions into the CPU.


----------



## TheMailMan78 (Apr 25, 2010)

FordGT90Concept said:


> Like any other processor, an app that loads the cores to 100%.  I've coded many of those.  They make Pidgin run delayed (between message sent and confirmation sound) and IE8 take forever to open. XD
> 
> The app attached to this post will make any processor (or multiple processors) beg for mercy for five minutes.



Well, yeah, there are apps that will do it. But how many people need to make an i7 scream? I think you are over-analyzing again, Ford.


----------



## HillBeast (Apr 25, 2010)

TheMailMan78 said:


> Well, yeah, there are apps that will do it. But how many people need to make an i7 scream? I think you are over-analyzing again, Ford.



Yeah. In real-world situations (gaming, office work, video editing, etc.), an i7 never gets pushed to its limits. Very few apps can use all 8 threads. Most games barely use 4 of them.


----------



## FordGT90Concept (Apr 25, 2010)

Video editing could depending on what you are doing and how the software is programmed.
Gaming only uses 3-4 cores at most.
Office work is fine on 1 core.

New games may use all 8 threads, but 4 of them sit at barely more than idle.  Any software that does real work while fully loading 8 cores will take twice as long on a quad core, four times as long on a dual core, and so on.  Programmers try to avoid creating that much strain on systems unless it is unavoidable (like BOINC/F@H) or intentional (like the app I linked to).
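That scaling arithmetic can be written down directly. As a best case, a fully parallel job's runtime is just work divided by cores; this is a sketch (names made up for illustration) that ignores Amdahl's law and memory contention, so real software scales worse:

```cpp
// Best-case runtime for a perfectly parallel workload: total work units
// divided by the number of cores crunching them. Ignores serial sections
// and memory contention, so it is an upper bound on scaling, not a prediction.
constexpr double ideal_runtime(double work_units, int cores) {
    return work_units / cores;
}

// Slowdown factor when the same fully parallel job runs on fewer cores:
// 8 cores -> 4 cores is 2x longer, 8 -> 2 is 4x longer.
constexpr double slowdown(int fewer_cores, int more_cores) {
    return static_cast<double>(more_cores) / fewer_cores;
}
```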


----------



## Relayer (Apr 25, 2010)

Do not confuse cores and threads. While your i7s have 8 threads (except the 980X), they only have 4 cores. Splitting a core into 2 threads usually improves performance, but it doesn't come close to doubling it the way adding another core would. Going from one core to two cores, of course.


----------



## FordGT90Concept (Apr 25, 2010)

When I say 8 cores, it's pretty obvious I mean logical and physical cores combined.  My Core i7 has 8 cores (4 physical plus 4 logical).  My Xeon 5310 server also has 8 cores (8 physical).

I wrote a multithreaded application for benchmarking using a simple counting scheme.  Performance was in excess of four times better with hyperthreading enabled than without.  SMT, when done right, means the actual number of physical cores is irrelevant.  The more work thrown at the CPU as a whole, the better the performance.

On applications that aren't heavily multithreaded, the architecture of Core i# is inferior to that of Core 2 and Phenom II.
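In spirit, the counting scheme looks something like this (a C++ sketch improvised here, not the actual app; `run_counting` and its parameters are made up for illustration). Each thread counts to a fixed target in a register-resident loop, and timing it at N threads versus 2N threads shows how much SMT helps:

```cpp
#include <atomic>
#include <cstdint>
#include <thread>
#include <vector>

// Hypothetical sketch of a "simple counting scheme" benchmark. Each
// thread increments a local counter to a fixed target (the hot loop
// stays in registers), then folds its result into one atomic total.
// Timing this at hardware_concurrency() vs. twice that many threads
// shows whether SMT keeps otherwise idle execution units fed.
std::uint64_t run_counting(unsigned threads, std::uint64_t per_thread) {
    std::atomic<std::uint64_t> total{0};
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < threads; ++t) {
        pool.emplace_back([&total, per_thread] {
            std::uint64_t local = 0;
            for (std::uint64_t i = 0; i < per_thread; ++i) ++local;
            total += local;                // one atomic add per thread
        });
    }
    for (auto& th : pool) th.join();
    return total;                          // threads * per_thread
}
```

The total is deterministic (threads × per-thread count), so only the wall-clock time varies between runs.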


----------



## TheMailMan78 (Apr 25, 2010)

FordGT90Concept said:


> When I say 8 cores, it's pretty obvious I mean logical and physical cores combined.  My Core i7 has 8 cores (4 physical plus 4 logical).  My Xeon 5310 server also has 8 cores (8 physical).
> 
> I wrote a multithreaded application for benchmarking using a simple counting scheme.  Performance was in excess of four times better with hyperthreading enabled than without.  SMT, when done right, means the actual number of physical cores is irrelevant.  The more work thrown at the CPU as a whole, the better the performance.
> 
> On applications that aren't heavily multithreaded, the architecture of Core i# is inferior to that of Core 2 and Phenom II.



Good breakdown, my friend.


----------



## HillBeast (Apr 26, 2010)

FordGT90Concept said:


> On applications that aren't heavily multithreaded, the architecture of Core i# is inferior to that of Core 2 and Phenom II.



How?


----------



## Relayer (Apr 26, 2010)

FordGT90Concept said:


> When I say 8 cores, it's pretty obvious I mean logical and physical cores combined.  My Core i7 has 8 cores (4 physical plus 4 logical).  My Xeon 5310 server also has 8 cores (8 physical).
> 
> I wrote a multithreaded application for benchmarking using a simple counting scheme.  Performance was in excess of four times better with hyperthreading enabled than without.  SMT, when done right, means the actual number of physical cores is irrelevant.  The more work thrown at the CPU as a whole, the better the performance.
> 
> On applications that aren't heavily multithreaded, the architecture of Core i# is inferior to that of Core 2 and Phenom II.




Not trying to get into a pissing contest with you; I'll concede you have more knowledge on the subject. There are just those out there who would misinterpret what you say and think that a quad-core i7 is in actuality an 8-core processor, rather than a quad core with 8 threads.

I'm curious, though: out of, say, the top 100 commercial programs (just picking a number here), what percentage would scale the way the counting program you wrote did, at >4x faster with hyperthreading than without? I'd imagine the percentage would be pretty small?


----------



## HillBeast (Apr 26, 2010)

Relayer said:


> There are just those out there though who would misinterpret what you say and think that a quad core i7 is in actuality an 8 core processor rather than a quad core with 8 threads.



People who reckon it is an octa-core chip obviously are blind and haven't read the box. Quoting my i7-930 box:

[Intel Core i7 Inside Logo]
QUAD-CORE
DESKTOP
INTEL CORE i7 PROCESSOR

Also, about the whole Core 2s being better than Core i7s in poorly threaded apps: it's simply not true. I wrote a program years ago for comparing P4s, and it tests the CPU's per-thread performance. Per MHz, Nehalem is faster than Core. I can't remember exact numbers, but if memory serves me right, my old Core 2 E8400 at 3.0GHz got the same score as a Core i7 920 at 2.66GHz.


----------



## FordGT90Concept (Apr 26, 2010)

HillBeast said:


> How?


Ask Intel. 




Relayer said:


> I'm curious, though: out of, say, the top 100 commercial programs (just picking a number here), what percentage would scale the way the counting program you wrote did, at >4x faster with hyperthreading than without? I'd imagine the percentage would be pretty small?


How many applications *need* to use all the power a computer possesses?  None.  The only programs that do use as much power as is available are the likes of conversion applications (encoders, decoders, compilers, assemblers, converters, etc.) where waiting is a burden.  The percentage among the "top 100 commercial programs" maps directly to what percentage perform those duties.  It would be pretty small, but at the same time, it is those applications that are pushing the industry toward faster processors.




HillBeast said:


> Also, about the whole Core 2s being better than Core i7s in poorly threaded apps: it's simply not true. I wrote a program years ago for comparing P4s, and it tests the CPU's per-thread performance. Per MHz, Nehalem is faster than Core. I can't remember exact numbers, but if memory serves me right, my old Core 2 E8400 at 3.0GHz got the same score as a Core i7 920 at 2.66GHz.


If you have a Core 2 machine, we could certainly test that again.  I can limit the application I used before to just a single thread.

What makes my app unique is that, theoretically, it can linger in the processor's L1/L2 caches, eliminating the main bottleneck on Core 2 machines (the extra latency of going out to the northbridge, to RAM, and back again).  Low scores (like my Core i7 920 without hyperthreading) are most likely caused by cache collisions, where the core(s) had to run all the way out to RAM instead of staying on the processor.  Little faults like that are all it takes to decide who gets ahead.
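The cache-residency idea can be sketched like this (sizes and names are illustrative, not taken from the app): walk a buffer repeatedly, and while it fits in L1/L2 the passes stay on-die; once it outgrows the last-level cache, every pass pays the trip to RAM.

```cpp
#include <cstddef>
#include <cstdint>
#include <numeric>
#include <vector>

// Sketch of a working-set sweep: time walk_buffer() at increasing sizes
// and throughput drops sharply each time the buffer outgrows a cache
// level (tens of KB for L1, a few hundred KB for L2, a few MB for L3).
// Filling with 1s makes the result predictable: bytes * passes.
std::uint64_t walk_buffer(std::size_t bytes, int passes) {
    std::vector<std::uint8_t> buf(bytes, 1);
    std::uint64_t sum = 0;
    for (int p = 0; p < passes; ++p)
        sum += std::accumulate(buf.begin(), buf.end(), std::uint64_t{0});
    return sum;
}
```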


----------



## HillBeast (Apr 26, 2010)

FordGT90Concept said:


> If you have a Core 2 machine, we could certainly test that again.  I can limit the application I used before to just a single thread.



I have tested it again, several times. I didn't believe such a boost could come from a simple architecture change, but it did. Before I even got my i7, I tested it and made sure it wasn't being unfair to the Core 2, and it simply wasn't. I have tested this on heaps of CPUs in the past, and it is a very fair test of the processor.

I highly doubt Intel would be dumb enough to release a chip worse than its predecessor after the fiasco they had with NetBurst.

Bottom line: i7 is more powerful in every way, multi- or single-threaded.


----------



## Relayer (Apr 26, 2010)

> People who reckon it is an octa-core chip obviously are blind and haven't read the box. Quoting my i7-930 box:
> 
> [Intel Core i7 Inside Logo]
> QUAD-CORE
> ...



Not all processors come with the box; most come inside an assembled PC. I've seen a professional animator who believed his i7 920 had 8x 2.67GHz cores because there are 8 rendering threads and it's rated at 2.67GHz. He didn't build his workstation. He bought it, though, because the person selling it led him to believe that while it wasn't really an 8-core processor, it was the same thing. I was just trying to avoid that type of misinformation. If it's that obvious, though, carry on.


----------



## HillBeast (Apr 26, 2010)

Relayer said:


> Not all processors come with the box; most come inside an assembled PC. I've seen a professional animator who believed his i7 920 had 8x 2.67GHz cores because there are 8 rendering threads and it's rated at 2.67GHz. He didn't build his workstation. He bought it, though, because the person selling it led him to believe that while it wasn't really an 8-core processor, it was the same thing. I was just trying to avoid that type of misinformation. If it's that obvious, though, carry on.



True.


----------



## lukesky (Apr 27, 2010)

FordGT90Concept said:


> On applications that aren't heavily multithreaded, the architecture of Core i# is inferior to that of Core 2 and Phenom II.

No. Core i7 is faster in single-threaded applications. It has higher performance per clock than any other architecture. Look at how it fares in gaming compared to Phenoms and Core 2 Quads.


----------



## DrPepper (Apr 27, 2010)

lukesky said:


> No. Core i7 is faster in single-threaded applications. It has higher performance per clock than any other architecture. Look at how it fares in gaming compared to Phenoms and Core 2 Quads.



Not much better. It does have a higher instructions-per-clock rate, 23.9 compared to 18.6 for a Core 2 Quad Yorkfield, so yes, it is faster, but not by a huge margin.


----------



## LAN_deRf_HA (Apr 27, 2010)

lukesky said:


> No. Core i7 is faster in single-threaded applications. It has higher performance per clock than any other architecture. Look at how it fares in gaming compared to Phenoms and Core 2 Quads.



Games? That's the last thing you should look at for a Yorkfield vs. i7 comparison. There's zero performance difference with single-GPU setups; a slight edge only shows up in dual-GPU situations. Random productivity benchmarks are where the difference is.

Look here at the bottom: http://www.anandtech.com/bench/Product/45?vs=48

Two games are tied, and in the other two either one wins. That sums up the whole of the video game comparison... it's even.


----------



## D007 (Apr 27, 2010)

All I see is marginal improvement for far too much cost.
This is a bad financial move for Intel, exactly like the PS3 was for Sony.


----------



## FordGT90Concept (Apr 27, 2010)

HillBeast said:


> I highly doubt Intel would be dumb enough to release a chip worse than it's predecessor after the fiasco they had with Netburst.


Nehalem practically is Netburst with a new dress.  The only real difference is the pipelines are quite a bit shorter (the mistake in NetBurst was the assumption they could keep increasing the clockspeeds indefinitely).


Look at the most to least portion of the results here:
http://forums.techpowerup.com/showpost.php?p=1686222&postcount=119

Four times was an exaggeration; it was (as expected) almost twice as fast with hyperthreading enabled.  No hyperthreading = very low scores, except in the floating-point department.  It got hurt very badly in the integer department without hyperthreading.  That proves that over half of the ALUs sit idle without two threads throwing work at them, whereas the FPUs are mostly loaded with just a single thread giving them work.


As you can see, I haven't done any testing at only one thread yet.



lukesky said:


> No. Core i7 is faster in single-threaded applications. It has higher performance per clock than any other architecture. Look at how it fares in gaming compared to Phenoms and Core 2 Quads.


Phenom IIs win 50% of the time.  Core 2 loses all the time because of the delay in reaching RAM, which is minimized in Phenom II and Core i#.


----------



## Wile E (Apr 27, 2010)

FordGT90Concept said:


> Nehalem practically is Netburst with a new dress.  The only real difference is the pipelines are quite a bit shorter (the mistake in NetBurst was the assumption they could keep increasing the clockspeeds indefinitely).
> 
> 
> Look at the most to least portion of the results here:
> ...



Numerous tests on i7 with HT disabled show that it's faster clock for clock than both AMD and Core 2. It's all over the net. Just look at the 750's benchmarks.


----------



## HillBeast (Apr 27, 2010)

FordGT90Concept said:


> As you can see, I haven't done any testing at only one thread yet.



That's what I was talking about. 



> Also, about the whole Core 2s being better than Core i7s in poorly threaded apps: it's simply not true. I wrote a program years ago for comparing P4s, and it tests the CPU's per-thread performance. Per MHz, Nehalem is faster than Core.



I never said it was better in multithreaded apps, but from your absolutely confusing chart, all I can understand is that the i7 works better when you turn on HT, which any smart person knows already. It does, after all, double the number of threads per core.


----------



## FordGT90Concept (Apr 27, 2010)

Wile E said:


> Numerous tests on i7 with HT disabled show that it's faster clock for clock than both AMD and Core 2. It's all over the net. Just look at the 750's benchmarks.


Edit: Clock for clock, Core 2 (Penryn) is about equal to Phenom II.  Core i7 is clock-for-clock faster than Core 2 (and thus Phenom II) in multithreading, but I'm not certain about single-threaded. 


uint64: Core i7 920 < Phenom II 955
double: Core i7 920 > Phenom II 955 (not by much on the 4 thread test)

The Core i7 920 is better at floating-point operations than the Phenom II 955, but not at integer operations.  If you are testing with an application that is heavy on floats, the Core i7 will come out on top (with hyperthreading disabled).  If the application is heavy on ints, the Core i7 (hyperthreading disabled) will come in second.

With hyperthreading on and the workload kept the same (4 threads each), the Phenom II 955 and Core i7 920 are very close (i7 on top), while the Core i7 920 slaughters the Phenom II 955 in the floating-point area.  Turn on hyperthreading and add 4 more threads, and the Core i7 920 simply mocks the Phenom II 955.
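Two kernels in the spirit of that uint64-versus-double split (illustrative only; this is not the benchmark's actual code). Timing each separately shows whether a CPU's integer or floating-point pipes are the bottleneck:

```cpp
#include <cstdint>

// Illustrative ALU-bound kernel: a dependent chain of integer
// multiply-adds, so throughput is limited by the integer pipes.
std::uint64_t int_kernel(std::uint64_t n) {
    std::uint64_t acc = 1;
    for (std::uint64_t i = 0; i < n; ++i) acc = acc * 3 + 1;
    return acc;
}

// Illustrative FPU-bound kernel: the same shape in doubles, so
// throughput is limited by the floating-point pipes instead.
// 1.5 and 0.25 are exactly representable, keeping results deterministic.
double fp_kernel(std::uint64_t n) {
    double acc = 1.0;
    for (std::uint64_t i = 0; i < n; ++i) acc = acc * 1.5 + 0.25;
    return acc;
}
```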




HillBeast said:


> I never said it was better in multithreaded apps, but from your absolutely confusing chart, all I can understand is that the i7 works better when you turn on HT, which any smart person knows already. It does, after all, double the number of threads per core.


Read the results at the bottom for analysis.  The test was to compare HT to no HT, two processors versus one, and AMD versus Intel (they were about the same price when the benchmark was done).


----------



## HillBeast (Apr 27, 2010)

Right, so here are some results from my testing program.

When I first wrote it, P4s were the big thing, and I made it to compare them. I first ran it on my P4 2.4GHz Northwood, and I have been using it on every CPU I've had since. Here are some quick results:


```
Arch.       Processor                              Core Clock   Score   Points/MHz
Nehalem     Intel Core i7 930                      3800 MHz     11704   3.08
Nehalem     Intel Core i7 860                      2860 MHz      8780   3.07
Nehalem     Intel Core i7 920                      2793 MHz      8520   3.05
Core        Intel Core 2 Duo E8400                 3000 MHz      8677   2.89
Core        Intel Core 2 Duo E6550                 2333 MHz      5577   2.39
Pentium M   Intel Pentium M 1.86 GHz               1866 MHz      3972   2.19
Pentium M   Intel Celeron M 420                    1600 MHz      3478   2.17
K8          AMD Athlon 64 X2 4600+                 2400 MHz      4748   1.98
K8          AMD Athlon 64 X2 4200+                 2200 MHz      4348   1.98
K7          AMD Athlon XP Barton (underclocked)    1722 MHz      3137   1.81
K7          AMD Duron 800+                          800 MHz      1330   1.66
Netburst    Intel Pentium 4 540                    3000 MHz      3499   1.17
Netburst    Intel Pentium 4 Northwood              2400 MHz      2985   1.14
```
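For reference, the Points/MHz column is just the raw score divided by the core clock (e.g. 11704 / 3800 = 3.08):

```cpp
#include <cmath>

// Points/MHz as used in the table above: raw score divided by core clock
// in MHz. The same arithmetic applies to every row.
double per_mhz(double score, double mhz) { return score / mhz; }
```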

And if you look back at history, these scores really do correlate well: K7 was faster than NetBurst per clock, Pentium M was much faster than NetBurst, and Core was faster still. Nehalem gets the highest score; therefore, Nehalem is the best.

Anyone who doubts these findings is either a fanboy or just plain stupid.


----------



## FordGT90Concept (Apr 27, 2010)

Wanna send me that app so I can try it?


----------



## HillBeast (Apr 27, 2010)

Here is my testing app. Ignore the Total Processor Score, as it is inaccurate: it simply theorizes what the CPU could do if all the cores worked perfectly together in multithreaded applications. I didn't have the know-how to write a multithreaded program back when I wrote it. The Per Thread Score is still accurate, though.

Oh this is a rewrite of the original program for P4s.

Let me know what you get.


----------



## LAN_deRf_HA (Apr 27, 2010)

I've never seen a review that showed Phenom II being even with Core 2. Close in some things, ahead in a few select things, but lagging in the vast majority. http://www.anandtech.com/bench/Product/48?vs=88

Edit: I got 11468.


----------



## FordGT90Concept (Apr 27, 2010)

My Core i7 920 matches your score (within a few dozen points).


```
Core		Intel Xeon E5310 (dual processor)		1600MHz		4336	2.71
```

What type of calculations does this application do?  Lemme guess: floating point?




LAN_deRf_HA said:


> I've never seen a review that showed Phenom II being even with Core 2. Close in some things, ahead in a few select things, but lagging in the vast majority. http://www.anandtech.com/bench/Product/48?vs=88


According to that, Phenom II 965 BE and Core 2 Q9650 are well matched, Core 2 being slightly more efficient clock for clock.


----------



## LAN_deRf_HA (Apr 27, 2010)

I usually don't consider mismatched stock-speed comparisons, as Phenom and Core 2 chips have roughly equal clock-speed potential, so clock for clock is most relevant for enthusiasts.


----------



## HillBeast (Apr 27, 2010)

FordGT90Concept said:


> What type of calculations does this application do?  Lemme guess: floating point?



I can't remember. It was a long time ago. I don't think so though. I'm pretty sure it was just simple arithmetic.


----------



## FordGT90Concept (Apr 27, 2010)

I haven't tested any Core 2s and you didn't list any Phenom IIs so it's hard to line up my multithreaded charts to your single-threaded chart.  Still, note how close Core 2 (Penryn) is to Core i7 despite the major architectural changes (namely, moving the memory controllers to the chip).


----------



## HillBeast (Apr 27, 2010)

FordGT90Concept said:


> I haven't tested any Core 2s and you didn't list any Phenom IIs so it's hard to line up my multithreaded charts to your single-threaded chart.  Still, note how close Core 2 (Penryn) is to Core i7 despite the major architectural changes (namely, moving the memory controllers to the chip).



Yeah after my bad experience with my original Phenom, I promised myself to stay away from them. I was an AMD boy until I got my first Phenom and it really let me down.

I don't think my program is affected by memory much, hence the very small difference.


----------



## FordGT90Concept (Apr 27, 2010)

I predicted (back on Hardware Analysis) that the Phenom was going to suck before it even had a name (it was referred to as K8L or K10 at the time).  I avoided them like the plague when they finally launched, something like 18 months late.  I'm glad they finally got off their rubbish 65 nm process with the Phenom II and Athlon II processors, though (they are decent for mainstream systems).


----------

