Monday, December 31st 2012

Arctic Leaks Bucket List of Socket LGA1150 Processor Model Numbers

CPU cooler manufacturer Arctic (aka Arctic Cooling) may have inadvertently leaked a very long list of 4th generation Intel Core processors built for the LGA1150 socket. Longer than any currently posted list of Core "Haswell" processors, the leak includes model numbers of nine Core i7, seventeen Core i5, five Core i3, and two Pentium models. Among the Core i7 models are the already-known i7-4770K flagship chip, the i7-4770S, and a yet-unknown i7-4765T. The Core i5 processor list is exhaustive, and it appears that Intel wants to leave no price-point unattended. The Core i5-4570K could interest enthusiasts. In comparison to the Core i5 list, the LGA1150 Core i3 list is surprisingly short, indicating Intel is serious about phasing out dual-core chips. The Pentium LGA1150 list is even shorter.

The list of LGA1150 processor models appears to have been leaked in the data-sheet of one of Arctic's coolers, in the section that lists compatible processors. LGA1150 appears to have the exact same cooler mount-hole spacing as the LGA1155 and LGA1156 sockets, so upgrading your CPU cooler shouldn't be on your agenda. Intel's 4th generation Core processor family is based on Intel's brand-new "Haswell" micro-architecture, which promises higher per-core performance and significantly faster integrated graphics than the previous generation. Built on Intel's now-mature 22 nm silicon fabrication process, the new chips will begin to roll out in the first half of 2013.
Source: Expreview
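For illustration, cooler compatibility across these sockets comes down to a single mechanical figure: the spacing of the motherboard mount holes. Below is a minimal Python sketch of that check; the spacing table uses the commonly cited figures (75 mm square for the LGA115x sockets, 72 mm for LGA775, 80 mm for LGA1366), and the `cooler_fits` helper is purely illustrative, not anything taken from Arctic's data-sheet.

```python
# Hedged illustration: cooler compatibility reduces to mount-hole spacing.
# Spacing values are the commonly cited mechanical figures; the helper
# function and socket list are illustrative, not from Arctic's data-sheet.
MOUNT_HOLE_SPACING_MM = {
    "LGA775": 72,
    "LGA1156": 75,
    "LGA1155": 75,
    "LGA1150": 75,
    "LGA1366": 80,
}

def cooler_fits(cooler_sockets, target_socket):
    """True if any socket the cooler ships brackets for shares the target's
    mount-hole spacing (ignoring heatsink height and clearance issues)."""
    target = MOUNT_HOLE_SPACING_MM[target_socket]
    return any(MOUNT_HOLE_SPACING_MM[s] == target for s in cooler_sockets)

print(cooler_fits(["LGA1155", "LGA1156"], "LGA1150"))  # True: same 75 mm grid
print(cooler_fits(["LGA775"], "LGA1150"))              # False: 72 mm vs 75 mm
```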

58 Comments on Arctic Leaks Bucket List of Socket LGA1150 Processor Model Numbers

#2
btarunr
Editor & Senior Moderator
trijesh313: wow, there are 2 unlocked i7s
Yes, the i7-4670K is a 4770K with a GT2 IGP.
#3
Frick
Fishfaced Nincompoop
Prima.Vera: So no more for LGA1155??
Of course not.
#4
de.das.dude
Pro Indian Modder
Prima.Vera: So no more for LGA1155??
Yeah, Intel usually doesn't change sockets like this. Oh wait, they do :roll:
#5
Nordic
Two CPU generations per socket seems reasonable, though I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.
#6
micropage7
At least they need to use the same retention, but too bad: new socket = different retention.
#7
Aquinus
Resident Wat-man
james888: I bet Intel could have made Haswell 1155 if they had wanted to.
I'm sure they could have, but I'm willing to bet that they wouldn't have been able to implement the VRM control changes (Haswell moves voltage regulation on-package) if they hadn't switched.
#8
dj-electric
Wow, someone's gonna get their ass kicked hard.
#9
3870x2
james888: Two CPU generations per socket seems reasonable, though I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.
These are not full generations, though. Look at 775; that was great. I guess Intel is on the motherboard manufacturers' payroll.
#10
Zubasa
3870x2: These are not full generations, though. Look at 775; that was great. I guess Intel is on the motherboard manufacturers' payroll.
Back in the 775 days Intel had some real competition; now they don't.
#11
FordGT90Concept
"I go fast!1!11!1!"
Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of deja vu back to the Pentium 4s and Intel trying to push the clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. the explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(
#12
3870x2
FordGT90Concept: Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of deja vu back to the Pentium 4s and Intel trying to push the clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. the explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(
I know exactly what you mean. It seems like things are stagnant; there are mostly just architectural changes, each marginally better than the last.

Now they have tablets and phones with graphics superior to the Xbox 360 and PS3 (if you're unaware of this, check it out; one good example is the Unreal Engine, but there are a few others).

These devices are much more powerful in their graphics than laptops with IGPs. I will be doing another build this year, probably my final build, and I will keep it intact to remember the days when we could build our own computers.
#13
Frick
Fishfaced Nincompoop
FordGT90Concept: Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of deja vu back to the Pentium 4s and Intel trying to push the clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. the explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(
It's because people don't need more speed. You don't need more speed; you said so yourself. The speed race for consumers ended with the Core 2 Duo. Computers are getting faster, but the applications that actually make use of the speed are getting rarer. This is a good thing, because people don't have to spend tons of cash on computers anymore, and we can focus on better performance-per-watt ratios. Why is this depressing? Did you really expect everyone would have a supercomputer in their home, even if that would be absurdly wasteful?
#14
FordGT90Concept
"I go fast!1!11!1!"
I'm a gamer. It's depressing because gaming technology isn't advancing. With all these idle cores and all this RAM, there should be more than enough hardware to run a pseudo-AI in games, yet it isn't being done. Where is the slew of simulators we saw in the 1990s that pushed the bounds of what is possible? Where are the Far Cry titles that attempt to render insane amounts of foliage? Where are all the physics-based games that Ageia promised? Why did gamers kill great innovative titles like Spore because of DRM? Most of the innovations are coming from indie developers (e.g. Minecraft), but they don't have the resources to take the game design to the next level. Case in point: look at Conan O'Brien's review of Minecraft.

We're inventing ways to make computers dumber and slower (to the point they're virtualized on "clouds"), not smarter and faster (which created tons of optimism up until about 2008). Someone needs to modify AMD's logo and change it to "Gaming Devolved" and stamp it on the entire industry.


A Core i7 today is the equivalent of a supercomputer 20 years ago.
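A hedged back-of-envelope supports that claim: peak throughput is roughly cores × clock × FLOPs per cycle. The sketch below uses illustrative, approximate figures (an Ivy Bridge-era quad core with AVX, and the roughly 60 GFLOPS Linpack score of the top machine on the first TOP500 list in June 1993), not measured benchmarks.

```python
# Back-of-envelope peak-FLOPS estimate; all figures are rough approximations.
def peak_gflops(cores, ghz, flops_per_cycle):
    # Peak = cores x clock (GHz) x FLOPs issued per core per cycle
    return cores * ghz * flops_per_cycle

# Ivy Bridge quad core with AVX: ~8 double-precision FLOPs/core/cycle.
i7 = peak_gflops(cores=4, ghz=3.4, flops_per_cycle=8)
print(f"Core i7 (2012-era): ~{i7:.0f} GFLOPS peak")  # ~109 GFLOPS

# The #1 system on the first TOP500 list (June 1993) ran Linpack at
# roughly 60 GFLOPS, so a single desktop chip is in the same ballpark.
```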
#15
Frick
Fishfaced Nincompoop
FordGT90Concept: We're inventing ways to make computers dumber and slower (to the point they're virtualized on "clouds"), not smarter and faster (which created tons of optimism up until about 2008).
We're making them "slower" because we don't need them to be faster, as you yourself point out. And the cloud and virtualization are being smarter, not dumber.
FordGT90Concept: A Core i7 today is the equivalent of a supercomputer 20 years ago.
It is, but that is not what I meant.

Also Minecraft is huuuggeellly overrated as a game. I'm not sure I want to call it a game. But that is OP.
#16
badtaylorx
Wow... I think you guys are missing something...

Sandy Bridge was THAT GOOD!!!

It's skewing your perceptions.
#17
NinkobEi
Intel is focusing on integrating more video processing power into its processors rather than just creating more powerful processors. I can see the reasoning behind that. Once they get a good architecture, they can trim the fat, shrink it, and slap it into the new iPad 7, making way more money than they could by selling us tech-jerks new hardware.
#18
jihadjoe
If we are going to get faster hardware, the ball is really in the software developers' court to make something that brings current CPUs to their knees. Otherwise, Intel will be more than happy to focus on increasing performance-per-watt over outright performance.
#19
newtekie1
Semi-Retired Folder
FordGT90Concept: Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.
Yep, I'll likely still be running my 875K until at least the end of 2013. I haven't yet found anything it wasn't good enough for, and I don't see anything coming in the next year either.
FordGT90Concept: I'm also getting a strong sense of deja vu back to the Pentium 4s and Intel trying to push the clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. the explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(
Back in the early 2000s, software was outpacing hardware. For a lot of what people wanted to do, a single-core Pentium 4 just wasn't enough anymore; consumers were starting to move into digital photo and movie editing, functions that used to be reserved for high-end multi-processor workstations. Now the reverse seems true: the hardware has outpaced the software, and there isn't really a large consumer demand for faster hardware; the hardware we have, or have had for a while, already does everything pretty fast.
#20
TheinsanegamerN
I don't see why it is so bad if Intel focuses on performance per watt. I, for one, would love to have the Core i5's performance in a 45-watt desktop package.
#21
Frick
Fishfaced Nincompoop
newtekie1: ...isn't really a large consumer demand for faster hardware; the hardware we have, or have had for a while, already does everything pretty fast.
Conroe was the end-all solution. A computer from that time, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002...
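To put rough numbers on that contrast, here is a hedged doubling-rate sketch; both doubling periods are illustrative assumptions (roughly 1.5 to 2 years around 1997-2002, versus a much slower 4 to 5 years for per-core speed after Conroe), not measurements.

```python
# Illustrative doubling-rate arithmetic, not measured data.
def expected_speedup(years, doubling_period_years):
    return 2 ** (years / doubling_period_years)

# 1997 -> 2002, assuming performance doubled every ~1.5-2 years:
print(f"1997->2002: ~{expected_speedup(5, 2.0):.0f}x to ~{expected_speedup(5, 1.5):.0f}x")
# 2006 (Conroe) -> 2012, assuming per-core speed doubles every ~4-5 years:
print(f"2006->2012: ~{expected_speedup(6, 5.0):.1f}x to ~{expected_speedup(6, 4.0):.1f}x")
```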
#22
Prima.Vera
I'm really curious how the new i7-4770 will perform compared to the 3770. If the performance gap is the same as between the 3770 and 2700, I foresee some dark times for Intel...
#23
TRWOV
Frick: Conroe was the end-all solution. A computer from that time, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002...
True that. And if you take into account that 1366x768 is the most popular resolution, you don't even need a gaming-grade graphics card. About a year ago I built a PC for a relative with a Celeron G540, H61 board, 4 GB RAM, and an HD 6670 GDDR5, and he couldn't be happier (he was "gaming" on a P4 3.2C + X800 XT).
#24
ensabrenoir
All so true... well, almost all. The pad era will either fade away or spawn a pad with the power of a laptop/desktop. Evolve or die, guys... There will always be modders and overclockers, just in new formats...
#25
Aquinus
Resident Wat-man
Prima.Vera: I'm really curious how the new i7-4770 will perform compared to the 3770. If the performance gap is the same as between the 3770 and 2700, I foresee some dark times for Intel...
Maybe. I think it really depends on how much less power it uses when running faster, as well as how well AMD does with their processors. I don't think Intel is in store for dark times, though, regardless of how much faster the 4770 is than the 3770.