# Arctic Leaks Bucket List of Socket LGA1150 Processor Model Numbers



## btarunr (Dec 31, 2012)

CPU cooler manufacturer Arctic (aka Arctic Cooling) may have inadvertently leaked a very long list of 4th-generation Intel Core processors for the new LGA1150 socket. Longer than any currently posted list of Core "Haswell" processors, the leak includes model numbers of nine Core i7, seventeen Core i5, five Core i3, and two Pentium models. Among the Core i7 models are the already-known i7-4770K flagship chip, the i7-4770S, and a yet-unknown i7-4765T. The Core i5 list is exhaustive, and it appears that Intel wants to leave no price point unattended. The Core i5-4570K could interest enthusiasts. In comparison, the LGA1150 Core i3 list is surprisingly short, indicating Intel is serious about phasing out dual-core chips. The Pentium LGA1150 list is even shorter. 

The list of LGA1150 processor models appears to have been leaked in the data-sheets of one of Arctic's coolers, in the section that lists compatible processors. LGA1150 appears to have the exact same cooler mount-hole spacing as the LGA1155 and LGA1156 sockets, so upgrading your CPU cooler shouldn't be on your agenda. Intel's 4th-generation Core processor family is based on the company's spanking-new "Haswell" micro-architecture, which promises higher per-core performance and significantly faster integrated graphics than the previous generation. The new chips will be built on Intel's now-mature 22 nm silicon fabrication process, and will begin to roll out in the first half of 2013.






*View at TechPowerUp Main Site*


----------



## Prima.Vera (Dec 31, 2012)

So no more for LGA1155 ??


----------



## btarunr (Dec 31, 2012)

trijesh313 said:


> wow there is 2 unlocked i7



Yes, the i7-4670K is a 4770K with a GT2 IGP.


----------



## Frick (Dec 31, 2012)

Prima.Vera said:


> So no more for LGA1155 ??



Of course not.


----------



## de.das.dude (Dec 31, 2012)

Prima.Vera said:


> So no more for LGA1155 ??



Yeah, Intel usually doesn't change sockets like this. Oh wait, they do.


----------



## Nordic (Dec 31, 2012)

2 CPU generations per socket seems reasonable. I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.


----------



## micropage7 (Dec 31, 2012)

At least they should use the same retention mechanism,
but too bad: new socket = different retention


----------



## Aquinus (Dec 31, 2012)

james888 said:


> I bet Intel could have made Haswell 1155 if they had wanted to.



I'm sure they could have, but I'm willing to bet they wouldn't have been able to implement the VRM control changes if they hadn't switched.


----------



## dj-electric (Dec 31, 2012)

Wow, someone's gonna get their ass kicked hard.


----------



## 3870x2 (Dec 31, 2012)

james888 said:


> 2 CPU generations per socket seems reasonable. I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.



These are not full generations though. Look at 775; that was great. I guess Intel is on the motherboard manufacturers' payroll.


----------



## Zubasa (Dec 31, 2012)

3870x2 said:


> These are not full generations though. Look at 775; that was great. I guess Intel is on the motherboard manufacturers' payroll.


Back in the 775 days Intel had some real competition; now they don't.


----------



## FordGT90Concept (Dec 31, 2012)

Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of deja vu back to the Pentium 4's and Intel trying to push the clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. explosion of tablet and smartphone sales).  Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing.


----------



## 3870x2 (Dec 31, 2012)

FordGT90Concept said:


> Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.
> 
> I'm also getting a strong sense of deja vu back to the Pentium 4's and Intel trying to push the clocks as high as they could reasonably go.
> 
> ...


I know exactly what you mean.  It seems like things are stagnant.  There are mostly just architectural changes that are marginally better than the last.

Now they have tablets and phones with graphics superior to the Xbox 360 and PS3 (if you are unaware of this, check it out.  One good example is the Unreal engine, but there are a few others).

These devices are much more powerful in their graphics than laptops with IGPs. I will be doing another build this year, probably my final build, and I will keep it intact to remember the days when we could build our own computers.


----------



## Frick (Dec 31, 2012)

FordGT90Concept said:


> Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.
> 
> I'm also getting a strong sense of deja vu back to the Pentium 4's and Intel trying to push the clocks as high as they could reasonably go.
> 
> ...



It's because people don't need more speed. You don't need more speed; you said so yourself. The speed race for consumers ended with the Core 2 Duo. Computers are getting faster, but the applications that actually make use of the speed are getting rarer. This is a good thing, because people don't have to spend tons of cash on computers anymore, and we can focus on better performance/watt ratios. Why is this depressing? Did you really expect everyone would have a supercomputer in their home, even if that would be ridiculously wasteful?


----------



## FordGT90Concept (Dec 31, 2012)

I'm a gamer. It's depressing because gaming technology isn't advancing. With all these idle cores and RAM, there should be more than enough hardware to run a pseudo-AI in games, yet it isn't being done. Where is the slew of simulators that we saw in the 1990s, which pushed the bounds of what is possible? Where are the Far Cry titles that attempt to render insane amounts of foliage? Where are all the physics-based games that Ageia promised? Why did gamers kill great innovative titles like Spore because of DRM? Most of the innovations are coming from indie developers (e.g. Minecraft), but they don't have the resources to take the game design to the next level. Case in point: look at Conan O'Brien's review of Minecraft.

We're inventing ways to make computers dumber and slower (to the point they're virtualized on "clouds"), not smarter and faster (which created tons of optimism up until about 2008).  Someone needs to modify AMD's logo and change it to "Gaming Devolved" and stamp it on the entire industry.


A Core i7 today is the equivalent of a supercomputer 20 years ago.


----------



## Frick (Dec 31, 2012)

FordGT90Concept said:


> We're inventing ways to make computers dumber and slower (to the point they're virtualized on "clouds"), not smarter and faster (which created tons of optimism up until about 2008).



We're making them "slower" because we don't need them to be faster. As you yourself point out. And the cloud and virtualization is being smarter, not dumber.



FordGT90Concept said:


> A Core i7 today is the equivalent of a supercomputer 20 years ago.



They are, but that is not what I meant.

Also, Minecraft is huuuggeellly overrated as a game. I'm not sure I want to call it a game. But that is off topic.


----------



## badtaylorx (Dec 31, 2012)

wow....i think you guys are missing something....


Sandy Bridge was THAT GOOD!!!

it's skewing your perceptions


----------



## NinkobEi (Dec 31, 2012)

Intel is focusing on integrating more video processing power into its processors rather than just creating more powerful processors. I can see the reasoning behind that. Once they get a good architecture they can trim the fat, shrink it and slap it into the new iPad7 and make way more money than they could by selling us tech-jerks new hardware.


----------



## jihadjoe (Dec 31, 2012)

If we are going to get faster hardware, the ball really is in the software developers' court to make something that will really bring current CPUs to their knees. Otherwise Intel will be more than happy focusing on performance-per-watt over outright performance.


----------



## newtekie1 (Dec 31, 2012)

FordGT90Concept said:


> Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.



Yep, I'll likely still be running my 875K until at least the end of 2013. I haven't found anything yet that it isn't good enough for, and I don't see anything coming in the next year either.



FordGT90Concept said:


> I'm also getting a strong sense of deja vu back to the Pentium 4's and Intel trying to push the clocks as high as they could reasonably go.
> 
> Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. explosion of tablet and smartphone sales).  Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.
> 
> ...




Back in the early 2000s, software was outpacing hardware. For a lot of what people wanted to do, a single-core Pentium 4 just wasn't enough anymore; consumers were starting to move into digital photo editing and digital movie editing, functions that used to be reserved only for high-end multi-processor workstations. Now the reverse seems true: the hardware has outpaced the software, and there isn't really a large consumer demand for faster hardware; the hardware we have, or have had for a while, already does everything pretty fast.


----------



## TheinsanegamerN (Dec 31, 2012)

I don't see why it is so bad if Intel focuses on performance per watt. I, for one, would love to have the Core i5's performance in a 45-watt desktop package.


----------



## Frick (Dec 31, 2012)

newtekie1 said:


> isn't really a large consumer demand for faster hardware; the hardware we have, or have had for a while, already does everything pretty fast.



Conroe was the end-all solution. A computer from that time, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002..


----------



## Prima.Vera (Dec 31, 2012)

I'm really curious how the new i7 4770 will perform compared to the 3770. If it's the same performance gap as between the 3770 and the 2700, I foresee some dark times for Intel...


----------



## TRWOV (Dec 31, 2012)

Frick said:


> Conroe was the end-all solution. A computer from that time, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002..



True that. And if you take into account that 1366x768 is the most popular resolution, you don't even need a gaming-grade graphics card. About a year ago I built a PC for a relative with a Celeron G540, H61 board, 4GB RAM, and an HD 6670 GDDR5, and he couldn't be happier (he was "gaming" on a P4 3.2C + X800XT).


----------



## ensabrenoir (Dec 31, 2012)

ALL so true......well, almost all. The pad era will either fade away or spawn a pad with the power of a lap/desktop. Evolve or die, guys... There will always be modders and overclockers, just in new formats...


----------



## Aquinus (Dec 31, 2012)

Prima.Vera said:


> I'm really curious how the new i7 4770 will perform compared to the 3770. If it's the same performance gap as between the 3770 and the 2700, I foresee some dark times for Intel...



Maybe. I think it really depends on how much less power it would use when running faster, as well as how well AMD does with their processors. I don't think Intel is in store for dark times, though, regardless of how much faster the 4770 is over the 3770.


----------



## 3870x2 (Dec 31, 2012)

Prima.Vera said:


> I'm really curious how the new i7 4770 will perform compared to the 3770. If it's the same performance gap as between the 3770 and the 2700, I foresee some dark times for Intel...



Marginally better. Intel releases are like Call of Duty titles now.


----------



## ensabrenoir (Dec 31, 2012)

Aquinus said:


> Maybe. I think it really depends on how much less power it would use when running faster, as well as how well AMD does with their processors. *I don't think Intel is in store for dark times, though, regardless of how much faster the 4770 is over the 3770.*




Honestly, I think even Intel sees the writing on the wall.... times are a-changing... who wants to be the best vinyl record, cassette tape & CD maker in an iPod world? I think we're gonna see a new focus from Intel.


----------



## 3870x2 (Dec 31, 2012)

TRWOV said:


> True that. And if you take into account that 1366x768 is the most popular resolution, you don't even need a gaming-grade graphics card. About a year ago I built a PC for a relative with a Celeron G540, H61 board, 4GB RAM, and an HD 6670 GDDR5, and he couldn't be happier (he was "gaming" on a P4 3.2C + X800XT).



The X800XT is the highest of the high end, he must have been doing great.


----------



## jagd (Dec 31, 2012)

There are two things you need a more powerful/faster CPU for: gaming and video editing (for a regular home user, of course). Nothing else that needs better comes to mind; any CPU on the market will handle everything outside gaming/video editing for the usual person, I guess (though I probably missed some software with high demands).




FordGT90Concept said:


> Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.


----------



## ensabrenoir (Dec 31, 2012)

the debate continues http://hexus.net/qabqtn and a little more on Haswell http://www.legitreviews.com/news/14784/


----------



## Jstn7477 (Dec 31, 2012)

Although LGA 1150 is coming a little "too soon," I'm actually happy that Intel released Ivy Bridge processors. I hate to sound like some kind of eco friendly nutcase, but compare the power consumption of a 45nm i7 processor with that of a 3770K and it is a huge difference. I haven't gotten my 3770K @ 4.3GHz/1.175v to consume more than 80 watts of power in IntelBurnTest (~60w in crunching, measured with HWMonitor reading the digital VRM interface) when my 2600K at 4.3GHz/1.3v took 130w in IntelBurnTest and 80w crunching. My i7-870 seems to dump out even more heat and is slower, so the advances in the process nodes are more significant than the actual performance increases. 

Intel in the past few years has essentially done what the automotive industry has done in recent years: increase efficiency. You wouldn't use an old Chevy 454 big block these days over a Vortec 5300 or a brand new V6, would you? (unless you enjoy getting 8 MPG)


----------



## newtekie1 (Dec 31, 2012)

jagd said:


> There are two things you need a more powerful/faster CPU for: gaming and video editing (for a regular home user, of course). Nothing else that needs better comes to mind; any CPU on the market will handle everything outside gaming/video editing for the usual person, I guess (though I probably missed some software with high demands).



I disagree; for the average consumer, gaming and video editing do not require a faster CPU than his or my several-generations-old i7. Gaming is basically all GPU-limited at this point, the CPU makes little difference, and most consumer-level video editors run pretty well on any decent quad-core. The only thing a faster CPU helps with is render times, and even then the difference between the two is maybe a minute to render a decent-length video.


----------



## Steevo (Dec 31, 2012)

The hardware side is there; we are waiting on the software side to catch up, and on the next set of instructions that will enable the next major breakthroughs in processing. Look at what the extra instruction sets did, and how much of them is actually in use versus plain basic operation. Software companies are looking to cover as many systems as possible and to remain compatible with consoles and other devices; once we get past these issues I would imagine the whole need for speed will become more of a need for optimization.


----------



## Frick (Dec 31, 2012)

ensabrenoir said:


> Honestly, I think even Intel sees the writing on the wall.... times are a-changing... who wants to be the best vinyl record, cassette tape & CD maker in an iPod world? I think we're gonna see a new focus from Intel.



This is off topic, but audio is a very, VERY bad example, as it's a different beast altogether. The best of those are titanically expensive, and titanically good despite being "old". If you get a chance, listen to a good setup with a Linn LP12 and be prepared to be blown away.

When talking about computers it's often best to talk about computers. Don't bring in other stuff that has nothing to do with them.


----------



## ensabrenoir (Dec 31, 2012)

Frick said:


> This is off topic, but audio is a very, VERY bad example, as it's a different beast altogether. The best of those are titanically expensive, and titanically good despite being "old". If you get a chance, listen to a good setup with a Linn LP12 and be prepared to be blown away.
> 
> When talking about computers it's often best to talk about computers. Don't bring in other stuff that has *nothing* to do with them.



1. You miss the point 

2. Seriously? It's just an analogy.


----------



## EarthDog (Dec 31, 2012)

Frick said:


> Of course not.


Of course. There may be some low-end chips that sneak out; otherwise, it's toast. 



de.das.dude said:


> Yeah, Intel usually doesn't change sockets like this. Oh wait, they do.


Yeah, 2 gens just aren't enough these days... 



james888 said:


> 2 CPU generations per socket seems reasonable. I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.


+1


----------



## Cortex (Dec 31, 2012)

Frick said:


> We're making them "slower" because we don't need them to be faster. As you yourself point out. And the cloud and virtualization is being smarter, not dumber.



Think about this when you need to render something (video or raytracing), or play Crysis 3...

And virtualization is a violation of the user's privacy, so it is getting dumber from the user's point of view. It just means that hardware and software are owned by a corporation (which could be evil, what do you know; if you ask me, even the USA government is "evil", most governments are), and information is controlled by them.

Edit: I have the logo of maybe the largest cloud-computing vendor as my avatar. Oops.


----------



## Frick (Dec 31, 2012)

ensabrenoir said:


> 1. You miss the point
> 
> 2. Seriously? It's just an analogy.



Maybe so, but it was a bad analogy. A lot of analogies are stupid.



Cortex said:


> Think about this when you need to render something (video or raytracing), or play Crysis 3...
> 
> And virtualization is a violation of the user's privacy, so it is getting dumber from the user's point of view. It just means that hardware and software are owned by a corporation (which could be evil, what do you know; if you ask me, even the USA government is "evil", most governments are), and information is controlled by them.



How many people render stuff, and do you need anything more than an i5 to play Crysis 3 properly? I have no idea, but I coldly assume it'll play just fine on an i3 or anything from AMD. Power users and people who need workstations are a different lot from most end consumers, which is what I was talking about.

And you're confusing virtualization with cloud stuff. And the rest is good ol' American paranoia which is kinda silly imo.


----------



## Delta6326 (Dec 31, 2012)

I can't wait for these to come out! I can finally upgrade from my old Core 2 Quad! Now I just need to buckle down, study, pass my test, and get a job.

I will gladly take a i7 4770k and AMD 8970/Nvidia 780...


----------



## ensabrenoir (Dec 31, 2012)

Frick said:


> Maybe so, but it was a bad analogy. A lot of analogies are stupid.
> 
> 
> 
> ...



You think quite highly of yourself... don't you.... thankfully not everyone and/or everything conforms to your realm of reasoning... but that's what makes us all special..... carry on!


----------



## EarthDog (Dec 31, 2012)

I understood exactly what the analogy was... LOL!


----------



## Jurassic1024 (Dec 31, 2012)

FordGT90Concept said:


> Why is all I think about while looking at this is that my first generation i7 920 is more than satisfactory?  Probably because it is.
> 
> I'm also getting a strong sense of deja vu back to the Pentium 4's and Intel trying to push the clocks as high as they could reasonably go.
> 
> ...




None of that is actually true. In fact, reading it again, it sounds a LOT like you're describing AMD's CPU division....

Higher clock speeds year after year? Show me. Oh right, you can't, because [Haswell] clocks are the same as Ivy Bridge's.

Computers getting slower? Can you say GPU acceleration, impressive GPU drivers from AMD this year, boost clocks from both NVIDIA and AMD (+ new SKUs), Hyper-Threading, super-cheap RAM, and <$1/GB SSDs?

Stay off the crack and out of the comment sections.


----------



## Frick (Dec 31, 2012)

ensabrenoir said:


> You think quite highly of yourself... don't you.... thankfully not everyone and/or everything conforms to your realm of reasoning... but that's what makes us all special..... carry on!



So what is wrong with my reasoning? Seriously, if there is anything actually wrong with it I'll correct it. For realz.

And I still maintain it was a bad analogy, and that most of them are bad.


----------



## EarthDog (Dec 31, 2012)

LOL, what makes a bad analogy bad is that it's not understood and isn't relevant. At least for me, I could easily pick up on it.


----------



## FordGT90Concept (Jan 1, 2013)

Jurassic1024 said:


> Higher clock speeds year after year? Show me. Oh right, you can't, because [Haswell] clocks are the same as Ivy Bridge's.


Core i7 980X = 3.33 GHz
Core i7 3970X = 3.5 GHz

There are no 6-core Ivy Bridge-based processors out yet.




Jurassic1024 said:


> Computers getting slower? Can you say GPU acceleration, impressive GPU drivers from AMD this year, boost clocks from both NVIDIA and AMD (+ new SKUs), Hyper-Threading, super-cheap RAM, and <$1/GB SSDs?


The average of all computing devices is getting slower. Desktops are losing market share as weak ultrabooks, tablets, and phones, more suited to playing Angry Birds than Crysis, take over.


----------



## Major_A (Jan 1, 2013)

FordGT90Concept said:


> I'm a gamer.  It's depressing because gaming technology isn't advancing.  With all these idle cores and RAM



That's no lie. Very few games take advantage of quad-core CPUs, and even fewer run 64-bit natively to use more than a couple of gigs of RAM. I remember the rumor of the original Athlon FX working on reverse hyper-threading. I still wish that would come to fruition. I'd rather run a game on a 16 GHz single-core machine than have the game barely tax 2 of my 4 cores.


----------



## FordGT90Concept (Jan 1, 2013)

Reverse hyper-threading was the rumor for Bulldozer. When it was found to be false, enthusiasm for it plummeted.


----------



## HumanSmoke (Jan 1, 2013)

3870x2 said:


> james888 said:
> 
> 
> > 2 CPU generations per socket seems reasonable. I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.
> ...


I wouldn't say it's an Intel-only trait. FM1 lasted how many generations?

I also seem to remember that mobo makers sold a reasonable number of 990FX/X boards leading up to Bulldozer's launch, largely off the back of some AMD guerrilla marketing. What huge advance do the 900 chipsets offer that the 800s don't?


----------



## chinmi (Jan 1, 2013)

Finally!! Time to upgrade from my old 1155 i5 750!!!

Probably gonna get that unlocked 4770 version

- or -

Should I wait for the next-generation processor & socket (after 1150)??


----------



## jihadjoe (Jan 1, 2013)

Frick said:


> Conroe was the end-all solution. A computer from that time, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002..



Very true. I'm still on my B3 Q6600 from 2007. I've upgraded to an SSD, added some RAM, and changed the GPU to a modern one, but after a mild OC to 3GHz I've never felt it to be lacking on the CPU side.

Intel's problem really is that they've built something so good it's hard to offer a truly tangible upgrade on the performance side, since most games very quickly become GPU-limited anyway. That's probably one of the reasons they've been focusing on improving power efficiency instead.

Until the next big game (or other program) comes out and brings the best CPUs to their knees, there's no compelling reason to push for outright performance rather than the far more reasonable performance-per-watt.


----------



## Prima.Vera (Jan 1, 2013)

I know what you mean. I am still on a Q9650 and I really cannot justify buying a new CPU, mobo and RAM... For what, I mean?


----------



## Aquinus (Jan 1, 2013)

Prima.Vera said:


> I know what you mean. I am still on a Q9650 and I really cannot justify buying a new CPU, mobo and RAM... For what, I mean?



I upgraded from my Phenom II 940 because I was hitting a bottleneck with memory speeds, and the only way to fix that was to replace the entire platform (considering both the motherboard and CPU were AM2+, with support for only DDR2). Upgrading eliminated that bottleneck (and provided me 8 DIMM slots for future upgrades), and I don't see my 3820 being inadequate any time soon.

It's a consideration, but if your rig is working fine for your purposes there is no reason to upgrade. You're right.



chinmi said:


> Should I wait for the next-generation processor & socket (after 1150)??



Is it slowing you down? Prima.Vera has a good point: if you're not fully taxing your platform, there is no need to upgrade it.


----------



## Wile E (Jan 1, 2013)

Frick said:


> Maybe so, but it was a bad analogy. A lot of analogies are stupid.
> 
> 
> 
> ...



I love vinyl as an end user, but his analogy is right. I wouldn't want to make them right now. It's a niche market, much like high-end workstations and desktops. It would be a hard field to enter without a lot of resources at your disposal.

And cloud is inferior not because of privacy, but because of uncertainty and security. What if the service goes down? What if there's an error on your account? What if it's hacked? Things of that nature. They are all legitimate concerns. Granted, there are good things about it, but it's far from perfect.

As far as these Haswell chips go, I'll likely stick with my 980X for the foreseeable future, at least until I get a larger performance increase from upgrading. That will likely mean an unlocked 8-core or better.


----------



## WhiteLotus (Jan 1, 2013)

I am still kicking myself for selling my Q9600 for an i3-230 because I wanted to scale back and save some money. To this day I still don't know why the reasoning centre of my brain shut down and made me think that was a good idea. I should have just stuck with it. I will get the top offerings of the last generation and then call it a day, when I eventually save enough money to finish a complete overhaul of my system.


----------



## Dimi (Jan 1, 2013)

I think what people forget is that nowadays, who doesn't have a 1080p recording camera or smartphone? My current i7 920 is painfully slow at converting videos compared to a 3570K or 3770K using Intel Quick Sync technology. Try converting a DVD or Blu-ray to iPad format.


----------



## Frick (Jan 1, 2013)

Wile E said:


> I love vinyl as an end user, but his analogy is right. I wouldn't want to make them right now. It's a niche market, much like high-end workstations and desktops. It would be a hard field to enter without a lot of resources at your disposal.
> 
> And cloud is inferior not because of privacy, but because of uncertainty and security. What if the service goes down? What if there's an error on your account? What if it's hacked? Things of that nature. They are all legitimate concerns. Granted, there are good things about it, but it's far from perfect.



Yeah, when I thought about it some more, it's quite a good analogy actually, as you say: niche markets, etc. I didn't think enough about it.

And it's not perfect, but it's far from as horrible as some people make it out to be. It's all about one's needs.

Edit: also, I apologize if I came across as bitchy. I can't remember now what I wrote, and I'm on the phone, but I have a feeling I was bitchy, and if so I apologize.

@aquinus: as P.M says below, what did you do to have a bottleneck specifically in the memory?


----------



## Prima.Vera (Jan 1, 2013)

Aquinus said:


> I upgraded from my Phenom II 940 because I was hitting a bottleneck with memory speeds, and the only way to fix that was to replace the entire platform (considering both the motherboard and CPU were AM2+, with support for only DDR2). Upgrading eliminated that bottleneck (and provided me 8 DIMM slots for future upgrades), and I don't see my 3820 being inadequate any time soon.
> ...



That's interesting. What apps are you using that cap your DDR2? Usually I thought the CPU or GPU is the main problem. I have DDR2 OC'd at 1200 MHz and still got better scores than DDR3 @ 1600 MHz, for example.


----------



## Aquinus (Jan 1, 2013)

Prima.Vera said:


> That's interesting. What apps are you using that cap your DDR2?



Believe it or not, Starcraft 2 was the worst offender. Frame rates dropped by large amounts while there was a ton of CPU and GPU power handy, and in certain scenes you could watch both GPU and CPU usage drop along with the frame rate. I started adjusting the CPU/NB speed and it was making a difference. I was running DDR2-800 @ 5-5-5-15 at the time, but it wouldn't run much faster than 950 MHz if I wanted to keep it stable. So it was either the L3 cache or the DRAM. Either way, since I upgraded to the 3820, even with CFX disabled, it's a night-and-day difference as more and more units come onto the map.

As for the other software, it's hard to say if it was the DRAM, because I only tested it a little, and making the switch from the 940 to the 3820 made everything run faster in general. It was just a good upgrade, sans the power consumption, but that doesn't bother me since the 940 could eat just as much and give me less. 

Not to say that the 940 was a bad chip; it was a very capable rig. I had it for years and it was due to be upgraded. Plus, SB-E handles virtualization very well. I can't complain.

Also, I have my machine crunching right now, and my rig itself (no monitors) isn't eating more than 400 watts (385 watts to be exact, with both GPUs loaded and only the CPU overclocked). Overclocking the 6870s can push that usage up towards 500 watts, and in some cases up to 550 watts.


----------

