
Ivy Bridge Quad-Core to Have 77W TDP, Intel Plans for LGA1155 Ivy Bridge Entry

I guess that they have to at least compete with themselves if they want to sell something, and for that they probably need to soundly beat their previous CPUs. Bear in mind how long it's been since the first quad cores were released, and those are enough for most people. Even the Core i7 has been around long enough. Enthusiasts will upgrade for a small improvement, but the majority of people will not upgrade unless there's a significant advancement*, and the market is pretty much saturated. Intel has already seen that the market is slowing down; not all quarters have been as good as they expected despite not having a real competitor, so they need to keep moving or simply watch their market shrink and eventually die of starvation (Intel is a big company that needs high revenues in order to survive).

* I know many people who only do office work and still use single cores like the P4 or Athlon XP, and heck, I still have a 486 that works (if only it were useful). Obviously those are not usable by today's standards, or not desirable (P4/Athlon XP), but a quad-core CPU will always be enough for "home use" unless tasks like web surfing and office work are deliberately made a lot harder to run, which would stink even to the illiterate IMO. So people will hold on to those for as long as the hardware lives, or until there's a new CPU that brings a massive performance advantage for very little money. Intel can't afford to sell you one CPU every 10 years; they need you to buy their CPUs every 3-4 years minimum.

Excellent points. :toast:
 
I remember 775. I also remember having to buy a new mobo every time I upgraded to a newer chip, regardless of it having the same socket.
I recently got an ASRock Z68 Extreme4 Gen3, but I bet that despite it supporting PCIe 3.0, there's still some reason we'll all need a new mobo when Ivy Bridge releases.

You got unlucky. I used the same motherboard for Cedar Mill, Conroe and Wolfdale and never felt that I was missing out on anything. I can't actually think of a 775 chipset that couldn't support Wolfdale through a BIOS flash. nForce 5, maybe?
 
Sure, both a car and a motorcycle can take you places, it's just that each is fundamentally incapable of certain tasks the other is capable of, and neither can replace the other.

That's not to say that ARM and x86 will never cross paths. Intel is struggling to miniaturize x86 to maintain performance-per-watt levels comparable to ARM, while ARM designers like NVIDIA dream of one day kicking the x86 processor out of the PC.

Can't you see all the things that have literally migrated over to ARM platforms? See the incredible evolution since the iPhone was released?

I don't need x86 to do any of these things anymore.

-Email
-Web surfing
-Games (think Angry Birds, which is now much more popular on ARM than on PC)
-Videoconferencing via Skype
-Productivity software (still in its infancy)

Hell, if I could walk up to a keyboard and monitor and wirelessly sync my phone with them, I don't think I would need a PC (once MS gets their ass in gear and makes a proper ARM port of Office). The only problem right now with smartphones and tablets is that the interface is sometimes not as good as the PC counterparts, especially for productivity tasks. Optional sync solves this completely.

Sure, some tasks require serious horsepower, but not many, and rarely does the average person need it. Also, cloud processing will lend a helping hand in this respect when the time comes. That is the heavy lifting that Intel will be doing.

Anyway, it's crystal clear to me. I'm surprised others don't see it.
 
You got unlucky. I used the same motherboard for Cedar Mill, Conroe and Wolfdale and never felt that I was missing out on anything. I can't actually think of a 775 chipset that couldn't support Wolfdale through a BIOS flash. nForce 5, maybe?

I had nForce 6, which couldn't overclock 65nm quads or any 45nm chips for shit, and an old 975 board that didn't like either.
I also got a P45 to replace the P35 I had, for PCIe 2.0; admittedly that one wasn't necessary.
 
^ Even Microsoft is now into the ARM game.

Windows 8 will run on ARM.

Honestly, what part of this isn't a direct, immediate and future threat to Intel and x86?
 
Intel's threat isn't AMD.

Intel's threat is ARM, and they need to move fast on die shrinks and aggressive power saving to meet this threat. It's coming.

Intel's threat is also Vision, where AMD is doing relatively well.
 
Why impressed? Remember LGA 775?

Yep, I remember; my Pentium 4 board (Socket 775) wasn't compatible with a Pentium D, and a Pentium D board wasn't compatible with Core 2 Duo. They were changing the chipset instead of the socket.

But anyway, Ivy looks really good. I think we could expect an 8-core desktop part as well...
 
When ARM finally meets the requirements of a modern workstation (executing complex JavaScript-powered webpages/webapps; running games; encoding/editing videos, etc.; running VMs/virtualization), it will be just as heavy/"power inefficient" as x86. And everybody will choose x86 because of the tons of legacy software it supports.
 
Can't you see all the things that have literally migrated over to ARM platforms? See the incredible evolution since the iPhone was released?

I don't need x86 to do any of these things anymore.

-Email
-Web surfing
-Games (think Angry Birds, which is now much more popular on ARM than on PC)
-Videoconferencing via Skype
-Productivity software (still in its infancy)

Hell, if I could walk up to a keyboard and monitor and wirelessly sync my phone with them, I don't think I would need a PC (once MS gets their ass in gear and makes a proper ARM port of Office). The only problem right now with smartphones and tablets is that the interface is sometimes not as good as the PC counterparts, especially for productivity tasks. Optional sync solves this completely.

Sure, some tasks require serious horsepower, but not many, and rarely does the average person need it. Also, cloud processing will lend a helping hand in this respect when the time comes. That is the heavy lifting that Intel will be doing.

Anyway, it's crystal clear to me. I'm surprised others don't see it.

None of those things have 'literally migrated' to mobile devices at all. Sure, they can do them, but every single one is substandard compared to doing them on a PC or Mac.
 
When ARM finally meets the requirements of a modern workstation (executing complex JavaScript-powered webpages/webapps; running games; encoding/editing videos, etc.; running VMs/virtualization), it will be just as heavy/"power inefficient" as x86. And everybody will choose x86 because of the tons of legacy software it supports.

Those ARM processors are doubling in cores with pretty much the same power efficiency and increasing in GHz.

Legacy software support can only carry things for so long, and only if Intel and the market keep progressing. The idea is that everything eventually gets too old, and sometimes it's replaced with something completely different. (Take, for instance, Steve Jobs and everything he changed drastically.)

At the rate ARM processors are going, they're going to be very fast, fast enough to run Java, encode videos, and handle VMs.
What makes you think they will progress towards heavy, power-hogging designs when EVERYTHING in their trend is to stay small, no matter what?

ARM processors are going to keep their market; they have to, because that's what they compete best in. They're going to grow enough that they can finally address other markets with their own versions of software and compatibility.

It's like saying ARM will never progress, because the only thing ARM processors are not doing right now is everything you mentioned above. Believe it when you see ARM processors handling Java, VMs, encoding and heavy gaming.

Because that's all they have left to start progressing towards.

And if anything, there will be groundbreaking new software that does exactly everything you said above; for ARM processors, that's all they have to look forward to. Nothing's necessarily stopping them.
 
How do we know if our motherboards support
"The motherboards feature the ME8L UEFI update. For this:
o Your motherboard must currently feature UEFI firmware
o It should support the ME8L update process at the physical level, where the EEPROM is sufficiently large"?

I will be buying the MSI Z68A-GD65 G3

You can find out if the board you're about to buy features UEFI by looking up reviews. As for the "sufficiently large" EEPROM, we need to find out more about what counts as "sufficiently large" and what else goes into making a UEFI-driven motherboard capable of the ME8Legacy update. One thing that's clear is that upgrading to that Ivy Bridge-compatible firmware isn't looking to be as easy as flashing your board. Boards with legacy BIOS are technically already locked out of Ivy Bridge, unless Gigabyte's BIOS engineers pull off some hack like they did with HybridEFI (which isn't actually EFI, but legacy BIOS with an address space tweak that lets it boot from large volumes).
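
If you happen to be on Linux already, here's a rough sketch (my own, not from the news post) of how to get a hint on both questions: whether the machine actually booted through UEFI, and how large the firmware ROM is, via dmidecode. Note that a UEFI-capable board booted in legacy/CSM mode won't show the efi directory, so treat this as a hint rather than proof, and we still don't know what size counts as "sufficiently large".

# Hedged sketch: check for a UEFI boot and the firmware ROM size on Linux.
# Requires root for dmidecode; /sys/firmware/efi only exists when the OS was
# started by UEFI firmware (a CSM/legacy boot will look like plain BIOS).
import os
import subprocess

def booted_via_uefi() -> bool:
    return os.path.isdir("/sys/firmware/efi")

def firmware_rom_size() -> str:
    # DMI type 0 is the "BIOS Information" block; its "ROM Size" field is the
    # flash/EEPROM capacity, presumably what "sufficiently large" refers to.
    out = subprocess.run(["dmidecode", "-t", "bios"],
                         capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "ROM Size" in line:
            return line.split(":", 1)[1].strip()
    return "unknown"

print("UEFI boot:", booted_via_uefi())
print("Firmware ROM size:", firmware_rom_size())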
 
You can find out if the board you're about to buy features UEFI by looking up reviews. As for the "sufficiently large" EEPROM, we need to find out more about what counts as "sufficiently large" and what else goes into making a UEFI-driven motherboard capable of the ME8Legacy update. One thing that's clear is that upgrading to that Ivy Bridge-compatible firmware isn't looking to be as easy as flashing your board. Boards with legacy BIOS are technically already locked out of Ivy Bridge, unless Gigabyte's BIOS engineers pull off some hack like they did with HybridEFI (which isn't actually EFI, but legacy BIOS with an address space tweak that lets it boot from large volumes).

I'll be keeping my PCIe 3.0 samples to check out how it works when an updated BIOS and these CPUs are released. I've made commitments to both MSI and Gigabyte to cover the issue when devices are available in retail.

Because most Gigabyte boards support dual BIOS, I could see them flashing to the backup BIOS first, and then an updated EFI firmware might be possible.

And you are right, it's nothing more than a "tweaked drive controller BIOS" currently; I do not think "address space tweak" covers how much of a hack it really is.
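
For anyone wondering why legacy BIOS needs a hack at all for big drives, the underlying limit is simple arithmetic: the MBR scheme stores sector addresses as 32-bit numbers, and with the traditional 512-byte sector that caps a bootable volume at roughly 2.2 TB. A quick back-of-the-envelope check in Python, just to illustrate the limit such tweaks work around:

# MBR uses 32-bit LBA sector addresses; with 512-byte sectors that limits an
# addressable (and therefore bootable) volume to about 2.2 TB.
max_sectors = 2 ** 32
sector_size = 512
limit_bytes = max_sectors * sector_size
print(limit_bytes)                 # 2199023255552
print(limit_bytes / 1e12, "TB")    # ~2.199 TB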
 
Can't you see all the things that have literally migrated over to ARM platforms? See the incredible evolution since the iPhone was released?

I don't need x86 to do any of these things anymore.

-Email
-Web surfing
-Games (think Angry Birds, which is now much more popular on ARM than on PC)
-Videoconferencing via Skype
-Productivity software (still in its infancy)

Hell, if I could walk up to a keyboard and monitor and wirelessly sync my phone with them, I don't think I would need a PC (once MS gets their ass in gear and makes a proper ARM port of Office). The only problem right now with smartphones and tablets is that the interface is sometimes not as good as the PC counterparts, especially for productivity tasks. Optional sync solves this completely.

Sure, some tasks require serious horsepower, but not many, and rarely does the average person need it. Also, cloud processing will lend a helping hand in this respect when the time comes. That is the heavy lifting that Intel will be doing.

Anyway, it's crystal clear to me. I'm surprised others don't see it.

What makes the most money out of us consumers? Games. The gaming industry is huge. No matter how powerful smartphones become, playing Fallout 5 or COD MW17 on a 3.5" screen still sucks balls.
Consoles and PCs will continue to evolve as long as the desire for immersive escapism is there. The cloud will address some things, but latency issues and downtime still make it a no-fun option.
Productivity tasks, again, are tiresome on a small-screen form factor. Sometimes you need a large 15" screen to get things done.
Miniaturisation of the form factor is all fine and dandy, but it entirely depends on the needs of the consumer.

I'm with BTA on this one. They are two different avenues that run in parallel. And please note that as devices become more powerful, battery life is not progressing as much, so devices still do not give the operational run time of a mains-powered 10 kg leviathan (or 17 kg, as my fat desktop is!).

No, I like my bike for zipping about, but when I need to take the kids and dog to the beach, I'll use my 4x4.
 
What makes the most money out of us consumers? Games. The gaming industry is huge. No matter how powerful smartphones become, playing Fallout 5 or COD MW17 on a 3.5" screen still sucks balls.
Consoles and PCs will continue to evolve as long as the desire for immersive escapism is there. The cloud will address some things, but latency issues and downtime still make it a no-fun option.
Productivity tasks, again, are tiresome on a small-screen form factor. Sometimes you need a large 15" screen to get things done.
Miniaturisation of the form factor is all fine and dandy, but it entirely depends on the needs of the consumer.

I'm with BTA on this one. They are two different avenues that run in parallel. And please note that as devices become more powerful, battery life is not progressing as much, so devices still do not give the operational run time of a mains-powered 10 kg leviathan (or 17 kg, as my fat desktop is!).

No, I like my bike for zipping about, but when I need to take the kids and dog to the beach, I'll use my 4x4.

If ARM processors get powerful enough to run games like you said, what makes you think they're going to play them on a 3.5-inch screen...

What's stopping those engineers from creating a docking station, or a wired connection to a monitor, if it's that fast?

See, what you guys are forgetting is that ARM processors have no limitations; what's stopping one from communicating with a big monitor via any type of connection?

There are no limitations to ARM processors.

I could see, in the future, devices so powerful that it is literally a computer in your pocket, and if the user wanted more capability he could connect that device by dock/wire/wireless to a bigger panel...

Imagine the portable platform: when you do not need the big platform, you can remove the device and take it on the go like a PSP, something so intertwined with you that it competes in multiple markets.

Nothing is stopping ARM processors and their continued innovation towards being faster, more portable, and more reliable for the consumer; it's wide open...

**EDIT** With battery life there are limitations on how fast that progresses, but with all this hogging of non-renewable resources, we're eventually going to have to really rely on batteries.
Eventually there will be enough of a movement and innovation that we will master the art of saving electricity, because if we don't,
we're going to be in the dark ages very, very soon.
 
Good stuff, Intel, finally thinking and not just spam-producing a confusing mix of parts.

Still, I would reduce this to 3 levels (high - mid - low), with at most 3 or 4 CPUs per level.
12 CPU models instead of 128 should cut down on the time and money wasted, and on Intel running around like the headless chicken it used to be.

Laptops should have 3 or 4 CPUs in total at most; embedded/Atom, 4 products at most.
Laptop and low power can be merged, as they share the same concepts.

It would be ideal if desktop and server could be the same, where the server part is the cherry on the cake, with just extra cache or some extra features on top of the same base desktop CPU model.
These could then share the same sockets, and those with the cash could use server CPUs in their desktops on high-end motherboards that allow it.
 
I could see, in the future, devices so powerful that it is literally a computer in your pocket, and if the user wanted more capability he could connect that device by dock/wire/wireless to a bigger panel...

This will happen, but I doubt it will be anytime soon. Have a look at desktop PC processors/mobos/VGAs: sizes haven't changed at all in the last 15 years. Those components maybe even got beefier; customers demand more and more power, and chipmakers can't just shrink their chips, they have to shrink them AND double/triple the transistor count :) . Something really crazy must happen if we want sizes to go down on mainstream PCs/consoles/laptops: worldwide cloud adoption, some new crazy tech that goes beyond 1 nm (NOW!), etc.
 
A computer in your pocket in the future? That's the way it's always been. My smartphone is about as powerful as a 10 year old PC.
 
I doubt smartphones and tablets will pwn the PC. Aside from the many hardware limitations, the OSes and frameworks running them just can't stand up to those of the PC. Sure, you can pop a few green pigs on your iPhone, but have you ever thought about what this game was programmed on? Yep, a PC!

What makes the most money out of us consumers? Games. The gaming industry is huge. No matter how powerful smartphones become, playing Fallout 5 or COD MW17 on a 3.5" screen still sucks balls.

I see what you did there ;)
 
The office computer is still running a Socket 478 Pentium 4 2.66 GHz Northwood (130 nm)! :rockout: Still runs great on XP. And my gaming comp's LGA 775 Q6600 is still rocking! But I might upgrade to Ivy Bridge and turn my current comp into my office comp.
 
My guess is that the TDP is 7 W for the GPU part and 70 W for the CPU part.
 
My guess is that the TDP is 7 W for the GPU part and 70 W for the CPU part.

You're forgetting the GPU has always been one process behind on Intel CPUs; it may very well be the traditional 65 W CPU with a 12 W GPU.
 
If ARM processors get powerful enough to run games like you said, what makes you think they're going to play them on a 3.5-inch screen...

Why, is that a 17" TouchPad I see sticking out of your pocket? :roll:

ARM processors still have a long way to go before they can compete with the likes of even a low-end Sandy Bridge processor.
 
What makes the most money out of us consumers? Games. The gaming industry is huge.

That's not true at all. If it were true, Intel wouldn't have 60% of GPU shipments. And half (or more) of the remaining market share wouldn't be integrated graphics from both AMD and Nvidia. And the rest wouldn't be primarily dominated by low-end GPUs not suitable for gaming.

Laptops and nettops wouldn't outsell desktops, etc.

If ARM can grab the entire non-gaming, non-enthusiast, non-workstation market, Intel could still easily lose 80% of their consumer market. Sure, that's not going to happen anytime soon, but even a 15% loss would completely change the landscape and would force them to cut some corners in the company. Add the fact that they also compete with their own previous generations*, and Intel could be facing some real challenges in the near future.

* Like I said, any quad from the past 4 years is enough for 95% of people. Also, most people expect their PCs to last more than 5 years. The people who are buying PCs right now will not be willing to buy anything for the next 5-8 years, because they truly don't need it. If by 2015 ARM can put out octo+ cores that are out-of-order and can reach 4 GHz, that will be a good enough upgrade for them. The Cortex-A15 is already OoO superscalar and is expected to come in at 2.5 GHz. So what I said above is more than doable; hence it would all become a matter of price, wattage, etc., and there ARM is a much better contender. Not that Intel could not compete in that situation, but it would need to become a much "slimmer" company in order to be sustainable selling $10-$50 chips instead of the current ones.
 
If this is true, that's pretty damned amazing.
 
You're forgetting the GPU has always been one process behind on Intel CPUs; it may very well be the traditional 65 W CPU with a 12 W GPU.

Yeah, that makes more sense; plus, they are supposed to buff up the GPU from 12 to 16 EUs.
 