# Intel to Debut its Core "Skylake" Processors at Gamescom 2015



## btarunr (Jun 19, 2015)

Intel is expected to debut its 6th generation Core "Skylake" desktop processor family at Gamescom 2015, which will be held in Cologne, Germany, between 5th and 9th August. PC enthusiasts should look forward to two parts in particular: the Core i7-6700K and the Core i5-6600K. The two quad-core chips will be built in the LGA1151 package, compatible with upcoming motherboards based on Intel's 100-series chipsets. Motherboards based on the Intel Z170 Express chipset will allow CPU overclocking. The integrated memory controller of "Skylake" CPUs supports both DDR3 and DDR4 memory standards, and should prove to be a transition point between the two. 
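For a rough sense of what the DDR3-to-DDR4 transition buys, peak bandwidth per memory channel scales with the transfer rate. A back-of-the-envelope sketch (the speed grades below are typical values for each standard, not confirmed Skylake specifications):

```python
# Back-of-the-envelope peak bandwidth per memory channel:
# transfer rate (MT/s) x bus width (64 bits = 8 bytes).
def peak_bandwidth_gbs(transfers_mt_s: int, bus_bytes: int = 8) -> float:
    """Theoretical peak bandwidth of one memory channel in GB/s."""
    return transfers_mt_s * bus_bytes / 1000

ddr3_1600 = peak_bandwidth_gbs(1600)   # a common DDR3 speed grade
ddr4_2133 = peak_bandwidth_gbs(2133)   # a typical DDR4 launch speed grade

print(f"DDR3-1600: {ddr3_1600:.1f} GB/s per channel")  # 12.8 GB/s
print(f"DDR4-2133: {ddr4_2133:.1f} GB/s per channel")  # 17.1 GB/s
print(f"Uplift: {ddr4_2133 / ddr3_1600 - 1:.0%}")      # 33%
```

Real-world gains are smaller, since applications rarely saturate theoretical peak bandwidth.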

Following the i7-6700K and i5-6600K (with Z170 chipset motherboards) at Gamescom, Intel will launch other parts in the 6th generation Core processor family between late August and early September. Those launches will include the Core i7-6700/6700T; the Core i5-6600, 6500, 6400, 6600T, 6500T, and 6400T; and the H170 and B150 chipsets. Then, in late September, the company will launch the entry-level H110 chipset. 







----------



## Nihilus (Jun 19, 2015)

The 'tock' will be called Skylake, the 'tick' will be called Skynet.


----------



## Xzibit (Jun 19, 2015)

Nihilus said:


> The 'tock' will be called Skylake, the 'tick' will be called Skynet.



Doesn't it go: Tick, Tock, *BOOM!!!*


----------



## Haytch (Jun 19, 2015)

Waiting for the Extreme version of this processor.


----------



## the54thvoid (Jun 19, 2015)

Really need info on DX12 core usage. What will the differences be in gaming with 4 vs. 6 cores? This is Skylake's only 'weakness'. 

My 3930K will be 4 years old when these come to retail.


----------



## iSkylaker (Jun 19, 2015)

Hardware companies seem to be taking advantage of gaming conferences to announce their products; I'm pretty sure it was more than confirmed they would be launching these at IDF 15. But it makes sense: most gamers have their eyes on new game releases, and what better time or place than E3 or Gamescom to announce the hardware that will most likely power their gaming experience?


----------



## Uplink10 (Jun 19, 2015)

Four chipsets (Z170, H170, B150, H110) is a big mistake, because there is too much choice, and it's only because Intel wants to save money. I mean, seriously, TWO Hxxx chipsets? WTF!?!


----------



## mroofie (Jun 19, 2015)

Uplink10 said:


> Four chipsets (Z170, H170, B150, H110) is a big mistake because there is too much choice and just because Intel wants to save money. I mean seriously, TWO Hxxx chipsets? WTF!?!


Do some research, please. 

Haswell/Haswell Refresh was the same; nothing new.


----------



## mroofie (Jun 19, 2015)

the54thvoid said:


> Really need info on DX12 core usage.  What will the differences be in gaming with 4 to 6? This is Skylakes only 'weakness'.
> 
> My 3930k will be 4 yrs old when these come out to retail.


Well, from the charts we have seen, DX12 becomes worse with more cores, especially 8 O.O 6 is okay, but 4 is still the best.


----------



## Kaynar (Jun 19, 2015)

mroofie said:


> Well from the charts we have seen dx12 becomes worse with more cores especially 8 O.O 6 is okay but 4 is still the best




Uhmm? So what exactly do you mean, DX12 becomes worse with more cores? Do people with 6 or 8 cores have to disable them in the BIOS? Or will we just not see a linear increase in performance (like now, where 6/8 cores are pretty much useless for gaming)?


----------



## birdie (Jun 19, 2015)

Uplink10 said:


> Four chipsets (Z170, H170, B150, H110) is a big mistake because there is too much choice and just because Intel wants to save money. I mean seriously, TWO Hxxx chipsets? WTF!?!



Actually there are six chipsets for socket 1151.

And it's been like that for the past two generations.


----------



## Deleted member 138597 (Jun 19, 2015)

Skylake will be a game changer IMO, BUT... if there were some serious competition to Intel from any competitor (well, of course AMD), DDR4 and, more importantly, the PCIe 4.0 standard would probably have debuted with Skylake. However, in its defense, there's no real need for that anyway, soooooo........ cheers guys


----------



## Quattroking (Jun 19, 2015)

*The two quad-core chips will be built in the LGA1151 package*

Does that mean that I can't use those CPUs on my Intel LGA1150 motherboard?

And is LGA1150 about to die out as a socket?


----------



## Petey Plane (Jun 19, 2015)

Tom-Helge said:


> *The two quad-core chips will be built in the LGA1151 package*
> 
> Does that mean that i can't use those CPU's on my Intel LGA1150 motherboard?
> 
> And is LGA1150 about to die out as a socket?



Correct.  1151 socket CPUs are not compatible with 1150 sockets, and vice versa.  1151 = socket pin count.

Don't worry though: compared to your current system (in your system specs), you could only expect to see about a 5% increase (at best) in FPS in games with the Skylake chips and DDR4.  You can wait at least 2-3 more generations before upgrading.  DDR4 and PCIe 4.0 will have very little effect on game performance.  Cards today still can't fully saturate PCIe 3.0 lanes, which is why you see virtually no difference between PCIe 3.0 x8 and x16.  Same goes for RAM, with virtually no difference between DDR3-3000 and DDR3-1600 as far as game FPS goes.  If your primary use for your PC is gaming, your current CPU and RAM setup is good for at least 3 more years.  Spending money on a GPU is far more important.


----------



## FordGT90Concept (Jun 19, 2015)

Sign me up for an i7-6700 and H170.

Gamescom is a weird venue to choose, though.  I wonder if they plan on releasing more than just Skylake.


----------



## nickbaldwin86 (Jun 19, 2015)

Tom-Helge said:


> *The two quad-core chips will be built in the LGA1151 package*
> 
> Does that mean that i can't use those CPU's on my Intel LGA1150 motherboard?
> 
> And is LGA1150 about to die out as a socket?



You got it... don't know why anyone would think an 1151 chip can connect to an 1150 socket... but you did ask the question :|

LGA1150 will be a thing of the past... like 1155 and the other 100 sockets before those.


----------



## TheGuruStud (Jun 19, 2015)

With all the AMD drama.

REBRAND!


----------



## Quattroking (Jun 19, 2015)

Petey Plane said:


> Correct.  1151 socket CPUs are not compatible with 1150 sockets, and vice versa.  1151 = socket pin count.
> 
> Don't worry though, compared to your current system (in your system specs), you could only expect to see about a 5% increase (at best) for FPS in all games with the Skylake chips and DDR4.  You can wait at least 2-3 more generation before updating.  DDR4 and PCI 4.0 will have very little effect on game performance.  Cards todays still can't fully populate PCI 3.0 lanes, which is why you see virtually no difference between PCI 3.0 8x and 16x.  Same goes for RAM, with virtually no difference between DDR3 3000 and DDR3 1600, as far as game FPS goes.  If your primary use for your PC is gaming, your current CPU and RAM setup is good for at least 3 more years.  Spending money on a GPU is far more important.


OK. Well, yeah. I'm sure I will be fine for another 3-4 years with my computer, so that's not an issue.

I use my computer both for gaming and video rendering. But my GPU here will do a much better job at gaming and video rendering than any 'normal' CPU will be able to achieve anyway. So in my case, I'm in more need of a good GPU than a new CPU/RAM. The only things I need now are to replace my 8 GB of DDR3 RAM with 16 GB and get a new SSD.

My question is whether the LGA1150 socket will be replaced by LGA1151 or something else pretty soon?


----------



## Prima.Vera (Jun 19, 2015)

We should have 8 cores / 16 threads as cheap mainstream by now. But because of zero competition, we won't get this anytime soon...
I'm really curious what the difference will be between my i7 3770K and this i7 6770K in gaming performance... REALLY curious!


----------



## Petey Plane (Jun 19, 2015)

Tom-Helge said:


> My question is if the LGA1150 socket will be replaced by LGA1151 or something else pretty soon?



Correct.  The 1151 socket is replacing the 1150.  Only Skylake and subsequent CPUs will fit that socket.  Your current CPU will not fit an 1151 board, nor will a Skylake CPU fit an 1150 socket.


----------



## nickbaldwin86 (Jun 19, 2015)

Prima.Vera said:


> We should have by now 8 cores / 16 threads as cheap mainstream by now. But because of 0 (zero) competition, we won't get this too soon...
> I'm really curious what will be the difference between my i7 3770K and this i7 6770K on gaming perf.. REALLY curious!



Why 8 cores? Just to have bigger numbers, like AMD... which means nothing... what do you do that needs 8 cores?

4 fast cores are faster than 6 or 8 slow cores... moar! GHz!

Game performance is maybe 5 FPS more.

You want 6 cores... get a 5960X or a Xeon and get 12 cores.


----------



## Petey Plane (Jun 19, 2015)

Prima.Vera said:


> We should have by now 8 cores / 16 threads as cheap mainstream by now. But because of 0 (zero) competition, we won't get this too soon...
> I'm really curious what will be the difference between my i7 3770K and this i7 6770K on gaming perf.. REALLY curious!



You can expect about a 5% to 10% increase in game FPS going from the 3770K to the 6770K under best-case scenarios.  At resolutions of 1440p and above, the difference will be lower, more like 2% to 3%.  Some large-scale multiplayer games, like Battlefield 4 (and therefore Star Wars Battlefront and the eventual Battlefield 5) and PlanetSide, do benefit from a more powerful CPU, but it's still going to be less than a 10% difference at resolutions over 1080p.


----------



## Quattroking (Jun 19, 2015)

Petey Plane said:


> Correct.   The 1151 is replacing the 1150 socket.  Only Skylake and the subsequent CPUs will fit that socket.  Your current CPU will not fit a 1151 board, nor will a Skylake CPU fit a 1150 socket.


OK, so I'll guess there won't be any new LGA1150 CPUs from Intel then?

Not that I need a new one for a long time, but it's just good to know whether Intel has stopped producing new LGA1150 CPUs or not.


----------



## Petey Plane (Jun 19, 2015)

Tom-Helge said:


> Ok, so i'll guess there wont be any new LGA1150 CPU's from Intel then?
> 
> Not that i need a new one for a long time, but it's just good to know if Intel have stopped the production of new LGA1150 CPU's or not.



Right, the recently released Broadwell 5775C and 5675C _should_ be the last 1150 CPUs.  Those are targeted more toward small systems without dedicated GPUs.


----------



## Hood (Jun 19, 2015)

Uplink10 said:


> Four chipsets (Z170, H170, B150, H110) is a big mistake because there is too much choice and just because Intel wants to save money. I mean seriously, TWO Hxxx chipsets? WTF!?!


You might be thinking from an enthusiast's viewpoint only - in the broader sense, more choice is always a good thing - why pay for features you don't want or need?  This allows system builders to hit certain price points by using the cheapest chipset/motherboard for the desired feature set. Without these choices, fewer systems would be sold overall, since they'd all be more expensive.  If all the choices are confusing, just consult a simple chart before ordering any motherboard.

BTW, this doesn't save Intel money; it costs them more to offer these options.


----------



## slick530 (Jun 19, 2015)

Hey guys, what if I'm upgrading from the 2600K to this? Will I see any significant improvements? I'm planning to do SLI for my rig; would it be more viable to wait for the X series instead? Thanks!


----------



## slick530 (Jun 19, 2015)

Petey Plane said:


> You can expect about a 5% to 10% increase in game FPS going from the 3770K to the 6770K, under best case scenarios.  In resolutions of 1440 and above, the difference will be lower, more like 2% to 3%.  Some large scale multilayer games, like Battlefield 4 (and therefore Starwars Battlefront and the eventual Battlefield 5) and Planetside, do benefit from a more powerful CPU, but it's still going to be less than 10% difference at resolutions over 1080.



Yeah, we can all blame AMD for the slow advancements in CPUs. It's really a joke, actually, given how minuscule the advancement is. And this is not going to get better either; after Skylake, AMD is still 5 generations behind, given how pathetic their chipsets and "APUs" are. This makes me really mad ;(


----------



## mroofie (Jun 19, 2015)

TheGuruStud said:


> With all the AMD drama.
> 
> REBRAND!


How is Skylake a rebrand?


----------



## Petey Plane (Jun 19, 2015)

slick530 said:


> Hey guys, what if I'm upgrading from the 2600k to this. Will I see any significant improvements? I'm planning to do SLI for my rig, would it be more viable to wait for the X series instead? Thanks!



If your primary interest is gaming, you will see an FPS improvement of about 10% in games from the new CPU and DDR4 RAM at any resolution above 1080p.  I have a 2500K, and I'm going to wait one more generation myself, for Cannonlake, the 10nm die shrink of Skylake.

As far as SLI goes, you don't need an X board (or whatever they call the follow-up to 2011-v3) if you are only using 2 GPUs.  Two x8 lanes of PCIe 4.0 will be more than enough.  Even with PCIe 3.0, there is virtually no difference between a card running at x16 and x8.  Unless you're running a 3-way 4K setup, even modern GPUs can't fully saturate x16 PCIe lanes.  Save your money on the CPU and mobo and buy the most expensive GPUs your budget can afford.  You only need the X platform if you are thinking about 3 or more GPUs.
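To put rough numbers on the x8-vs-x16 point: usable PCIe throughput per link is the transfer rate times the encoding efficiency, so even a Gen 3 x8 link offers close to 8 GB/s per direction. A quick sketch using the textbook link rates (not measurements):

```python
# Usable PCIe bandwidth per link: lanes x transfer rate x encoding efficiency.
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding.
def pcie_gbs(lanes: int, gt_per_s: float, enc_eff: float) -> float:
    """Approximate usable bandwidth of a PCIe link in GB/s (per direction)."""
    return lanes * gt_per_s * enc_eff / 8  # 8 bits per byte

gen3_x8  = pcie_gbs(8,  8.0, 128 / 130)
gen3_x16 = pcie_gbs(16, 8.0, 128 / 130)

print(f"PCIe 3.0 x8:  {gen3_x8:.1f} GB/s")   # 7.9 GB/s
print(f"PCIe 3.0 x16: {gen3_x16:.1f} GB/s")  # 15.8 GB/s
```

Since GPUs of this era rarely sustain anywhere near 8 GB/s of traffic, the x8 link is not the bottleneck, which is why x8 vs. x16 benchmarks show virtually no difference.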


----------



## CrAsHnBuRnXp (Jun 19, 2015)

TheGuruStud said:


> With all the AMD drama.
> 
> REBRAND!


Skylake is a whole new architecture. 




Prima.Vera said:


> We should have by now 8 cores / 16 threads as cheap mainstream by now. But because of 0 (zero) competition, we won't get this too soon...
> I'm really curious what will be the difference between my i7 3770K and this i7 6770K on gaming perf.. REALLY curious!


I can't wait to see the difference going from an i5 2500K at stock (I'm OC'd at 4.5 GHz at the moment, but I'll run stock for comparison's sake) to the i5 6500K.


----------



## GorbazTheDragon (Jun 19, 2015)

I find it kinda funny how they went full circle with the naming scheme... They are basically back to SB...


----------



## Uplink10 (Jun 19, 2015)

birdie said:


> Actually there are  six chipsets for socket 1151.
> 
> And it's been like that for the past two generations.


I liked it very much when the 90 series debuted and there were only 2 chipsets (if you exclude X99), because there were only two choices, which meant less thinking and planning. After this series comes out there are going to be so many products on the market, and that is what I do not like. Take a look at this (https://www.asus.com/motherboards/) and select the Z97 chipset, and you will find two pages of motherboards with different configurations. Now add to that more companies and more chipsets, and you get a nightmare when buying a motherboard.




Hood said:


> BTW, this doesn't save Intel money, it costs them more to offer these options


Actually, it does save Intel money, because some chipsets, especially X99, are a lot costlier. But even if we exclude the overpriced X99, I can see how Intel could save money by producing only one chipset.


----------



## GorbazTheDragon (Jun 19, 2015)

Uplink10 said:


> Actually it does save Intel money because some chipsets especially X99 are a lot costlier. But if we exclude overpriced X99 then I can see how Intel could save money by producing only one chipset.


Why are you even bringing X99 up in this argument??? It's the most irrelevant chipset when talking about Intel giving too many options.

H81 is a good budget option, H87/97 is a good midrange option, and Z87/97 is good for enthusiasts. X99 is on a completely different platform, so it shouldn't even be considered by anyone who isn't going to need 8 cores or just wants to have a 2011 system... What's more, X99 boards use a completely different layout; there are many more connections to the chipset...


----------



## Uplink10 (Jun 19, 2015)

GorbazTheDragon said:


> H81 is a good budget option, H87/97 is a good midrange option, and Z87/97 is good for enthusiasts.


But it would be a lot better if only Zxxx were available, which has all the things the other, cheaper ones have, and buying a motherboard would be easier.



GorbazTheDragon said:


> Why are you even bringing X99 up in this argument??? It's the most irrelevant chipset when talking about intel giving too many options.


You are right, I shouldn't have brought it up since it was not part of the argument, but it does show that too many chipsets per socket/platform can be troubling (6 chipsets for 1151), while one per socket/platform can simplify things (the X-type chipsets).


----------



## GorbazTheDragon (Jun 19, 2015)

Q and B chipsets are rarely used outside of business AIOs.

Remember Ivy Bridge? That had two Z chipsets... I'd say ATM it is much less confusing now than it was back then.

And what about back on 775, when NVidia also made chipsets...


----------



## zzzaac (Jun 20, 2015)

I wonder if this will be the time to upgrade my Ivy Bridge.


----------



## iSkylaker (Jun 20, 2015)

Uplink10 said:


> But it would be a lot better if there would be only Zxxx available which has all the things the other cheaper ones have and buying a motherboard would be easier.


Wow, are you serious? Encourage people that don't care about overclocking, or about getting all the features they offer, to buy those $140+ chipsets?

Actually, the opposite happens. From what I have seen, most people only know about the Z chipsets and higher-end K series from Intel; for people on the fence about jumping to Intel, this is the argument they bring: "idk, an Intel rig costs too much", "you have to spend a lot on a motherboard"... etc., and that's because they don't know about CHEAPER options like B85, H81, H87/97, or CPU options like the i5-4440, i5-4460, i5-4560 that aren't all about overclocking but still deliver great performance over AMD. 

Chipsets like B81 are getting known as of late because of the Pentium G3258 buzz that has the eye of every budget gamer; it can be overclocked even on those cheap chipsets.


----------



## NC37 (Jun 20, 2015)

nickbaldwin86 said:


> why 8 cores.?.?. just to have bigger numbers like AMD... which means nothing... what do you do that needs 8 cores?
> 
> 4 fast cores faster then 6 or 8 slow cores... moar! Ghz!
> 
> ...



Because competition would have pushed core counts higher and thus leveraged Microsoft. Likely we'd have seen something like Windows 10 much earlier. Intel has stagnated to the point that now other industries are pressing Microsoft for better multi-core awareness. Gaming is certainly one of them, given that all the consoles are 8-core machines, with extremely weak cores that desperately need multithreading to even function. 

Still, why stop at 6 cores? Why not make 8 the optimal setup for Windows 10? The answer is likely Intel. I suspect we'll see more 6-core CPUs from Intel hitting outside the server/workstation market. They've already done it within the last couple of years. I wouldn't be surprised if eventually i7s were all 6 cores and up. Then you'd have i5s being quads and i3s being duals. So 6/12, 4/8, 2/4. If prelim reports are correct and AMD CPUs become viable again thanks to the changes in Win 10, it would be a natural shift for Intel. They'd have to, because the only area AMD CPUs could beat Intel in was multithreading, and perhaps Win 10 pronounces that even more. Thus 8-core optimization would have hurt Intel worse. Especially with Zen coming. 


----------



## GorbazTheDragon (Jun 20, 2015)

FX is not going to get any less unviable than it already is...


----------



## cyneater (Jun 20, 2015)

slick530 said:


> Yeah, we can all blame AMD for the slow advancements in CPUs. It's really a joke, actually, given how minuscule the advancement is. And this is not going to get better either; after Skylake, AMD is still 5 generations behind, given how pathetic their chipsets and "APUs" are. This makes me really mad ;(



We can also blame Sun, SGI, HP, IBM PowerPC (Apple) and Digital... all the other CPU vendors that failed.

Let's face it: 10-15 years ago Intel held only one or two cards. They had huge competition in the server market and in high-end specialized workstations. Even AMD went 64-bit before Intel....

Now Intel holds a full deck. They have no competition.
They control server, workstation, gaming.

Back then CPU advancements were huge; now they're tiny. But everyone seems to be like, OMG, new revision, have to upgrade everything. It doesn't help that developers are lazy as well and won't program any games that only 30% of the PC market, or people with high-end machines, can run.

Let's face it, PCs aren't as sexy as they used to be. Programmers are lazy and everyone wants to make as much $$$ as possible. Without any competition, Intel will continue to slowly trickle-feed the technology.


----------



## Uplink10 (Jun 20, 2015)

iSkylaker said:


> wow are you serious? encourage people that don't care about overclocking or getting all the feature they offer to buy those $140+ chipsets?


First of all, you can get a new motherboard with the Z97 chipset for less than $100 (http://www.newegg.com/Product/Produ...scription=z97&bop=And&Order=PRICE&PageSize=30), and secondly, in my case where there would be only one chipset for 1151 CPUs, I did not mean that it would be as expensive as the Zxxx chipsets, but more along the lines of the Hxxx and Bxxx chipsets.


----------



## GorbazTheDragon (Jun 20, 2015)

I wouldn't OC any quad core on those VRMs...


----------



## iSkylaker (Jun 20, 2015)

Uplink10 said:


> Fist of all, you can get a new motherboards with Z97 chipset for less than a $100 (http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007627 600438202 4814&IsNodeId=1&Description=z97&bop=And&Order=PRICE&PageSize=30)


But there is no need to spend even $100 if there are $50-$80 options that can be matched to people's wallets and needs.



Uplink10 said:


> and secondly in my case where there would be only one chipset for 1151 CPUs I did not mean that it would be as expensive as Zxxx chipsets but more along the lines of Hxxx and Bxxx chipsets.


Well, that's exactly what they are for: Hxx is cheaper, with no overclocking but the same features as Z, and Bxx is cheaper, with no overclocking and fewer features than Z.


----------



## cheesy999 (Jun 20, 2015)

cyneater said:


> Back then CPU advancements where huge now its tiny. But everyone seems to be like OMG new revison have to upgrade everything. It dosn't help developers are lazy as well and wont program any games that only 30% of the PC market or people with high end machines can run.
> 
> Lets face it PC's arent as sexy as they use to be. Programers and lazy and everyone wants to make as much  $$$ as possible. With out any competition Intel will continue to slowly trickle fed the technology.



*It doesn't help that developers are *clever* as well and won't program any games that only 30% of the PC market, or people with high-end machines, can run.


----------



## GreiverBlade (Jun 20, 2015)

btarunr said:


> The integrated memory controller of "Skylake" CPUs support both DDR3 and DDR4 memory standards, and should prove to be a transition point between the two.


Great... it would hurt to lose a "still plenty" 2x8GB DDR3-2400 C10 kit... waiting on the reviews of the CPU is now the main plan, though I don't think Skylake will be a worthy upgrade over a Haswell DC i5-4690K... (what do they expect, a +10% increase?)
And DDR4... is not really a "BIG" improvement... 

Oh well, after that little post: no upgrade before the next-next line from Intel (or maybe Zen from AMD... who knows...)


----------



## Assimilator (Jun 20, 2015)

GorbazTheDragon said:


> I find it kinda funny how they went full circle with the naming scheme... They are basically back to SB...



You mean Core 2 / Conroe?



cyneater said:


> Back then CPU advancements where huge now its tiny.



Completely untrue, because today's CPUs do far more with far less power than the CPUs of yesteryear. Example: the Athlon 64 3200+ is a single-core design that consumes 89W and was released in 2003. The Celeron J1800 is a dual-core that consumes 10W and was released in 2013. The Celeron outperforms the Athlon while consuming far less power - in other words, it took 10 years to lower the power consumption by roughly a factor of 10. That's pretty damn impressive any way you look at it.
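The arithmetic behind that factor, using just the rated TDPs quoted above (and since the Celeron also outperforms the Athlon, performance per watt improved by at least this much):

```python
# Rated TDP figures from the example above (rated, not measured).
athlon64_3200_tdp_w = 89   # single core, released 2003
celeron_j1800_tdp_w = 10   # dual core, released 2013

factor = athlon64_3200_tdp_w / celeron_j1800_tdp_w
print(f"Power reduced ~{factor:.1f}x in 10 years")  # ~8.9x
```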


----------



## Hood (Jun 20, 2015)

I hear Elon Musk is considering starting a new company that makes high-end GPUs and CPUs, which would certainly throw a wrench in the works...but he's probably too busy re-inventing several other industries.   That headline would scare the crap out of Intel, nVidia, and AMD, considering his past efforts at disruption. Just attaching his name to the new company would guarantee massive capital investment.  C'mon, Iron Man, we know you're into computers and gaming, how 'bout it?  Just as a hobby?


----------



## btarunr (Jun 21, 2015)

Hood said:


> I hear Elon Musk is considering starting a new company that makes high-end GPUs and CPUs, which would certainly throw a wrench in the works...but he's probably too busy re-inventing several other industries.   That headline would scare the crap out of Intel, nVidia, and AMD, considering his past efforts at disruption. Just attaching his name to the new company would guarantee massive capital investment.  C'mon, Iron Man, we know you're into computers and gaming, how 'bout it?  Just as a hobby?



Nope, Musk cannot create CPUs or GPUs, at least not the ones that can ever compete with Intel, AMD, and NVIDIA, no matter how much money he throws at it. You see, there's this thing called IPR and patents. The modern x86 CPU and PC GPU are a result of an IPR cross-licensing clusterfvck between Intel, NVIDIA, and AMD, which leaves no room for additional players. At best, Musk can create chips that compete with Samsung and Qualcomm in the mobile device space.


----------



## xenocide (Jun 22, 2015)

cyneater said:


> Even amd went 64bit before intel....


 
That's not entirely true.  Intel released a 64-bit CPU a bit before AMD.  Intel developed and released the Itanium line in 2001, which was a completely 64-bit CPU.  The problem was it didn't run x86 programs that were written for 32-bit CPUs.  So in 2003 AMD released the Athlon 64s, which used the AMD64 instruction set and were backwards compatible with 32-bit programs.  If AMD had embraced a straight x64 approach, all programs would be 64-bit by now, but we've slowly migrated over because AMD offered a "better" solution for the transition from x86 to x64.


----------



## GreiverBlade (Jun 22, 2015)

xenocide said:


> That's not entirely true.  Intel released a 64-bit CPU a bit before AMD.  Intel developed and released the Itanium line in 2001, which was a completely 64-bit CPU.  The problem was it didn't run x86 programs that were written for 32-bit CPU's.  So in 2003 AMD released the Athlon 64's which used the AMD64 instruction set, and was backwards compatible with 32-bit programs.  If AMD had embraced a straight x64 approach all programs would be 64-bit by now, but we've slowly migrated over because AMD offered a "better" solution to the transition for x86 to x64.


OK, let's say the first 64-bit "normal" consumer CPU (and an affordable one)... was AMD's... (and would a pure 64-bit transition, with no legacy support, have been good? I don't think so, so then: thanks, AMD)


----------



## Disparia (Jun 22, 2015)

GreiverBlade said:


> ok let say the 1st 64bit "normal" customer cpu (and affordable ...) ... was AMD ... (and would have been a pure 64 bit transition good? no legacy support? i don't think so , so then thanks AMD  )



Intel had a 5-6 year 64bit top-down transitional plan. First the Itanium, then Xeons, then Pentiums and so on. AMD jumped the gun and caused a bit of havoc before they could get out the IA64 Xeons (which would have spurred most of the development of IA64 consumer applications). Today you will still get opinions from either side since some would have preferred pure 64bit with any legacy handled by emulation while others liked the more seamless albeit slow transition to 64bit.


----------



## FordGT90Concept (Jun 22, 2015)

Itanium was IA-64.  Intel had no plans to bring 64-bit to consumers (other than forcing IA-64 on everyone, anyway).  AMD introduced AMD64 (x86-64) with Athlon 64 (Clawhammer).  Intel introduced EM64T (today called Intel 64) with Xeon (Nocona) and Pentium 4 (Prescott).  IA-64 is effectively dead except for the systems that already have them.


----------



## Disparia (Jun 22, 2015)

FordGT90Concept said:


> Itanium was IA-64.  Intel had no plans to bring 64-bit to consumers (other than forcing IA-64 on everyone, anyway).  AMD introduced AMD64 (x86-64) with Athlon 64 (Clawhammer).  Intel introduced EM64T (today called Intel 64) with Xeon (Nocona) and Pentium 4 (Prescott).



So they had no plans concerning 64-bit for consumers, except for the plan to force 64-bit on consumers? I don't really know where you're going with that.

Simply put, they _had_ plans for IA64 dominance, and AMD really upset those plans.

Wikipedia citing a 2006 TechWorld article,


> Although Itanium did attain limited success in the niche market of high-end computing, Intel had originally hoped it would find broader acceptance as a replacement for the original x86 architecture.



University of Washington CS course paper from 2007:


> Ultimately, IA64 saw modest market share in the server market and failed to break into the client. Meanwhile, with AMD64, former second-source AMD was able to rise from nearly 0% server market share to, as Weber described, nearly 25% share. Though not a complete failure for Intel, IA64 failed to deliver on its original vision as a replacement for x86.



A little harder is finding the exact information from back in 1999-2001 detailing their top-down approach for the replacement of x86 (Itanium > Xeon > Pentium). I remember this, as it was a very exciting time. I can probably go through my boot magazines if I can't find an Anandtech, Ars, or Tom's Hardware article from back then. Intel of course never got to the IA64 Xeon stage because, with the first Itanium doing so poorly and AMD's announcement being so hopeful (good 64- and 32-bit performance), they decided on making EM64T Xeons instead.



Not to tread too far off-topic: I am looking forward to Skylake, more for the chipsets than anything else. Get me a moderately priced H170 ITX board with an i5-6600! More than enough lanes there for an Ultra M.2 (why isn't it called M.4? Ughh....)


----------



## <-_-> (Jul 10, 2015)

Core i7-6700K (ES): CPU-Z and GPU-Z screenshots, plus a Cinebench comparison of the Core i7-6700K (ES) vs. the Core i7-4790K.

Source: http://www.techbang.com/posts/24629


----------



## Frick (Jul 10, 2015)

Assimilator said:


> You mean Core 2 / Conroe?
> 
> 
> 
> Completely untrue, because today's CPUs do far more with far less power than the CPUs of yesteryear. Example: the Athlon 64 3200+ is a single-core design that consumes 89W and was released in 2003. The Celeron J1800 is a dual-core that consumes 10W and was released in 2013. The Celeron outperforms the Athlon while consuming far less power - in other words, it took 10 years to lower the power consumption by a factor of 10. That's pretty damn impressive any way you look at it.



Not to mention you can only do so much with silicon, and we're still on x86. I don't think our CPUs would have been a million times faster than they are now even if all those old players were active. Look at AMD/NVIDIA and how they've been stuck on 28nm, and they have every incentive to make things as fast as possible. Plus, average Joes (and even me) do not need anything faster than the old Core 2 CPUs, and they're interested in phones and tablets these days, not Pentiums and Athlons.


----------



## FordGT90Concept (Jul 10, 2015)

Jizzler said:


> So they had no plans concerning 64bit for consumers, except for the plan to force 64bit for consumers? I don't really know where you're going with that


IA64 is completely unrelated to x86-64.  x86 will not run on IA-64 and vice versa.

AMD, by launching x86-64, tied Intel's hands.  x86-64 did pretty much everything IA64 did but also maintained backwards compatibility.  Intel knew IA64 was done at that point, so they rushed to slap EM64T on Pentium 4 chips.


I'm planning on buying a Skylake-S 6700 ASAP.


----------

