Thursday, May 3rd 2018

AMD & Intel Roadmaps for 2018 Leaked

Bluechip Computer, a German IT distribution company, has inadvertently spilled the beans on AMD's and Intel's plans for the remainder of this year, shedding some more light on a number of products whose existence was still somewhat shrouded in fog. The information comes straight from a webinar Bluechip presented to its industry partners - a 30-minute presentation that made its way to YouTube.

The information gleaned is a confirmation, of sorts, of AMD's planned launch of its Z490 platform in June; the B450 chipset coming a little later, in July (an expected product, in every sense); and AMD's second-gen Threadripper, a known quantity already, which should accompany an X399 platform refresh.
On Intel's side of the camp, some ill-kept secrets have apparently been confirmed: the launch of the company's Z390 chipset, for instance, is expected in Q3 2018 - some time after Computex, which might mean a relatively sparse landscape of teases and new product announcements based on Intel's upcoming top-of-the-line chipset. Oh, and that unicorn of an 8-core Coffee Lake part is apparently being prepped for Q4 2018, with silicon moving into partner hands as early as June.
Sources: AnandTech, Bluechip Webinar (since removed)

53 Comments on AMD & Intel Roadmaps for 2018 Leaked

#26
evernessince
dwadeAMD has always been a copycat. When Intel and Nvidia releases something new, they try to imitate their every move. TressFX -> Hairworks, RTX -> ProRender, G-Sync -> FreeSync, x79 x99 x299 z270 -> x399 x370, i3 i5 i9 -> r3 r5 r7. i9 -> R9 is going to happen. AMD is like the little brother always looking up to the bigger bros.
AMD invented x64, was the first to reach 1 GHz, was the first to have a dual-core server chip, first quad-core server chip, first to multi-screen gaming (Eyefinity), first to integrate an x86 processor with an iGPU that can play games, making an APU, first to fully support DX11, first to support DX12, and first x86 processor to use an MCM.

You seem to have a very selective memory. Those are just off the top of my head, I'm sure I'm missing more.
Posted on Reply
#27
AnarchoPrimitiv
dwadeAs if x399 wasn’t bad enough. AMD Z490... monkey see monkey do
kinda like x64 architecture with intel
evernessinceAMD invented x64, was the first to reach 1 GHz, was the first to have a dual core server chip, first quad core server chip, first to multi-screen gaming (eyefininity), first to integrate a x86 processor with an iGPU that can play games making an APU, first to fully support DX11, First to DX support DX12, and first x86 processor to use a MCM.

You seem to have a very selective memory. Those are just off the top of my head, I'm sure I'm missing more.
Don't forget about developing GDDR3, GDDR5, and HBM
Posted on Reply
#28
R-T-B
AnarchoPrimitivDon't forget about developing GDDR3, GDDR5, and HBM
They didn't "develop" it.
Posted on Reply
#29
Bones
oxidizedX299 and X99 are Intel's, it's pretty predictable how the next one will be called, AMD on the other hand...

No AMD chipset was ever called Zxxx in recent history, at least.

It's so blatant what they're trying to do, I don't understand why you purposely keep ignoring that... Actually, I might know.
If you happen to know, then tell us... and tell me too, so I'll know as well.
I run both Intel and AMD stuff here, so I'm not a fanboy of either make; my current DD rig is a 7700K-based setup and I'm perfectly happy with it.
As for the other comment about AMD always having been, and only ever being, a copycat - that's just wrong.

AMD has led more than once before. One such instance was the "Itanic" disaster, which had Intel trying to copy what AMD had already succeeded in doing (64-bit x86 CPUs), yet Intel couldn't make it work themselves.
Everything they tried blew up in their faces and had them scrambling like headless chickens to figure out HOW 64-bit chips worked.
They finally had to make a deal with AMD to learn how it worked so they could make these on their own.

That's why for a while back then (about 2004-2005) AMD was pushing forward with 64-bit CPUs while Intel kept cranking out 32-bit chips and pushing the clocks higher and higher.
After they finally made a deal and learned how it worked, Intel used their deep pockets to run away with the performance crown - that's how it went down back in the day.

What serves my needs is what I go for and I don't care who makes it, my main concern is getting what I need for the best price possible.
Unfortunately Intel is always the higher-priced stuff, and for my needs it doesn't make a lot of sense to spend more than you have to.

I do hope, based on the roadmap, we have some good stuff from both camps coming soon. I see a lot of griping about "nothing new" to come, and now that we have something of a schedule shown, at least we have an idea of when it should be.
BTW, speaking of AMD copying, isn't Intel the one currently having problems getting 10 nm out the door, and isn't AMD the one already with "at least" something to show for 7 nm?
There's a thread in here somewhere on that.
Posted on Reply
#30
cdawall
where the hell are my stars
I am just going to go through these a bit here.
evernessinceAMD invented x64,
AMD holds the patents for x86-64. RISC-based computing used 64-bit as early as 1975. In 1989 Intel had 64-bit processing available. In 1994 Intel started development on IA-64, which became the original, short-lived Itanium products. It was not until 1999 that AMD released their instruction set, which became x86-64; this later became EM64T for Intel.

en.wikipedia.org/wiki/64-bit_computing
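As an aside on the terminology being argued here: "fully 64-bit" in practice means 64-bit registers and addressing, not just 64-bit arithmetic. A minimal, illustrative Python sketch (standard library only; nothing here is specific to any vendor) of checking what your own build and machine report:

```python
import struct
import sys
import platform

# A 64-bit Python build has 64-bit pointers, so sys.maxsize is 2**63 - 1.
is_64bit_build = sys.maxsize > 2**32

# struct reports the pointer width of this interpreter directly ("P" = void*).
pointer_bits = struct.calcsize("P") * 8

# platform.machine() reports the CPU architecture string, e.g. "x86_64" or "AMD64".
arch = platform.machine()

print(f"{pointer_bits}-bit build on {arch}")
```

This only tells you about the build you are running, of course - a 32-bit OS or interpreter on a 64-bit CPU will still report 32 bits.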
evernessincewas the first to reach 1 GHz,
This one is actually true.
evernessincewas the first to have a dual core server chip,
Minus the ones IBM did; POWER4 was the actual first multi-core CPU.

www-03.ibm.com/ibm/history/ibm100/us/en/icons/power4/
evernessincefirst quad core server chip,
Kentsfield-based X32x0 Xeons shipped 7 January 2007; it wasn't until November 19, 2007 that AMD dropped Agena B2-stepping products, and those all had the wonderful TLB bug.
evernessincefirst to multi-screen gaming (eyefininity),
Arcades in the 1990s had multi-screen gaming already. AMD may have been the first to brand the tech and distribute it to consumers, but it had long since existed - for example, Sega's F355 Challenge from 1999, which used three 28" monitors for the sit-down cockpit version.

en.wikipedia.org/wiki/Multi-monitor
evernessincefirst to integrate a x86 processor with an iGPU that can play games making an APU,
This isn't even worth a source... It is not correct and is based on AMD marketing. Their iGPU was a trashcan fire, just less of a trashcan fire than Intel's.
evernessincefirst to fully support DX11,
Correct
evernessinceFirst to DX support DX12,
I like how you adjusted this vs. DX11. Technically the Fermi series of cards is DX12 (feature level 11_0) compliant, so it is still incorrect. The first fully compliant DX12 GPU was Nvidia's Maxwell.

www.extremetech.com/computing/190581-nvidias-ace-in-the-hole-against-amd-maxwell-is-the-first-gpu-with-full-directx-12-support
evernessinceand first x86 processor to use a MCM.
No. That would be Intel with the Pentium D. In May 2005, Intel released the Pentium D, which took an MCM of two P4s and had them talk across the FSB.

en.wikipedia.org/wiki/Pentium_D
evernessinceYou seem to have a very selective memory. Those are just off the top of my head, I'm sure I'm missing more.
You should check yourself.
Posted on Reply
#31
DeathtoGnomes
R-T-BI think everyone here knows I am not an Intel spin doctor...
everyone knows it's too easy to not spin for the other team. :kookoo:
Posted on Reply
#32
evernessince
cdawallI am just going to go through these a bit here. [...]
"RISC based computing used 64 bit as early as 1975"

No, some supercomputers had 64-bit integer arithmetic and 64-bit registers, but not all the CPU stages were 64-bit. That is a requirement.

In addition, from the webpage you linked:

"Intel i860[4] development began culminating in a (too late[5] for Windows NT) 1989 release; the i860 had 32-bit integer registers and 32-bit addressing, so it was not a fully 64-bit processor "

So your claim of Intel having a 64-bit processor in 1989 is not correct.


"Minus the ones IBM did, actual first multi core CPU. "

That's if you consider the POWER4 a true dual core, which is very debatable given that it shares L2 and L3 cache among all the cores. In addition, the L3 cache has to go through both the fabric and the NC units before it even gets to the processor. Technically speaking, this doesn't meet either AMD's or Intel's current definition of "cores".


"Kentsfield based X32x0 Xeons shipped 7 January 2007, it wasn't until November 19, 2007 that AMD dropped Agena B2 stepping products out and those all had the wonderful TLB bug. "

phys.org/news/2006-12-amd-world-native-quad-core-x86.html


"Arcades in the 1990's had multiscreen gaming already. AMD may have been the first to brand the tech and distribute it to consumers, but it long since had existed. Example being Sega's F355 Challenge from 1999 which again used 3 28" monitors for the sit-down cockpit version. "

Off topic. I couldn't care less what they did in arcades. I guess I should have been more specific, since you will nitpick. Of course, in a PC-related article on a PC enthusiast website, I meant in relation to PCs. I do not go into boxing forums and say "actually no, I'm the first person to knock out Floyd Mayweather in a professional bout in his home arena... in a video game".

"This isn't even worth a source...It is not correct and is based off of AMD marketing. Their iGPU was a trashcan fire, just less of a trashcan fire as Intel's. "

Here, let me

techterms.com/definition/apu


"I like how you adjusted this VS DX11. Technically the Fermi series of cards is DX12 (feature level 11.0) compliant. So it is still incorrect. The first fully compliant DX12 GPU was Nvidia with Maxwell. "


Not even Pascal has full DX12 support yet, and even worse, a portion of the features has to be emulated.



"No. That would be Intel with the Pentium D. May of 2005, Intel released the Pentium D which took an MCM of two P4's and had them talk across the FSB. "

If that's your interpretation of an MCM, then technically the IBM POWER4 and many other processors qualify as well. Go read the link you provided earlier; IBM connected up to four chips over their data fabric. Of course there are serious differences between AMD's implementation and Intel's/IBM's.


I don't know what kind of day you're having, but nitpicking someone else's post as the superior fact man and failing at it isn't doing anyone any good. I'll admit I'm not correct all the time, but I did not deserve a reply in the tone you provided.
Posted on Reply
#33
JalleR
Now we know why the employees are running away. Childish games, no thank you... :D
Posted on Reply
#34
oxidized
And here you are moving away from the topic, talking about how AMD invented this and that, and how intel copied them with x64 cpus, etc...
Posted on Reply
#35
zo0lykas
Did you ever think that they use similar code names just because it makes them easy to compare to each other?
And it doesn't confuse people.
dwadeAMD has always been a copycat. When Intel and Nvidia releases something new, they try to imitate their every move. TressFX -> Hairworks, RTX -> ProRender, G-Sync -> FreeSync, x79 x99 x299 z270 -> x399 x370, i3 i5 i9 -> r3 r5 r7. i9 -> R9 is going to happen. AMD is like the little brother always looking up to the bigger bros.
Posted on Reply
#36
Vayra86
10 points for the first actually on topic post in this thread.

No idea how I'm going to give out 10 points but still, try it
Posted on Reply
#37
Gungar
cdawallIf I was Intel I would change the naming scheme. Like cool AMD wants X399, that is fine welcome to the intel OVER9000. TOP THAT AMD.
Haha so true.
Posted on Reply
#38
Xpect
dwadeAMD has always been a copycat. When Intel and Nvidia releases something new, they try to imitate their every move. TressFX -> Hairworks, RTX -> ProRender, G-Sync -> FreeSync, x79 x99 x299 z270 -> x399 x370, i3 i5 i9 -> r3 r5 r7. i9 -> R9 is going to happen. AMD is like the little brother always looking up to the bigger bros.
If I remember correctly, TressFX is a software library made by AMD, first used in the early-2013 game Tomb Raider, while Hairworks is just a 3ds Max and Maya plugin made by Nvidia AFTER TressFX was announced and released. Hairworks was then quickly patched into the late-2013 game CoD: Ghosts. So yeah, great copycatting there.

And well, first GPU to 1 GHz, first consumer x64 CPU, first CPU to 1 GHz as well, HBM development, etc...

Tbh, the only thing AMD is missing, is a bigger backing by third party software developers. AMD is innovating and making new stuff, Nvidia and Intel are just repurposing old stuff and finetuning it.

Just my 2 Pfennig
Posted on Reply
#39
Dartenor
dwadeAMD has always been a copycat. When Intel and Nvidia releases something new, they try to imitate their every move. TressFX -> Hairworks, RTX -> ProRender, G-Sync -> FreeSync, x79 x99 x299 z270 -> x399 x370, i3 i5 i9 -> r3 r5 r7. i9 -> R9 is going to happen. AMD is like the little brother always looking up to the bigger bros.
I agree on most things, but you can't lump everything together:
-TressFX's announcement was in February 2013 and Hairworks' was in October 2013
-FreeSync and G-Sync were both announced in the same week in March 2015
So in the first two cases you can't really claim that AMD copied NVIDIA, while on the latter ones I agree with you
Posted on Reply
#40
phill
I love threads like this, it goes from information to AMD v Intel v everything else lol

I feel like quoting Jack Nicholson from Mars Attacks when he says, "Why can't we all just get along??"

I'll disappear now :) ....
Posted on Reply
#41
ssdpro
dwadeAs if x399 wasn’t bad enough. AMD Z490... monkey see monkey do
This was as far as I needed to read in the comments. Perfect abstract from the article. AMD has two feet - one foot is value the other foot is socket/chipset stability. If you keep the socket but market 3 chipsets in the same calendar year you have shot one foot.
Posted on Reply
#42
LiveOrDie
I don't see anything about Intel's X399 on here? X299 refresh?
Posted on Reply
#43
cdawall
where the hell are my stars
evernessince"RISC based computing used 64 bit as early as 1975

No, some super computers had 64 bit integer arithmetic and 64 bit registers but all the CPU stages were not 64-bit. That is a requirement.

In addition from the webpage you linked

"Intel i860[4] development began culminating in a (too late[5] for Windows NT) 1989 release; the i860 had 32-bit integer registers and 32-bit addressing, so it was not a fully 64-bit processor "

So your claim of Intel having a 64 bit processor in 1989 is not correct.
Intel had processors that could do 64-bit computing, and had fully 64-bit processors, before AMD. IA-64, as short-lived as it was, was still first. I notice you skipped that part of what I said to nitpick (in your own words) between fully 64-bit and partially 64-bit products. Since you don't get to decide what is and what is not 64-bit, I guess we will have to leave that to the computing world as a whole.
evernessince"Minus the ones IBM did, actual first multi core CPU. "

If you consider the power 4 a true dual core, which is very debatable given it shares L2 and L3 cache among all the cores. In addition, the L3 cache has to go through both the fabric and the NC units before it even gets to the processor. Technically speaking this doesn't mean either AMD's or Intel's current definition of "cores".
So is AMD FX a quad core or an octa core? The same argument about shared parts can be made for that. If you want to call it something else, make sure to scroll down and view Intel's Pentium D, which still predated the A64 X2.
evernessince"Kentsfield based X32x0 Xeons shipped 7 January 2007, it wasn't until November 19, 2007 that AMD dropped Agena B2 stepping products out and those all had the wonderful TLB bug. "

phys.org/news/2006-12-amd-world-native-quad-core-x86.html
You should read all the way through that. It matches the timeline I gave (oddly enough, the dates have never changed for past releases).

AMD, trying to garner additional sales, used a marketing ploy claiming a "native quad core" is better than Intel's MCM setup - which absolutely zero performance tests showed. Intel released the first quad core; AMD released the first "native" quad core.
evernessince"Arcades in the 1990's had multiscreen gaming already. AMD may have been the first to brand the tech and distribute it to consumers, but it long since had existed. Example being Sega's F355 Challenge from 1999 which again used 3 28" monitors for the sit-down cockpit version. "

Off topic. I could care less what they did in arcades. I guess I should have been more specific as you will nitpick. Of course on a PC related article on a PC enthusiast website I meant in relation to PCs. I do not go into boxing forums and say "actually no, I'm the first person to knock out Floyd Mayweather in a professional bout in his home arena, in a video game".
So the only real video games are the ones on a PC? Curious. I was actually kind of excited when AMD took Matrox's basic tech mainstream.
evernessince"This isn't even worth a source...It is not correct and is based off of AMD marketing. Their iGPU was a trashcan fire, just less of a trashcan fire as Intel's. "

Here, let me

techterms.com/definition/apu
So are you talking about APUs, or anything with an iGPU? There was no distinction in your other post. APU is merely another term AMD used to market a product. If you notice, even the substantially better-performing Iris-based Intel products are not referred to as APUs, nor is the new Vega (Polaris)-based Intel MCM.
evernessince"I like how you adjusted this VS DX11. Technically the Fermi series of cards is DX12 (feature level 11.0) compliant. So it is still incorrect. The first fully compliant DX12 GPU was Nvidia with Maxwell. "


Not even Pascal has full DX 12 support yet and even worse a portion of the features have to be emulated.
www.extremetech.com/computing/190581-nvidias-ace-in-the-hole-against-amd-maxwell-is-the-first-gpu-with-full-directx-12-support

According to Microsoft, asynchronous compute isn't a requirement to meet the full DX12 flag. If memory serves correctly, that was an after-the-fact add-on.
evernessince"No. That would be Intel with the Pentium D. May of 2005, Intel released the Pentium D which took an MCM of two P4's and had them talk across the FSB. "

If that's your interpretation of a MCM then technically the IBM Power4 and many other processors quality as well. Go read the link you provided earlier, IBM connected up to four chips over their data fabric. Of course there are serious difference between AMD's implementation and Intel's / IBMs.
You are absolutely correct. What's cool is that CPU is also actually one of the first chips to get an L3 cache. Pretty neat little chips. Really shows how IBM paved the way for most of what we see in the market now.
evernessinceI don't know what kind of day your having but nitpicking someone else's post as the superior fact man and failing at it isn't doing anyone any good. I'll admit I'm not correct all the time but I did not deserve a reply in the tone you provided.
AMD took tech that existed, rebranded it, and then marketed it as their own. Very little as of late has been new. There was no tone attached to anything; there was misinformation on the page that needed to be shut down. I see you found a couple of nuances in what I posted. I enjoy getting corrected - it grows me, and it's better for the forums to have the right information spread around as opposed to the wrong stuff.
Posted on Reply
#44
Gasaraki
cdawallI am just going to go through these a bit here. [...]
Actually, the Pentium III 1 GHz came out in Q1 2000. The T-Bird 1 GHz came out in June 2000. So no, AMD was not first to 1 GHz.
Live OR DieI don't see anything about intels X399 on here? X299 refresh?
LOL, see what happens when AMD starts naming their stuff exactly like Intel's.
X399 is already "taken" by AMD, so Intel can't do an X399.
Posted on Reply
#45
n-ster
Not to be rude, but... nobody cares about which steam achievements AMD or Intel got first.

I'm not really sure how PCIe lanes work (CPU vs. chipset, and the communication between both), but I know I wish there was room for an extra x4 NVMe drive, even if it's a PCIe-slot drive
Posted on Reply
#46
cdawall
where the hell are my stars
GasarakiActually Pentium 3 1GHz came out in Q1 2000. T-Bird 1GHz came out in June 2000. So no AMD was not first to 1GHz.
There were a lot of arguments about who was first. From the articles of the time, AMD had OEM parts out through Compaq and Gateway first. Neither company had a solid launch of 1 GHz products.
Posted on Reply
#47
Bones
I'm gonna sum things up on my part.
We can complain and take jabs at each other all we'd like over who was first at what and why, based on a roadmap or really any release schedule or the names given to things - thing is, we're not the ones making the decisions about it.

If I had a seat at the table for deciding such things, then maybe what I'd say would really matter, but I don't, so anything I have to say doesn't... And it's the same for everyone here that I know of.

So... Why even argue about it?
It's pointless.
Posted on Reply
#48
TheGuruStud
cdawallI am just going to go through these a bit here. [...]
Are you seriously trying to compare the FSB to HyperTransport for multi-core CPUs? Bwahahahaha. You must be joking. There's a reason it wasn't really used... the bus sucks. Intel's was effectively just a dual-socket config in one socket - big deal. AMD did it right. Of course the king of lazy engineering can spit out something that technically counts as a multi-core CPU, b/c it used 100% existing tech. It could have been done years prior, but market segmentation kept digging money out of dual-socket boards and their compatible CPUs.
Posted on Reply
#49
cdawall
where the hell are my stars
TheGuruStudAre you seriously trying to compare fsb to HTT for multicore CPUs? Bwhahahahaha. You must be joking. There's a reason it wasn't really used...the bus sucks. Intel's was effectively just dual socket config in one socket...big deal. AMD did it right. Of course the king of lazy engineering can spit out a technically counted multi-core CPU, b/c it used 100% existing tech. It could have been done years prior, but was market segementation to dig money for dual socket boards and their compatible CPUs.
Technically the original AMD cores, being on one die, didn't talk across the bus... That doesn't change the fact that Intel quads came out before AMD ones. You can say AMD did it "right" all you want (even Intel made statements saying they could not do that on the 45nm node), but that doesn't change performance. First-gen C2Q beat Phenom II, released what, two years after the fact?...
Posted on Reply
#50
HammerON
The Watchful Moderator
Enough off topic discussion. Please keep on topic or move your conversation to PM.
Posted on Reply