Tuesday, February 23rd 2010
AMD Starts Shipping 12-core and 8-core "Magny Cours" Opteron Processors
AMD has started shipping its 8-core and 12-core "Magny Cours" Opteron processors for sockets G34 (2P-4P+) and C32 (1P-2P). The processors mark the entry of several new technologies for AMD, such as a multi-chip module (MCM) approach to increasing the processor's resources without complicating chip design any further than improving on the Shanghai and Istanbul designs. The new Opteron chips also make use of third-generation HyperTransport interconnect technology for 6.4 GT/s link speeds between the processor and host, and between processors in multi-socket configurations, and they embrace registered DDR3 memory. Each processor addresses memory over up to four independent (unganged) memory channels. Technologies such as HT Assist improve inter-silicon bandwidth on the MCMs. The processors further benefit from 12 MB of L3 cache on board, and 512 KB of dedicated L2 cache per processor core.
In the company's blog, the Director of Product Marketing for Server/Workstation products, John Fruehe, writes: "Production began last month and our OEM partners have been receiving production parts this month." The new processors come in G34 (1974-pin) and C32 land-grid array packages. There are two product lines: the 1P/2P-capable (cheaper) Opteron 4000 series, and the 2P-to-4P-capable Opteron 6000 series. AMD has planned a total of 18 SKUs; some of these are listed below, with OEM prices in EUR:
Sources:
AMD Blogs, TechConnect Magazine
- Opteron 6128 (8 cores) | 1.5 GHz | 12 MB L3 cache | 115W TDP - 253.49 Euro
- Opteron 6134 (8 cores) | 1.7 GHz | 12 MB L3 cache | 115W TDP - 489 Euro
- Opteron 6136 (8 cores) | 2.4 GHz | 12 MB L3 cache | 115W TDP - 692 Euro
- Opteron 6168 (12 cores) | 1.9 GHz | 12 MB L3 cache | 115W TDP - 692 Euro
- Opteron 6172 (12 cores) | 2.1 GHz | 12 MB L3 cache | 115W TDP - 917 Euro
- Opteron 6174 (12 cores) | 2.2 GHz | 12 MB L3 cache | 115W TDP - 1,078 Euro
125 Comments on AMD Starts Shipping 12-core and 8-core "Magny Cours" Opteron Processors
Work smarter, not harder.
The best upgrade to a human would be a calculator. Humans are ridiculously bad at seeing things in 1s, 0s, and derivatives thereof. If you could add that capability to the brain, it would be far more efficient at processing numbers (rather, the concept of numbers). Likewise, if we could create a co-processor that works in terms of shapes, it would drastically increase the capability of computers. For instance, it could look at a web page and read everything on it so long as it can identify the character set. It could look at a picture and identify people and what those people are most likely doing. It could also name every object it knows that appears in the picture, like cars, bikes, signs, symbols, etc. In order to engineer said processor, we'd have to throw what we know about processing out the window and start from scratch with that goal in mind. As far as I know, that's not going to happen any time soon because they're all too busy milking the ARM, POWER, and x86 cash cows.
They would not be marketed by instructions per second like current CPUs; they would be marketed by shapes per second and detail per shape. And hey, because it works on shapes, it could actually create a seamless arc on an analog display (digital would pixelate it). ;)
This is the best we got for now.
After shapes, we'd need a speech processor (decodes sound waves and can produce its own including pitch, tone, and expressiveness). With some good programming, it could completely replace call centers and you'd never be able to tell you were actually talking to a computer.
Speech processing has already advanced far beyond what most people realize, because the text-to-speech and automated calling systems to which most people have been exposed are far from state of the art. That crap that comes bundled with operating systems, and even some of the more expensive speech processing software a regular consumer can buy, is not representative of the speech processing computers are already capable of. Speech processing in real time might be one of those things that is best done without too much parallel processing due to the latency introduced—but then again, it would be small-minded to assume that said latency will always be the issue it is today.
I especially liked the first bit.
It's true, the brain can be trained just as well as muscles can be trained
(albeit differently of course, heh)
_______________________________
But okay, my arguments aside. Let's say that the problems of multi-core latency, overhead, etc. are impossible to ever improve or overcome and there's no alternative to a "clog-prone" one-core-managing-many ("master thread") architecture. Let's assume that hardware-managed thread states on multi-core CPUs simply cannot work (you mentioned earlier that that would virtually eliminate software overhead, but you still argue against multi-core CPUs, so that's out). Basically, let's say multi-core is simply unacceptable tech and you get to determine the design of future CPUs, and they will all be single-core monsters that smoke their multi-core inferiors. How are you going to do it?
1. Will the performance come from streamlining processes via new instruction sets?
2. You stated earlier that "a 12 GHz CPU can handle more work than a 4 x 3 GHz CPU because of having no overhead." Will you succeed where AMD and Intel have failed and find ways to overcome the ILP, memory, and power walls that, in our current reality, make such high operating frequencies unfeasible?
3. If you were running AMD, starting five years ago, what path would you have set the company down, and what products would they now be releasing instead of these 8 and 12-core CPUs that you criticize?
I ask out of a genuine desire to learn, seriously. I'm completely up for better ways of doing things than the norm.
Other than that, we need to look at Bulldozer before deciding if they screwed up since Athlon 64 or not.
The objective is to accelerate both single- and multi-threaded performance with smarter engineering.
last I checked we have several members with quad-SLI, quadfire, i7 rigs with 12 GB of memory, etc., all of which is overkill for gaming or anything else a home user typically does.
but epeen plays a role in the purchase.
these chips are more than likely going to be used in enterprise and server environments, but a few home users will toss them in as well, because you gain x amount of epeen for every core your machine has.
on an unrelated topic, why oh why is it "magny cours"? that's far too close to "mangy cores" if you ask me.
That's true, though. Many of us do WCG or Folding@home (I've got ten video cards/fifteen GPUs across five computers running myself, and GT90 himself has a dual quad-core Xeon system).
GT90: you said yourself that you've written applications that can fully load 8+ cores to 100%. If some but not all software can utilize all these cores, then that tells me the problem lies with software. Anyway, judging multi-core CPU architecture based on how current software utilizes it would be a mistake.
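For what it's worth, the kind of application that fully loads every core is easy to sketch when the work is embarrassingly parallel. Here's a minimal, hypothetical Python example (the function name `busy_sum` and the workload size are my own, not from anything GT90 described) that hands one independent CPU-bound task to each core via a process pool:

```python
import multiprocessing as mp

def busy_sum(n):
    # Pure CPU-bound work with no shared state, so it scales
    # cleanly across cores: sum of squares 0^2 + 1^2 + ... + (n-1)^2.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    cores = mp.cpu_count()
    with mp.Pool(processes=cores) as pool:
        # One independent task per core: no locks, no inter-task
        # communication, which is why utilization can approach 100%
        # on every core for the duration of the work.
        results = pool.map(busy_sum, [200_000] * cores)
    print(len(results))  # one result per worker
```

The point of the sketch is that software like this parallelizes trivially; it's workloads with shared state and dependencies between tasks that make multi-core utilization hard, which is a software problem, not a hardware one.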
There's always lag between new technology and mainstream software support for it. The first mainstream dual-core CPUs came out in 2005, quads in late '06/early '07 (although there were earlier multi-core CPUs, they were not mainstream enough to get the attention of mainstream software developers). Most applications I know of can already utilize two cores, and many can already fully utilize quads—just three years after they first hit the mainstream consumer market.
There are probably still far more single- and dual-core CPUs than there are quad-core CPUs out there in consumer systems, and you're saying because of how well 4+ cores perform with today's software, it's no good? If the Wright brothers had that attitude about new technology, they would have given up before their first flight because obstacles like gravity were too hard to overcome. But they had faith they could tackle the obstacles, and thus they did. The technology wasn't bad—it just needed time and work to survive its own infancy.