Tuesday, February 23rd 2010

AMD Starts Shipping 12-core and 8-core "Magny Cours" Opteron Processors

AMD has started shipping its 8-core and 12-core "Magny Cours" Opteron processors for sockets G34 (2P-4P+) and C32 (1P-2P). The processors mark the entry of several new technologies for AMD, such as a multi-chip module (MCM) approach that increases the processor's resources without complicating chip design beyond refinements of the Shanghai and Istanbul designs. The new Opteron chips also make use of third-generation HyperTransport interconnect technology for 6.4 GT/s links between the processor and host, and between processors in multi-socket configurations, and they embrace Registered DDR3 memory. Each processor addresses memory over up to four independent (unganged) memory channels, while technologies such as HT Assist improve inter-silicon bandwidth on the MCMs. The processors further benefit from 12 MB of L3 cache on board and 512 KB of dedicated L2 cache per processor core.
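Because each 12-core package is an MCM of two six-core dies, the operating system sees every socket as two NUMA nodes, each with its own memory channels. Below is a minimal sketch of how software could enumerate that layout on Linux with libnuma; node counts and sizes depend on the actual machine, so this is an illustration rather than output from Magny Cours hardware:

```cpp
// Enumerate NUMA nodes as an OS would see a Magny Cours system:
// each 12-core package is an MCM of two 6-core dies, so one socket
// shows up as two memory nodes. Build with: g++ numa_list.cpp -lnuma
#include <numa.h>
#include <cstdio>

int main() {
    if (numa_available() < 0) {
        std::fprintf(stderr, "NUMA is not supported on this system\n");
        return 1;
    }
    int max_node = numa_max_node();  // highest node number present
    std::printf("NUMA nodes: %d\n", max_node + 1);
    for (int n = 0; n <= max_node; ++n) {
        long long free_bytes = 0;
        long long size = numa_node_size64(n, &free_bytes);  // bytes on node n
        std::printf("node %d: %lld MB total, %lld MB free\n",
                    n, size >> 20, free_bytes >> 20);
    }
    return 0;
}
```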

In the company's blog, the Director of Product Marketing for Server/Workstation products, John Fruehe, writes: "Production began last month and our OEM partners have been receiving production parts this month." The new processors come in G34 (1944-pin) and C32 (1207-pin) land grid array packages. There are two product lines: the cheaper, 1P/2P-capable Opteron 4000 series, and the 2P-4P-capable Opteron 6000 series. AMD has a total of 18 SKUs planned; some of these are listed below, with OEM prices in euros:
  • Opteron 6128 (8 cores) | 1.5 GHz | 12 MB L3 cache | 115 W TDP | €253.49
  • Opteron 6134 (8 cores) | 1.7 GHz | 12 MB L3 cache | 115 W TDP | €489
  • Opteron 6136 (8 cores) | 2.4 GHz | 12 MB L3 cache | 115 W TDP | €692
  • Opteron 6168 (12 cores) | 1.9 GHz | 12 MB L3 cache | 115 W TDP | €692
  • Opteron 6172 (12 cores) | 2.1 GHz | 12 MB L3 cache | 115 W TDP | €917
  • Opteron 6174 (12 cores) | 2.2 GHz | 12 MB L3 cache | 115 W TDP | €1,078
Sources: AMD Blogs, TechConnect Magazine

125 Comments on AMD Starts Shipping 12-core and 8-core "Magny Cours" Opteron Processors

#51
FordGT90Concept
"I go fast!1!11!1!"
TIGR: The human brain (the most powerful computer known) is massively parallel.
The human brain is designed to process senses; computers are designed to process binary. Ask a human brain to process binary and it fails. Ask a computer to process imagery and it fails.
TIGR: Improving per-core performance is still extremely important, and I don't think AMD or Intel are abandoning that in favor of just increasing core count. Look at the per-core difference between the C2D and Core lines of CPUs.
Core performs miserably without Hyper-Threading enabled. The major improvements in Core are new instruction sets that streamline complex processes.
TIGR: Anyway, mainstream multi-core computing is still in its infancy. The main issue seems to be software algorithms and implementation, not some flaw with the concept of multiple CPU cores itself. There will be challenges in the future, such as the jump from multi-core to many-core CPUs, but I see no signs that multi-core computing is a dead end.
The hardware causes the software flaws, but maybe that's just it. Instead of having multiple asynchronous cores, why not make the cores themselves synchronous, with actual thread states handled at the hardware level instead of the software level? That would virtually eliminate software overhead.

I see the signs, although they generally aren't anything to be worried about right now on a quad-core; however, the more cores there are, the bigger the problem becomes. I don't want to imagine how much trouble it will be to multithread the code that handles multiple cores. The potential for errors, collisions, and other problems increases exponentially.
#52
ShiBDiB
Fourstaff: Desktop versions in AM3 or a new socket? I still can't see the (above-)average Joe using more than 4 cores, let alone 6.
ditto

These are useless for everyday users. Most games aren't even coded to use 2 cores, let alone 12.
#53
Hayder_Master
Prices are awesome; top performance-per-dollar CPUs.
#54
Phxprovost
Xtreme Refugee
ShiBDiB: ditto

These are useless for everyday users. Most games aren't even coded to use 2 cores, let alone 12.
And at the rate PC devs are jumping ship, there will never be a time when games use them all.
#55
ShiBDiB
Phxprovost: And at the rate PC devs are jumping ship, there will never be a time when games use them all.
?

Dual cores have been out for how long now, and we still don't see universal acceptance of them by devs... I <3 people who spew BS.
#56
Phxprovost
Xtreme Refugee
ShiBDiB: ?

Dual cores have been out for how long now, and we still don't see universal acceptance of them by devs...
:wtf: And my point is that pretty much all devs are abandoning PC game releases, or releasing crap ports that are hardly optimized...... I <3 people who can't read.
#57
FordGT90Concept
"I go fast!1!11!1!"
Consoles are going the same way as PCs, though. The Xbox 360 has a tri-core CPU with SMT (6 threads at a time), and the PS3 has a single dual-threaded core with up to 8 sub-processors (SPEs). The only exception is the Wii, which still has a single-core CPU (as far as anyone can tell).
#58
driver66
ShiBDiB: ?

Dual cores have been out for how long now, and we still don't see universal acceptance of them by devs... I <3 people who spew BS.
Lay off the booze, bro .........:toast: He was agreeing with you :slap:
#59
Melvis
troyrae360: I was under the understanding that the 12-core was two 6-core CPUs "sticky taped" together. I could be wrong though.
Yeah, I knew about the 12-core being two 6-cores sticky-taped together (AMD's lingo), but I had no idea about the 8-core, as I hadn't heard anything about the 8-core CPUs till now :ohwell:
#60
Mussels
Freshwater Moderator
Well, my next system might just be AMD... it would let me re-use my 4870s in CrossFire at least (my CrossFire problems stem from the Intel chipset).

One thing all you naysayers are forgetting is that DX11 comes with multithreading as part of its basic design... next-gen games are going to use our spare threads quite well :)
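For reference, the DX11 multithreading being referred to is built around deferred contexts: worker threads record draw calls into command lists that the main thread replays later. A minimal sketch, assuming a device and immediate context were already created with D3D11CreateDevice and omitting all error handling:

```cpp
// Sketch of Direct3D 11 multithreaded rendering via a deferred context.
// Assumes 'device' and 'immediateCtx' already exist; error checks omitted.
#include <d3d11.h>

void RecordAndReplay(ID3D11Device* device, ID3D11DeviceContext* immediateCtx)
{
    // A worker thread records commands into its own deferred context...
    ID3D11DeviceContext* deferredCtx = nullptr;
    device->CreateDeferredContext(0, &deferredCtx);

    // ... issue Draw/SetState calls on deferredCtx here ...

    ID3D11CommandList* cmdList = nullptr;
    deferredCtx->FinishCommandList(FALSE, &cmdList);  // close the recording

    // The main thread replays the recorded commands on the immediate context.
    immediateCtx->ExecuteCommandList(cmdList, TRUE);

    cmdList->Release();
    deferredCtx->Release();
}
```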
#61
Wile E
Power User
Meh. Useless for the desktop market. And I doubt we'll see this in a desktop variant. Just look at the package size.

And what Intel CrossFire issues? Your CrossFire issues don't stem from the Intel chipset; they stem from the shitty ATI drivers.
#62
Mussels
Freshwater Moderator
Wile E: Meh. Useless for the desktop market. And I doubt we'll see this in a desktop variant. Just look at the package size.

And what Intel CrossFire issues? Your CrossFire issues don't stem from the Intel chipset; they stem from the shitty ATI drivers.
No, they stem from a problem where my cards flicker with VSync off on Intel chipsets. They don't do it on AMD boards.
#63
Wile E
Power User
Mussels: No, they stem from a problem where my cards flicker with VSync off on Intel chipsets. They don't do it on AMD boards.
And if they coded proper drivers, it wouldn't be an issue.
#64
Mussels
Freshwater Moderator
Wile E: And if they coded proper drivers, it wouldn't be an issue.
It's a chipset issue. It works on X58 boards, just not on 965 through X48/45, and it only seems to happen on 38x0 and 48x0 cards too.
#65
FordGT90Concept
"I go fast!1!11!1!"
Mussels: One thing all you naysayers are forgetting is that DX11 comes with multithreading as part of its basic design... next-gen games are going to use our spare threads quite well :)
But that doesn't alleviate the problem of the master thread (which orchestrates the worker threads) bringing everything else to a crawl; moreover, Windows 7 does a really, really bad job at synchronizing threads. For example, you can't play most games with WCG running because performance will drop like a rock despite 4 cores being completely idle. One core gets held back just a tiny bit, then the other cores end up waiting for it. We also can't forget that Windows 7 itself suffers from the same thread-prioritizing problems when dragging and dropping files while the CPU is 100% loaded with idle-priority work.

It's difficult to explain, but multi-core doesn't have a very bright future. Everything about it multiplies complexity, from operating systems to software. Until that is fixed at the hardware level, no one is going to be excited about more cores except Intel/AMD (because it's cheap and easy) and consumers (because it's the new fad for incorrectly gauging performance, like clock speeds were up to the Pentium 4/D).

Call me a pessimist, but this trend is more harmful than helpful to developers and, by extension, consumers.
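The "one core held back" effect described above is easy to demonstrate with a fork-join sketch: the join runs at the pace of the slowest worker, so a single delayed core gates all of them. The worker count and delays below are illustrative assumptions, not measurements:

```cpp
// Fork-join sketch: the master thread can only continue once every
// worker has finished, so a single slow (straggler) core gates them all.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const int workers = 4;
    std::vector<std::thread> pool;
    auto start = std::chrono::steady_clock::now();

    for (int i = 0; i < workers; ++i) {
        pool.emplace_back([i] {
            // Worker 0 is artificially delayed, e.g. by a background task
            // (like WCG) competing for its core.
            auto work = std::chrono::milliseconds(i == 0 ? 50 : 10);
            std::this_thread::sleep_for(work);
        });
    }
    for (auto& t : pool) t.join();  // the join waits for the straggler

    auto elapsed = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - start);
    // Total time tracks the slowest worker (~50 ms), not the average.
    std::printf("frame took %lld ms\n",
                static_cast<long long>(elapsed.count()));
    return 0;
}
```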
#66
Mussels
Freshwater Moderator
FordGT90Concept: But that doesn't alleviate the problem of the master thread bringing everything else to a crawl; moreover, Windows 7 does a really, really bad job at synchronizing threads. For example, you can't play most games with WCG running because performance will drop like a rock despite 4 cores being completely idle. One core gets held back just a tiny bit, then the other cores end up waiting for it. We also can't forget that Windows 7 itself suffers from the same thread-prioritizing problems when dragging and dropping files while the CPU is 100% loaded with idle-priority work.

It's difficult to explain, but multi-core doesn't have a very bright future. Everything about it multiplies complexity, from operating systems to software. Until that is fixed at the hardware level, no one is going to be excited about more cores except Intel/AMD (because it's cheap and easy) and consumers (because it's the new fad for incorrectly gauging performance, like clock speeds were up to the Pentium 4/D).

Call me a pessimist, but this trend is more harmful than helpful to developers and, by extension, consumers.
It may not solve it, but it'll help - and in every (DX11) game, too.
#67
Wile E
Power User
Mussels: It's a chipset issue. It works on X58 boards, just not on 965 through X48/45, and it only seems to happen on 38x0 and 48x0 cards too.
If it only happens with 38x0 and 48x0 cards and only on certain chipsets, it's a driver problem or a hardware fault by ATI. Either way, it's ATI's fault.
#68
FordGT90Concept
"I go fast!1!11!1!"
Mussels: It may not solve it, but it'll help - and in every (DX11) game, too.
It makes things easier for the developer by letting the GPU render stream be built somewhat asynchronously. The CPU load is the same.
#69
TIGR
Parallel computing is without a doubt the future; it just needs to mature, like every other technology in the history of humankind.
#70
pantherx12
FordGT90Concept: This is getting ridiculous. Most applications aren't very good candidates for multithreading, so more per-core performance is still ideal. Someone has to change this trend from gluing more cores on to improving per-core performance. Multiple cores create needless overhead, and before long, applications will be slower tomorrow than they are today because overhead exceeds actual work done.
You're thinking about this the wrong way, fella.

Firstly, these are for servers at the moment; as people were saying, what used to take 12 CPUs (4 cores each) can be done with 4 CPUs.

That's space-saving! (As well as cheaper, eventually.)

Also, it means servers can process more incoming requests, so online games could hold many more avatars in one area, etc.

It also means that if someone made a modified L4D server, they could have 1,000 or more zombies come at you at once, rather than the typical 50 or so :P

On top of that, imagine running several OSes at once simultaneously: got a program that won't run on Windows? No problem, just switch to Linux instantly.

You need to think outside your current thinking and see the potential.

Oh, also, your statement about computers not being able to recognise imagery is quickly becoming less and less true; hell, Honda's little robot can recognise chairs and cars etc., and even the model of a car if it's been taught it.

With more powerful CPUs with more cores, it will be able to function even better.

It could use bunches of 10 cores to control individual body parts as well, giving it much greater dexterity.
#71
TIGR
pantherx12: Oh, also, your statement about computers not being able to recognise imagery is quickly becoming less and less true; hell, Honda's little robot can recognise chairs and cars etc., and even the model of a car if it's been taught it.
I'm too lazy/tired to respond to him on a point-by-point basis at the moment, but this is an important consideration. Research into the human body shows we are more like computers than ever thought before (DNA as a digital code, for example), and R&D into the most powerful and promising future computer systems is being done by reverse-engineering the way the human brain works. Things our brains can do well are what we increasingly want our computers to do, so it makes sense: things like pattern recognition (identifying distinct objects in two- or three-dimensional video/simulations), learning (evolutionary programming), etc.

Seeing how effective massively parallel computing makes the human brain at such tasks is teaching researchers that if we want our computers to perform increasingly "intelligent" and profound operations, we're going to have to step out of the box to take computing to the next level. We have to think beyond traditional methods, because they can only take us so far. At this point, the "next level" is massively parallel hardware. The ability of software to utilize it well will come as the technology matures.
#72
btarunr
Editor & Senior Moderator
pantherx12: You're thinking about this the wrong way, fella.

Firstly, these are for servers at the moment; as people were saying, what used to take 12 CPUs (4 cores each) can be done with 4 CPUs.

That's space-saving! (As well as cheaper, eventually.)

Also, it means servers can process more incoming requests, so online games could hold many more avatars in one area, etc.

It also means that if someone made a modified L4D server, they could have 1,000 or more zombies come at you at once, rather than the typical 50 or so :P

On top of that, imagine running several OSes at once simultaneously: got a program that won't run on Windows? No problem, just switch to Linux instantly.

You need to think outside your current thinking and see the potential.

Oh, also, your statement about computers not being able to recognise imagery is quickly becoming less and less true; hell, Honda's little robot can recognise chairs and cars etc., and even the model of a car if it's been taught it.

With more powerful CPUs with more cores, it will be able to function even better.

It could use bunches of 10 cores to control individual body parts as well, giving it much greater dexterity.
To put that in one word: virtualization.

In one line: virtual servers in data centers, where one physical server with one or two physical CPUs can be used to rent out 12 web servers, each suiting the customer's needs.
#73
FordGT90Concept
"I go fast!1!11!1!"
TIGR: Parallel computing is without a doubt the future; it just needs to mature, like every other technology in the history of humankind.
Parallel computing is the past. Supercomputers have been doing it for decades. It comes to your home and everyone is in awe. The problem is: what use is a screwdriver without screws? Hence, the fad.
pantherx12: Firstly, these are for servers at the moment; as people were saying, what used to take 12 CPUs (4 cores each) can be done with 4 CPUs.
I know that, and they suit server tasks well. The problem is that these processors have no use in workstations, because most workstation applications aren't highly scalable like server applications. That's not very likely to change either, so Intel/AMD are trying to convince corporations to virtualize and cloud-compute. Well, cloud computing especially doesn't work in homes, because very few homes have a server, and gaming through virtualization is nothing more than a pipe dream today.

Intel/AMD are trying to cater to one crowd (enterprise) while consumers get shafted, because workstation/home computers are well-rounded machines, not task-oriented ones.
pantherx12: It also means that if someone made a modified L4D server, they could have 1,000 or more zombies come at you at once, rather than the typical 50 or so :P
Your GPU will be crying for mercy long before your CPU. And still, there is little 100 slow cores can do that one fast core couldn't. Personally, I think mainstream processors should have no more than four cores. The focus needs to be on core performance. If, as I stated earlier, that takes a synchronous core design, so be it. The point is: most users with quad-cores rarely see their CPU usage go over 50%, if not 25%, doing anything they do on a day-to-day basis.
pantherx12: On top of that, imagine running several OSes at once simultaneously: got a program that won't run on Windows? No problem, just switch to Linux instantly.
Unless you are talking about virtualization, that doesn't work: resource collisions.
pantherx12: You need to think outside your current thinking and see the potential.
I'm looking 10-50 years out here. The prognosis starts getting grim in about 6 years, when die shrinks are no longer possible. From there, it's nothing but question marks. Nothing revolutionary has happened in computing since the 1970s. We're still using the same old techniques with different materials.
pantherx12: Oh, also, your statement about computers not being able to recognise imagery is quickly becoming less and less true; hell, Honda's little robot can recognise chairs and cars etc., and even the model of a car if it's been taught it.
Which demonstrates that the brain is falling behind. We can build processors faster, not brains (at least not yet). It is still inefficient, because images don't translate well to binary, but that's the nature of the beast.
pantherx12: With more powerful CPUs with more cores, it will be able to function even better.
Only if the process is not linear. If step b requires the result from a, step c requires the result from b, step d requires the result from c, and so on, it is doomed to be slow for the foreseeable future. That is what most concerns me (aside from the manufacturing process).
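That serial-dependency argument is essentially Amdahl's law: if a fraction of the work must run in order, adding cores quickly stops helping. A quick sketch with an assumed 50% parallel fraction (an illustrative figure, not a measurement):

```cpp
// Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
// fraction of the work that can run in parallel across n cores.
#include <cstdio>

double amdahl_speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    const double p = 0.5;  // assume only half the work parallelizes
    for (int n : {1, 2, 4, 8, 12}) {
        // With p = 0.5, even 12 cores give less than a 2x speedup.
        std::printf("%2d cores -> %.2fx\n", n, amdahl_speedup(p, n));
    }
    return 0;
}
```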
pantherx12: It could use bunches of 10 cores to control individual body parts as well, giving it much greater dexterity.
A 486 could handle that with lots of room to spare. Computer-controlled robots have been in use for a long time.
btarunr: To put that in one word: virtualization.

In one line: virtual servers in data centers, where one physical server with one or two physical CPUs can be used to rent out 12 web servers, each suiting the customer's needs.
Oh, so you want some nameless corporation 1,000 miles from where you live to know everything you did and are doing? That's the Google wet dream right there. They would know everything about you, from how much is in your checking account, to which sites you frequent, to all your passwords and usernames, to your application usage: everything that would exist in your "personal computer." Cloud computing/virtualization is the epitome of data mining. Google already knows every single search you made in the past six months with their search engine.

Corporations want this. It is vital that we not give it to them. Knowledge is power.
#74
TIGR
In terms of parallel computing, you haven't seen anything yet.

If you want to fight the concept, go build a better system that doesn't utilize it. Otherwise, take a look around: multi-core CPUs, multi-CPU systems, CrossFire and SLI, RAID... running components in parallel isn't perfectly efficient, but guess what: neither is anything else.

Sure, maybe there's overhead, and maybe more than we'd like (although that's improving), but as a car guy [I gather] you should understand very well that sometimes you have to take a loss to make bigger gains (unless you don't believe in forced induction either?).
#75
pantherx12
Have you not seen how slow the bastarding thing is?

That's not due to the motors it has; it doesn't have the processing power to run them!

Unlike humans, who build up muscle memory and automatic responses, a machine has to think about moving, so once its physical speed starts building up, it becomes more and more difficult.

Whereas with a CPU core for each sensor it has, it will be able to adjust things that much quicker.

(And thus move quickly.)

Same reason the thing falls arse over tit when climbing staircases sometimes :p