
AMD Opteron "Piledriver" Processors Arrive Mid-November

btarunr

Editor & Senior Moderator
While AMD FX "Piledriver" client processors in the AM3+ package are just around the corner, slated for a little later this month, the company's first enterprise processors for servers and workstations, based on the new micro-architecture, are slated for mid-November, according to a report. AMD could begin with an overhaul of its multi-socket enabled Opteron 6200 series and single/dual-socket enabled Opteron 4200 series with the new Opteron 6300/4300 series, featuring the 8-core "Piledriver" silicon. The multi-socket enabled Opteron 6300 series will consist of nine models, tabled below.



 
Piledriver cores...
 
Remember Remember the Middle of November... couldn't resist

anyhow I think it's kinda odd the amount of L3 cache doesn't scale with the number of cores.
 
> anyhow I think it's kinda odd the amount of L3 cache doesn't scale with the number of cores.

Because Opteron 6300 is an MCM (like Core 2 Quad). It's one package with two 8-core "Piledriver" dies.
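A rough sketch of why the cache doesn't track core count: on an MCM, the L3 is a per-die resource, so it scales with the number of dies, not the number of enabled cores. The 8 MB-per-die figure below is an assumption for illustration, not a figure from the article:

```python
# Hedged sketch: on an MCM package, L3 belongs to the die, so it stays
# fixed while core counts vary across SKUs. The 8 MB/die figure is an
# assumption for illustration, not a spec from the article.
L3_PER_DIE_MB = 8

def package_l3(dies):
    """Total L3 in an MCM package: one fixed slice per die."""
    return dies * L3_PER_DIE_MB

# Hypothetical SKUs: (enabled cores, dies in the package)
for cores, dies in [(4, 1), (8, 1), (12, 2), (16, 2)]:
    print(f"{cores:2d} cores, {dies} die(s): "
          f"{package_l3(dies)} MB L3 -> {package_l3(dies) / cores:.1f} MB/core")
```

The L3-per-core ratio jumps around between SKUs even though the per-die cache never changes, which is the effect the post above is describing.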
 
that explains it, thanks
 
16 AMD cores - nice - yipiii... maybe can compete with 4 intel cores, finaly.
 
Trolls need banned. Either they're just bags of c***s or are too stupid to realize how powerful these chips are.

Intel cannot compete against these chips in server. They're much cheaper and as fast or faster for the same solution. The only exception is if you're using some draconian software licensing per core.
 
> Trolls need banned. Either they're just bags of c***s or are too stupid to realize how powerful these chips are.
>
> Intel cannot compete against these chips in server. They're much cheaper and as fast or faster for the same solution. The only exception is if you're using some draconian software licensing per core.

The 10-core and 8-core Xeons that have been out for some time will most likely blow these Opterons out of the water in most applications, the only saving grace being that Opterons are way cheaper.
 
what could i do with a 16 core processor?
 
> The 10-core and 8-core Xeons that have been out for some time will most likely blow these Opterons out of the water in most applications, the only saving grace being that Opterons are way cheaper.

10 core, no, but I can't imagine those being a big hit haha. Cost is becoming a major issue (power and cooling exacerbate this). If you have one server that's HALF the price, uses less power and perhaps is slightly slower depending on workload (I've seen many benchmarks showing the opposite), it's still a no brainer.

But, nothing will change, just as we saw with the XP and 64. Marketing, brainwashing, and the money to coerce will win out.
 
Remember Remember the Middle of November...

of the piledriver and opteron lot!

i know of no reason why intel's treason
should make AMDs performance drop ;p
 
> The 10-core and 8-core Xeons that have been out for some time will most likely blow these Opterons out of the water in most applications, the only saving grace being that Opterons are way cheaper.

10 core, no, but I can't imagine those being a big hit haha (and could always just go for another socket, opterons like to scale). Cost is becoming a major issue (power and cooling exacerbate this). If you have one server that's HALF the price, uses less power and perhaps is slightly slower depending on workload, it's still a no brainer. Spending the same amount of money, it wouldn't even be a contest.

But, nothing will change, just as we saw with the XP and 64. Marketing, brainwashing, and the money to coerce will win out.
 
> 10 core, no, but I can't imagine those being a big hit haha (and could always just go for another socket, opterons like to scale). Cost is becoming a major issue (power and cooling exacerbate this). If you have one server that's HALF the price, uses less power and perhaps is slightly slower depending on workload, it's still a no brainer. Spending the same amount of money, it wouldn't even be a contest.
>
> But, nothing will change, just as we saw with the XP and 64. Marketing, brainwashing, and the money to coerce will win out.

1 2 4 8 16 32 64. Bits even out properly with those cores

> 16 AMD cores - nice - yipiii... maybe can compete with 4 intel cores, finaly.

Careful, your video card is made by them.
 
> 10 core, no, but I can't imagine those being a big hit haha (and could always just go for another socket, opterons like to scale). Cost is becoming a major issue (power and cooling exacerbate this). If you have one server that's HALF the price, uses less power and perhaps is slightly slower depending on workload, it's still a no brainer. Spending the same amount of money, it wouldn't even be a contest.
>
> But, nothing will change, just as we saw with the XP and 64. Marketing, brainwashing, and the money to coerce will win out.

http://www.intel.com/content/www/us/en/processor-comparison/processor-specifications.html?proc=53580

You mention power consumption, and you're right that it's a big deal. But when you can buy one 130W Xeon to do the work of two 140W Opterons, the power argument just falls in Intel's favour.

All AMD have going for them in this space is that the initial purchase cost of Opterons is far lower.
 
> http://www.intel.com/content/www/us/en/processor-comparison/processor-specifications.html?proc=53580
>
> You mention power consumption, and you're right that it's a big deal. But when you can buy one 130W Xeon to do the work of two 140W Opterons, the power argument just falls in Intel's favour.
>
> All AMD have going for them in this space is that the initial purchase cost of Opterons is far lower.

Which in fact works for the majority of companies; they are looking for bang for buck, honestly.

I mean, isn't that what you bought your machine for initially too?
 
> Which in fact works for the majority of companies; they are looking for bang for buck, honestly.
>
> I mean, isn't that what you bought your machine for initially too?

I don't run it 24/7 at full speed, and I don't have to hire and air condition a massive space to keep it in. I know people who do server procurement, and they aren't interested in "bang for buck". You and I have a set budget, and we get the most performance we can for that. People who buy servers have a set performance level that they need, and they get the most cost efficient solution for that performance level, where cost efficiency includes space efficiency and energy efficiency. Adding another socket will cost £1000s extra over the lifetime of the server, on top of the original purchase price.
 
> I don't run it 24/7 at full speed, and I don't have to hire and air condition a massive space to keep it in. I know people who do server procurement, and they aren't interested in "bang for buck". You and I have a set budget, and we get the most performance we can for that. People who buy servers have a set performance level that they need, and they get the most cost efficient solution for that performance level, where cost efficiency includes space efficiency and energy efficiency. Adding another socket will cost £1000s extra over the lifetime of the server, on top of the original purchase price.

= bang for buck

btw TDP is measured differently by the two companies, so it's still a moot point; there is no standard way of measuring it

/thread
 
> = bang for buck
>
> btw TDP is measured differently by the two companies, so it's still a moot point; there is no standard way of measuring it
>
> /thread

Sure, the TDP ratings look similar on paper, but they aren't directly comparable. So go read Anand's reviews, where he (shock horror) finds that Xeons are far more power efficient.

Also,
max(bang/buck)
does not equal
min(buck) for which (bang=x)

And in addition, when we talk about bang/buck in consumer terms, we normally totally ignore "buck" contributors other than initial purchase price.

The sad thing about this is that I actually partially agree with you. Opterons are the superior value proposition for most users, provided that no floating-point capability is required. But your irrational argumentation and unwillingness to acknowledge rational arguments that oppose you totally discredit you.
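The max(bang/buck) vs. min(buck) distinction above can be put into numbers. A hypothetical sketch with made-up prices and performance scores:

```python
# Two made-up server options (all numbers are illustrative only):
# (name, performance score, total cost of ownership)
options = [
    ("Opteron box", 100, 5000),   # cheaper, a bit slower
    ("Xeon box",    120, 9000),   # faster, much pricier
]

# Consumer logic: maximize bang per buck over the whole field.
best_value = max(options, key=lambda o: o[1] / o[2])

# Procurement logic: hit a required performance level, then minimize cost.
required = 110
meeting_req = [o for o in options if o[1] >= required]
cheapest_adequate = min(meeting_req, key=lambda o: o[2])

print(best_value[0])          # best ratio of performance to cost
print(cheapest_adequate[0])   # cheapest option that meets the target
```

With these made-up figures the two rules pick different boxes: the Opteron wins on raw bang/buck, but once a performance floor of 110 is imposed, only the Xeon qualifies, which is exactly the point being argued.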
 
> 16 AMD cores - nice - yipiii... maybe can compete with 4 intel cores, finaly.

4 AMD cores can easily beat 2 Intel cores in multithreaded workloads, so 16 cores would be able to crush 4 Intel cores without even breaking a sweat. Of course, in single-threaded environments Intel holds the lead, but servers are all about multithreading.

> Intel cannot compete against these chips in server. They're much cheaper and as fast or faster for the same solution. The only exception is if you're using some draconian software licensing per core.

Intel's claim to fame is its much lower TDP. Given that over a lifetime of a server the bulk of the cost goes to power consumption I wouldn't be surprised if these chips actually work out to be roughly equal, with Intel having a higher front-end cost, and AMD's spread out a bit more (in terms of power bills).
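A back-of-the-envelope sketch of that lifetime-cost argument, with every figure assumed for illustration (3-year 24/7 duty, $0.12/kWh, made-up prices, and TDP as a crude stand-in for average draw, which, as noted elsewhere in the thread, it is not):

```python
# Hedged sketch: compare purchase price plus electricity over a service life.
# TDP is used as a crude proxy for sustained draw; real consumption differs,
# and the two vendors rate TDP differently, as the thread points out.
HOURS = 3 * 365 * 24          # three years, running 24/7
RATE = 0.12                   # $/kWh, assumed

def lifetime_cost(price, tdp_watts):
    """Purchase price plus energy cost over the assumed service life."""
    energy_kwh = tdp_watts * HOURS / 1000
    return price + energy_kwh * RATE

# Made-up prices: pricier lower-TDP chip vs cheaper higher-TDP chip.
print(f"Xeon:    ${lifetime_cost(price=2000, tdp_watts=130):.0f}")
print(f"Opteron: ${lifetime_cost(price=1100, tdp_watts=140):.0f}")
```

With these particular assumptions the purchase-price gap still outweighs three years of the 10W power difference; whether the two "work out roughly equal" depends entirely on the real prices, duty cycle, and electricity rate plugged in.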
 
Actually, the higher clocks on the Xeons (plus better IPC) give them a slight advantage in most situations. A small boost, but a noticeable one nonetheless. However, for how powerful the Opterons are, their price tag is much nicer than Intel's without much loss in performance. AMD does multi-threading and multi-"core" well. (When I say multi-core, I really mean CPUs in general that support some form of symmetric multi-processing.) AMD offers much better bang for your buck and excels in a number of tasks; even where Intel does come out ahead in these tests, you're paying almost twice as much for an entry-level 8-core Intel CPU. So consider for a moment that you can pay for two 16-core server CPUs for the price of one Xeon.

So lets consider the things that each does best.

Xeon:
Good IPC
High Clocks
8 real cores, plus 8 non-linear scaling threads (consider that HT performance is not consistent and can suffer when doing similar tasks at once).

Opteron:
Lower price
16 "almost real cores", (Integer ops here are the best, floating point ops aren't as powerful but you know how much every module will give you as far as performance is concerned.)

So with all of this said, yeah, Intel might be a little faster per thread because of the improved IPC, but AMD is getting really good at cramming a lot of cores onto a CPU. Hyper-Threading is great, but if both running threads need the same parts of the CPU, your performance could suffer on the HT threads. At least with AMD, all 16 of those threads have dedicated resources, with the exception of the floating-point unit, and even that can be used efficiently with the proper instructions.

So all in all, as a system admin, I would rather have an Opteron for a database or a computing cluster, whereas for an application server a Xeon might be better. For storage, I would opt for whichever used the least power in combination with the price tag.

So how about everyone who is uninformed just drops this bulls**t of "AMD Opteron is better!" and "Intel Xeon is better!". The Xeons and Opterons are both very good processors, and each does certain things better than the other.

Might want to ask BuckNasty about his one (about to be two) 4P Opteron servers using the older Magny-Cours 12-core procs; that's 48 cores. I'm willing to bet he paid no more for that entire machine than for one mid-range 8-core Xeon. The Xeon might be faster, but the big question is whether it would be worth it and cost effective.

> Actually, one Intel HT core > one AMD module in most multithreaded workloads.
>
> http://www.anandtech.com/bench/Product/287?vs=434

Most of that performance isn't from the Hyper-Threading, and it's also highly task dependent. Consider for a moment: if your "core" thread is using the ALU to add and your HT thread wants to add too, the HT thread has to wait until the core thread is done adding. Then the HT thread goes to add, and if the core thread needs to add again, it has to wait for the HT thread. (This is more true of instructions that take more than a couple of cycles, so ADD isn't a great example, because I'm pretty sure both Intel and AMD ALUs do integer ADDs in one cycle.) The general point is that AMD CPUs scale almost linearly as more workload is applied to the server, whereas Intel servers with HT enabled do not.
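The scaling argument above can be sketched as a toy throughput model. The 30% Hyper-Threading uplift per shared core and the 0.8x per-core factor for the narrower AMD integer cores are assumed numbers for illustration, not measurements:

```python
# Hedged toy model of throughput scaling under load.
# All coefficients are assumptions, not benchmark data.

def ht_throughput(threads, physical_cores=8, ht_gain=0.3):
    """8 wide cores with HT: the first thread on a core counts fully,
    while a second thread sharing its execution units adds only a
    fractional gain instead of a full core's worth."""
    full = min(threads, physical_cores)
    shared = max(0, min(threads - physical_cores, physical_cores))
    return full + shared * ht_gain

def dedicated_throughput(threads, cores=16, per_core=0.8):
    """16 narrower integer cores, each with its own ALUs, scaling
    (nearly) linearly up to the core count."""
    return min(threads, cores) * per_core

for t in (8, 12, 16):
    print(t, ht_throughput(t), dedicated_throughput(t))
```

Under these assumed coefficients the HT design leads at low thread counts, but the flat, near-linear curve of the many-core design pulls ahead once all 16 threads are loaded, which is the behavior the post above describes.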
 
> http://www.intel.com/content/www/us/en/processor-comparison/processor-specifications.html?proc=53580
>
> You mention power consumption, and you're right that it's a big deal. But when you can buy one 130W Xeon to do the work of two 140W Opterons, the power argument just falls in Intel's favour.
>
> All AMD have going for them in this space is that the initial purchase cost of Opterons is far lower.

Do those 130W Xeons have 16 cores? No? Then they can't do the same work.
Example: for virtualization you need as many physical cores as you can get (in this case 4x16). Hyper-Threading won't help you here, and I remember reading a document (provided by HP for their servers) saying that you should actually disable HT.

Simply saying Intel does everything better is not true.
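The "physical cores for virtualization" point can be sketched as a capacity calculation. A hypothetical sizing sketch, where the 4:1 vCPU overcommit ratio is an assumed planning figure, not anything from the thread:

```python
# Hedged sketch: sizing a virtualization host by physical cores only,
# since (per the post above) HT threads aren't counted for this purpose.
# The 4:1 vCPU-to-core overcommit ratio is an assumed planning figure.

def max_vcpus(sockets, cores_per_socket, overcommit=4):
    """Rough vCPU capacity: physical cores times an overcommit ratio."""
    return sockets * cores_per_socket * overcommit

print(max_vcpus(4, 16))   # 4P box with 16-core Opterons
print(max_vcpus(2, 8))    # 2P box with 8-core Xeons, HT not counted
```

With physical cores as the only input, the 4P 16-core box supports four times the vCPUs of the 2P 8-core one, which is why core count dominates this particular workload.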

> Which in fact works for the majority of companies; they are looking for bang for buck, honestly.
>
> I mean, isn't that what you bought your machine for initially too?

Depends... sometimes you only look at the performance demands that need to be met.

Another thing to remember is that the initial cost of hardware is lower than that of software (licensing etc., sometimes a lot lower). You also need to plan for power consumption over a longer period of time (my ML350 G4 worked 24/7 from the day it was turned on) and for cooling costs.
 