# AMD Readies 16-core Processors with Full Uncore



## btarunr (Jan 17, 2014)

AMD released developer documentation for a new processor it's working on, and the way it's worded describes a chip with 8 modules, working out to 16 cores, on a single piece of silicon, referred to as Family 15h Models 30h - 3fh. This is not to be confused with the company's Opteron 6300-series "Abu Dhabi" chips, which are multi-chip modules of two 8-core dies, in the G34 package. 

What's more, unlike the current "Abu Dhabi" and "Seoul" chips, the new silicon features a full-fledged uncore, complete with a PCI-Express gen 3.0 root complex integrated into the processor die. In further proof that this is a single die with 8 modules, and not an MCM of two dies with 4 modules each, the document describes the die as featuring four HyperTransport links, letting it link up with other processors in multi-socket configurations of up to 4P. Such systems would feature a total core count of 64. There's no clarity on which exact micro-architecture the CPU modules are based on. Without doubt, AMD is designing this chip for its Opteron enterprise product stack, but it should also give us a glimmer of hope that AMD could continue serving up high-performance client CPUs, just ones that won't be based on socket AM3+.





*View at TechPowerUp Main Site*


----------



## RCoon (Jan 17, 2014)

I sincerely doubt this will ever come to any other market besides Opteron. If it does, enjoy the 50000W TDP.


----------



## FreedomEclipse (Jan 17, 2014)

16 _'real'_ cores, amirite?


----------



## Fiery (Jan 17, 2014)

"Family 15h Models 30h - 3fh"

"There's no clarity on which exact micro-architecture the CPU modules are based on"

Family 15h Models 30h-3Fh means the Steamroller uarch.


----------



## buildzoid (Jan 17, 2014)

If there were an Opteron-compatible motherboard laid out for desktops (not a green PCB, with a high-power VRM, a BIOS with OC features, fewer LAN connectors, and more USB ports), I'd buy this or a 12-core version for my next build.


----------



## Yorgos (Jan 17, 2014)

RCoon said:


> I sincerely doubt this will ever come to any other market besides Opteron. If it does, enjoy the 50000W TDP.


Call it a 100 W SDP and sell it to customers, then.

Edit: all jokes aside, for those of us in deeper waters, there is this thing called Linux that provides info about CPUs and their µarch earlier than anyone else:
https://lkml.org/lkml/2013/6/5/989 


----------



## west7 (Jan 17, 2014)

It would be great to see an 8-10 core desktop version of this.


----------



## ZetZet (Jan 17, 2014)

west7 said:


> it will be great to see 8-10 cores desktop version of this


Not happening. Just get the APU.


----------



## Baum (Jan 17, 2014)

Well, why not start working on better power efficiency for the high-performance segment, just so they don't get phased out on CPU power a few years down the road...


----------



## Mathragh (Jan 17, 2014)

Fiery said:


> "Family 15h Models 30h - 3fh"
> 
> "There's no clarity on which exact micro-architecture the CPU modules are based on"
> 
> Family 15h Models 30-3Fh means Steamroller uarch.



If this is indeed a CPU based on the Steamroller core, putting even more cores on die makes sense. With Steamroller they moved the power optimum of the cores way downward, resulting in slightly worse scaling at high frequencies but much better scaling at lower frequencies (just look at AnandTech's 45W vs. 95W parts; the 45W part keeps a huge chunk of the performance while using far less power). If you want to use that property optimally, more lower-clocked cores make a lot of sense, and SR cores should mitigate at least some of the performance lost to lower clock speeds.

I'm not too sure whether they'll be able to compete effectively with Intel's high-core-count products, but I suppose they're confident enough themselves, since they're bringing out this product.


----------



## Fiery (Jan 17, 2014)

Mathragh said:


> If this is indeed a CPU based on the Steamroller core, putting even more cores on die makes sense. With Steamroller they moved the power optimum of the cores way downward, resulting in slightly worse scaling at high frequencies but much better scaling at lower frequencies (just look at AnandTech's 45W vs. 95W parts; the 45W part keeps a huge chunk of the performance while using far less power). If you want to use that property optimally, more lower-clocked cores make a lot of sense, and SR cores should mitigate at least some of the performance lost to lower clock speeds.
> 
> I'm not too sure whether they'll be able to compete effectively with Intel's high-core-count products, but I suppose they're confident enough themselves, since they're bringing out this product.



What AMD has used since winning the console deals is a LEGO-like architecture.  They have a 2-core Bulldozer/Piledriver/Steamroller module or a 4-core Jaguar/Puma module, and they can pack as many of them onto a CPU/APU die as the target market calls for.  They also have a variety of uncore modules (e.g. 2-channel DDR3-2133, 4-channel GDDR5, etc.), and they can pick the one best suited to the target market.  Then of course they have various GCN iGPUs, ranging from 128 units up to 3072 units, so they can pair one with virtually any CPU.  That approach is very useful when you build so many APU variants for very different markets, e.g. Temash, Kabini, Xbox One, PS4, Kyoto (8-core Jaguar), Kaveri, etc.

The only problem is: AMD doesn't have any Steamroller- or Excavator-based 2P/4P/8P CPUs or APUs on their roadmap.  Even if they discuss such possibilities in one of their many documents, they no longer wish to compete with Intel in those markets.  The best they have, and will have, is Kaveri for 1P servers/workstations, a.k.a. Berlin.


----------



## TheoneandonlyMrK (Jan 17, 2014)

btarunr said:


> AMD released developer documentation for a new processor it's working on, and the way it's worded describes a chip with 8 modules, working out to 16 cores, on a single piece of silicon, referred to as Family 15h Models 30h - 3fh. This is not to be confused with the company's Opteron 6300-series "Abu Dhabi" chips, which are multi-chip modules of two 8-core dies, in the G34 package.
> 
> What's more, unlike the current "Abu Dhabi" and "Seoul" chips, the new silicon features a full-fledged uncore, complete with a PCI-Express gen 3.0 root complex integrated into the processor die. In further proof that this is a single die with 8 modules, and not an MCM of two dies with 4 modules each, the document describes the die as featuring four HyperTransport links, letting it link up with other processors in multi-socket configurations of up to 4P. Such systems would feature a total core count of 64. There's no clarity on which exact micro-architecture the CPU modules are based on. Without doubt, AMD is designing this chip for its Opteron enterprise product stack, but it should also give us a glimmer of hope that AMD could continue serving up high-performance client CPUs, just ones that won't be based on socket AM3+.
> 
> ...


 
I like your assertion that you were right about AM3+, Bta.

No comment about being WRONG about AMD big cores though, eh? FX is dead my ass.

And you may have to readdress your assertion that AM3+ is gone, because even with that uncore this could all still go in a G34 socket, and likely will, since in server land socket swaps are even rarer than in consumer land.

If it still has the HyperTransport bus in it (and it likely will have 2-3 links), everything stays on the table: you can theoretically hang a northbridge off it, and PCIe 3.0 support linked directly to the uncore could easily bypass any northbridge, which would still have PCIe lanes available. The memory controller, as happens today, is available on or off die via the NB or CPU-NB, rendering this chip supportable by older sockets even with DDR3, though the memory controller on this chip isn't listed, so that's up in the air.

The Big Cores live on, nice and steamy, too.

As for the roadmap, dude: the ones we've actually seen have been complete BS, or only consumer-level parts covering 2013 through end of 2014/start of 2015 and not beyond, so what do you know that we don't, and got any slides?


----------



## anonymous6366 (Jan 17, 2014)

and then a quad core i7 is still faster -.- lol


----------



## JDG1980 (Jan 17, 2014)

ZetZet said:


> not happening. just get the apu



Did it occur to you that AMD wants people to _think_ that it's not happening, because they want people to buy Kaveri now instead of waiting for something better later on?

Roadmaps are marketing documents first and foremost.


----------



## JDG1980 (Jan 17, 2014)

I wonder if this will be done on the 28nm process (like Kaveri), or back-ported to 32nm.

28nm is more energy-efficient, but 32nm offers higher clock speeds. On one hand, server parts generally don't require the highest possible clock rates. On the other hand, higher clock rates would be helpful if they want to sell the lower-binned parts to enthusiasts (as is the common practice), and AMD may still have obligations to buy a certain number of 32nm wafers from Global Foundries for the next few years.

Either way, it's good news that AMD has not given up on the high-margin big-core server segment.


----------



## librin.so.1 (Jan 17, 2014)

IF this isn't only for the Opteron line and will include models for desktop computers, even with a smaller core count (e.g. a max of 8 cores for desktop models)...
...I am going to cry happy manly tears.



----------



## lilhasselhoffer (Jan 17, 2014)

...interesting...

I don't understand it, but I see where this is going.

AMD has pushed for more cores.  In order to get those extra cores, they first compromised components, and are now compromising integration.  From a consumer-level CPU standpoint, this is absolutely moronic.  From AMD's perspective, it's gold.  Allow me to explain.

Let's say AMD is one cohesive unit, not a GPU/CPU-divided company.  The CPU is one set of cores, designed (originally) for very linear tasks.  The GPU is a collection of processors that lack a CPU's raw performance when it comes to precision.  AMD is trying to marry the two ideas together, as they have with the APU.

Now, I don't believe this is a great idea for everything.  I don't care that a computer can have integrated graphics and play a 4-year-old game at low resolution with an acceptable frame rate.  I do want my tablet to be something more than a toy.  Give me two different processors, and we'll be on the same page.  The truth is that the software isn't out there on the consumer side to make use of huge-core-count processors.  In a few years, it'll be different.  Hopefully the APU isn't too far ahead of the curve to survive.  They'll have the core counts high, the heat output low (due to lower clocks), and a decent install base.  I definitely hope they beat Intel without even being seen as a valid competitor.

Edit:
Forgot to mention this: the software doesn't exist on the consumer side, but the server side is aching for more cores.  This isn't great news for consumers, but it is great for servers.


----------



## fullinfusion (Jan 17, 2014)

Hell yeah, I'll take one! I miss AMD; Intel is just as boring as RCoon's posts lol..


----------



## harry90 (Jan 17, 2014)

All AMD needed to do was improve the single-core performance of their CPUs. Who needs more than 8 cores? Just enhance the IPC/single-core performance by 40-60% and they could compete with Intel!!!


----------



## Blue-Knight (Jan 17, 2014)

anonymous6366 said:


> and then a quad core i7 is still faster -.- lol


I was thinking exactly that... LOL!


----------



## newtekie1 (Jan 17, 2014)

I will say it again: I still want AMD to move to a unified socket.  Use the same LGA socket for servers, for performance desktops, and for mid-range to low-end desktops.  Of course all their processors would be APUs (Intel's already are); let us put anything from dual-cores up to 16-core processors in desktops, or even Opterons if we wanted to.  That, to me, is a winning idea for AMD.  Don't segregate the market by sockets; let someone start with a super cheap processor to get up and running, and still have the option to upgrade to a high-end processor later.



harry90 said:


> All amd Needed to do was to improve the single core performance of their cpu's. who needs more than 8 cores? Just enhance the IPC, single core performance by 40-60% and they could compete with intel!!!





Blue-Knight said:


> I was thinking exactly that... LOL!



The thing is, AMD designs their CPUs very differently from Intel.  Intel, because they rely on hyper-threading, designs each core to do the work of two.  Of course this means that when it's loaded with only a single-threaded workload, it is extremely fast.  This is one of the reasons AMD was reasonably close to Intel in single-threaded performance during the early Athlon 64/X2 days when they were competing with Conroe (yes, Conroe was still faster, but AMD was a lot closer back then): Intel wasn't yet designing their processors around hyper-threading.

Honestly, I don't see a need for AMD to increase single-threaded performance to meet Intel.  The reason is that AMD's single-threaded performance is _good enough_.  Yes, they lag behind in benchmarks, but in real-world use there really isn't anything single-threaded that AMD can't handle.  Sadly, games are still heavily dependent on single-threaded performance, but most modern games still run perfectly well on AMD processors despite this, because AMD's single-threaded performance is good enough.  There are a few exceptions; StarCraft II comes to mind, because it is extremely CPU-heavy and extremely single-threaded.


----------



## BiggieShady (Jan 17, 2014)

harry90 said:


> Just enhance the IPC, single core performance by 40-60% and they could compete with intel!!!



Just? They would have to do the ballsy thing Intel did when they sucked with NetBurst: go back to their older, more efficient architecture and improve on that. Just like Intel ditched NetBurst in favor of the Pentium M architecture, which, improved, became the Core architecture.


----------



## Popocatepetl (Jan 17, 2014)

lilhasselhoffer said:


> ...interesting...



Wow, a post made of pure 100% stupid. Never, ever utilize your 100% stupid when posting.



> I don't care that a computer can have integrated graphics and play a 4 year old game at low resolution with an acceptable frame rate.



According to various online reviews of Kaveri, it runs Battlefield 4 (without Mantle, which should enable even better performance) at reasonable framerates. Most reviews used recent titles and came up with figures of 30+ fps.

From this point on your post devolves into incoherent babbling about tablets (?!) and whatnot. Again, never use 100% stupid in your posts. Thanks.


----------



## BiggieShady (Jan 17, 2014)

Popocatepetl said:


> Wow, a post made of pure 100% stupid. Never, ever utilize your 100% stupid when posting.
> 
> 
> 
> ...



I just went through all 5 of your posts. Up until this post you seemed like a completely normal person


----------



## FX-GMC (Jan 17, 2014)

Popocatepetl said:


> Wow, a post made of pure 100% stupid. Never, ever utilize your 100% stupid when posting.
> 
> 
> 
> ...


----------



## lilhasselhoffer (Jan 17, 2014)

Popocatepetl said:


> Wow, a post made of pure 100% stupid. Never, ever utilize your 100% stupid when posting.
> 
> 
> 
> ...




Did you read the original post?  You claim the post is 100% stupid, but you didn't even bother to read it.

I stated that AMD is increasing core count, and that it's not an optimal strategy for the consumer CPU market.  More cores bring us closer to a true fusion of GPU and CPU.  I said the core-count increase is detrimental to consumers because it fuses the tablet, desktop, and server markets.  At no point did I reference graphical performance numbers.  The point of this article was not graphics, but a CPU with a monstrous core count, more akin to what you might see in a GPU.

If you need to stick your foot in your mouth, please do so without calling someone else an idiot.  Every person eventually commits a stupid act (I've had my fair share), but missing the point of a post, and of the thread, and then calling someone else stupid is an act of either brazen ignorance or willful trolling.  I have no love or hate for AMD, Nvidia, or Intel.  I find it interesting that they are so blatantly leaving the consumer CPU market.  Either their vision is so forward-thinking that they'll have the last laugh, or they have just run full speed into a brick wall.  It will be interesting to see what comes of this, and how it hopefully influences desktop computing.


----------



## Dent1 (Jan 17, 2014)

harry90 said:


> All amd Needed to do was to improve the single core performance of their cpu's. who needs more than 8 cores? Just enhance the IPC, single core performance by 40-60% and they could compete with intel!!!



What makes you think AMD cares about competing with Intel on performance? The objective is to capture a larger market share and thus increase revenue for shareholders.  No point having the best-performing product if nobody buys it.

AMD is probably doing whatever is most cost-effective and most beneficial to them in the long term. Although we probably can't see it now, their key shareholders sat in a meeting and agreed on this strategy, and, right or wrong, it was their best solution.




newtekie1 said:


> Honestly, I don't see a need for AMD to increase single-threaded performance to meet Intel.  The reason is that AMD's single-threaded performance is _good enough_.  Yes, they lag behind in benchmarks, but in real-world use there really isn't anything single-threaded that AMD can't handle.  Sadly, games are still heavily dependent on single-threaded performance, but most modern games still run perfectly well on AMD processors despite this, because AMD's single-threaded performance is good enough.  There are a few exceptions; StarCraft II comes to mind, because it is extremely CPU-heavy and extremely single-threaded.



I agree. Improving single-threaded performance should be secondary. There isn't a single game or application that the average desktop user can't run.

When the need for more cores becomes pressing, the work AMD did on their multi-module design will pay off. Even Intel's hyper-threading wasn't successful at first; it took lots of trial and error, and a decade later we all see the benefits. Same with AMD's module principle.


----------



## librin.so.1 (Jan 17, 2014)

harry90 said:


> Who needs more than 8 cores? Just enhance the IPC, single core performance by 40-60% and they could compete with intel!!!



I do. Most of the workloads I run are very heavily threaded, so I would trade my 8-core for, say, a [hypothetical] CPU of the same architecture with 16 cores and 35% lower clocks anytime.


----------



## Eukashi (Jan 18, 2014)

Please, a 4 GHz 4M/8C 256-SP GCN APU on Socket FM2+.
When I record PC games, more CPU cores are needed for x264VFW.


----------



## Lionheart (Jan 18, 2014)

RCoon said:


> I sincerely doubt this will ever come to any other market besides Opteron. If it does, enjoy the 50000W TDP.


Every thread I see you in, you're always bashing AMD; give it a rest.


----------



## eidairaman1 (Jan 18, 2014)

This is definitely a preview of what AMD has in store: first Kaveri, now a 16-core part. Maybe it gets introduced on AM3, or an 8-core Steamroller model for AM3+, and then the 16-core on the next desktop socket.


----------



## TRWOV (Jan 18, 2014)

RCoon said:


> I sincerely doubt this will ever come to any other market besides Opteron. If it does, enjoy the 50000W TDP.



More like 150 W, if done on 28nm.




eidairaman1 said:


> This is definitely a preview of what AMD has in store, first Kaveri, now Hexadecimal Core- maybe the introduction of it to AM3 or an 8 core model Steamroller for AM3+ then Hexadecimal core on the next Desktop socket



According to AMD's roadmap, 32nm Vishera is as far as AM3+ goes.


----------



## arbiter (Jan 18, 2014)

newtekie1 said:


> Honestly, I don't see a need for AMD to increase single threaded performance to meet Intel.  The reason is that AMD's single threaded performance is _good enough_.  Yes, they lag behind in benchmarks, but in real world use there really isn't anything that is single threaded that AMD can't handle.  Saddly, games are still heavily dependent on single threaded performance, but most modern games still run perfectly well on AMD processors despite this, because AMD's single threaded performance is good enough.  There are a few exceptions, StarCraft II comes to mind, because it is extremely CPU heavy and extremely single threaded.



AMD's single-thread is good enough? Yeah, really? How many games at this point in time use more than 4 threads? Heck, a lot of programs don't really use more than 2, unless you get into encoding or graphic-design type work, where it really helps a ton. AMD really should put some R&D into increasing performance, because I don't think many games will span much past 4 cores at best, which puts AMD a bit behind; on top of that, they use 50% more power than the competitor's CPU. Yeah, an AMD CPU looks good because of the cheaper initial cost, but over a year or two that evens out once it adds up on the electric bill. AMD needs either the same performance at lower wattage, or the same wattage with around a 30% boost in single-threaded workloads. But that game could change quickly when the 6/8-core Haswells come out in the next 6 or so months.


----------



## newtekie1 (Jan 18, 2014)

TRWOV said:


> More like 150w, if done on 28nm.



The current 16-core parts are 115 W on 32nm, so I'm guessing these would be similar, but with higher clocks.



arbiter said:


> AMD's single-thread is good enough? Yeah, really? How many games at this point in time use more than 4 threads? Heck, a lot of programs don't really use more than 2, unless you get into encoding or graphic-design type work, where it really helps a ton. AMD really should put some R&D into increasing performance, because I don't think many games will span much past 4 cores at best, which puts AMD a bit behind; on top of that, they use 50% more power than the competitor's CPU. Yeah, an AMD CPU looks good because of the cheaper initial cost, but over a year or two that evens out once it adds up on the electric bill. AMD needs either the same performance at lower wattage, or the same wattage with around a 30% boost in single-threaded workloads. But that game could change quickly when the 6/8-core Haswells come out in the next 6 or so months.



There isn't anything that relies on single-threaded performance, games included, that doesn't run well on AMD chips.  There was a time when software was outpacing hardware, but we've reached a point where the hardware has caught up and software has sat stagnant.  A lot of people are still running modern games on Core 2s (heck, one of my gaming rigs is a Celeron E3300) and they still work just fine. And the FX series is well beyond Core 2 in single-threaded performance.  So yes, AMD's single-threaded performance is good enough.


----------



## RCoon (Jan 18, 2014)

Lionheart said:


> Every thread I see you in you're always bashing AMD, give it a rest


I own an 8350; I can 'opinion away' all I like. I was an AMD fan through and through, then they messed up and completely disappointed me and everyone else.


----------



## Frick (Jan 18, 2014)

RCoon said:


> I own an 8350, I can 'opinion away' all I like. I was an amd fan through and through, then they messed up and completely disappointed me and everyone else.



That's because you let yourself believe the hype. You're blaming AMD for your personal failures!

Seriously though, newtekie's idea of a unified socket is great. I'd buy into that.


----------



## Ravenas (Jan 18, 2014)

RCoon said:


> I own an 8350, I can 'opinion away' all I like. I was an amd fan through and through, then they messed up and completely disappointed me and everyone else.



Are you telling me that I am completely disappointed with my 8350? Ha ha ha. Please stop trying to speak for people on this thread.


----------



## Mindweaver (Jan 18, 2014)

I'll take a 16 core desktop!  These should make great crunchers/folders.


----------



## Dent1 (Jan 18, 2014)

RCoon said:


> I own an 8350, I can 'opinion away' all I like. I was an amd fan through and through, then they messed up and completely disappointed me and everyone else.



I'm happy with my 2009 Athlon II X4, and still waiting for games to take advantage of it. Can't say I'm disappointed with AMD. Actually, I'm happy I didn't have to change boards; I dropped in this fantastic piece of circuitry and it's lasted 4 years and going strong.

No need to feel disappointed for me. You can have your disappointment back.  Here.


----------



## librin.so.1 (Jan 18, 2014)

arbiter said:


> AMD's single-thread is good enough? Yeah, really? How many games at this point in time use more than 4 threads? Heck, a lot of programs don't really use more than 2, unless you get into encoding or graphic-design type work, where it really helps a ton.



Okay, most games use 4 threads or even fewer. But that doesn't mean the other cores are useless. I can play games with, at worst, negligible (due to no turbo) and, at best, no performance loss while recording gameplay at 1920x1080@40 (or more) with superb quality, straight into H.264, with enough compression to keep the file size relatively tiny (using ffmpeg+libx264). Meanwhile, a friend with a 4-core Intel says it becomes a bit lacking, reducing performance and making him record at slightly lower video framerates so it can keep up. Video transcoding speed is also very good, as long as you use sane software. As @Mindweaver already mentioned, for crunching/folding, the more cores the better.

Very important for me: compiling. Compiling benefits greatly from increased core count and scales pretty much linearly. [Re]Compiling larger projects can take very long on just a few cores. As a software engineer/programmer, I often need to recompile large projects several times a day, especially when doing regression tests, which can easily need over a dozen recompiles. Thus, when I moved from my dual-core to an octa-core, there was much rejoicing, as it reduced the compile time of a certain project I work on from about an hour to less than five minutes. Also, it's disappointing that in the Windows world a lot of software is still poorly threaded, while on Linux (the OS I use 99% of the time) things tend to be better threaded.
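For what it's worth, an hour-to-minutes drop is roughly what near-linear scaling predicts. A toy sketch using Amdahl's law (the work total and parallel fraction are illustrative assumptions, not measurements from this post):

```python
def compile_time(total_work_s: float, parallel_fraction: float, cores: int) -> float:
    """Amdahl's-law estimate of wall-clock build time on `cores` cores."""
    serial = total_work_s * (1.0 - parallel_fraction)  # configure, final link, etc.
    parallel = total_work_s * parallel_fraction        # independent compile jobs
    return serial + parallel / cores

# A build whose single-core work totals an hour, assuming 95% of it parallelizes:
one_core = compile_time(3600, 0.95, 1)
eight_cores = compile_time(3600, 0.95, 8)
print(f"1 core: {one_core:.0f}s, 8 cores: {eight_cores:.0f}s")
```

With a parallel fraction near 1, adding cores cuts build time almost proportionally; real-world gains can exceed this when the newer chip also has higher per-core performance.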

Sure, I can get the same or even better MT performance with a 6-core HT'ed Intel. But for what? 2x the price or even more? Thanks, but no thanks.

P.S. +1 to what newtekie1 said



arbiter said:


> On top of that, they use 50% more power than the competitor's CPU. Yeah, an AMD CPU looks good because of the cheaper initial cost, but over a year or two that evens out once it adds up on the electric bill. AMD needs either the same performance at lower wattage, or the same wattage with around a 30% boost in single-threaded workloads. But that game could change quickly when the 6/8-core Haswells come out in the next 6 or so months.


Okay, I kept deciding against pointing this out, but here goes, since people tend to needlessly bash AMD for an inefficient design when it comes to power consumption.
An example, sorted from most to least watts allocated to a single core:

```
i7-3820  – 4 cores, 130W TDP; 130 / 4 =   32.5W per core
FX-4350  – 4 cores, 125W TDP; 125 / 4 =  31.25W per core
FX-9590  – 8 cores, 220W TDP; 220 / 8 =   27.5W per core
i7-3970X – 6 cores, 150W TDP; 150 / 6 =     25W per core
i7-2600K – 4 cores,  95W TDP;  95 / 4 =  23.75W per core
FX-4320  – 4 cores,  95W TDP;  95 / 4 =  23.75W per core
i5-2450P – 4 cores,  95W TDP;  95 / 4 =  23.75W per core
i7-4770K – 4 cores,  84W TDP;  84 / 4 =     21W per core
FX-6350  – 6 cores, 125W TDP; 125 / 6 =  20.83W per core
i5-3550  – 4 cores,  77W TDP;  77 / 4 =  19.25W per core
i3-2330M – 2 cores,  35W TDP;  35 / 2 =   17.5W per core
FX-6300  – 6 cores,  95W TDP;  95 / 6 =  15.83W per core
FX-8350  – 8 cores, 125W TDP; 125 / 8 = 15.625W per core
```

OH SNAP, it appears that if we consider how much TDP is allocated to _a single core_, AMD CPUs don't look inefficient at all: the power per core is quite low, relatively speaking. Which is only possible if the cores are efficient enough.
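A quick sketch redoing that arithmetic (core counts and rated TDPs copied from the table above; per-core TDP is of course a crude proxy, since uncore power is shared across cores):

```python
# (cores, rated TDP in watts) for a few parts from the table above
cpus = {
    "i7-3820":  (4, 130),
    "FX-9590":  (8, 220),
    "i7-2600K": (4, 95),
    "i7-4770K": (4, 84),
    "FX-6300":  (6, 95),
    "FX-8350":  (8, 125),
}

per_core = {name: tdp / cores for name, (cores, tdp) in cpus.items()}

# Sort from most to least watts per core, like the original list
for name, watts in sorted(per_core.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:8s} {watts:6.2f} W/core")
```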



RCoon said:


> I own an 8350, I can 'opinion away' all I like. I was an amd fan through and through, then they messed up and completely disappointed me and everyone else.


>completely disappointed me and everyone else
>implying
>implying

You are implying too much, sir.
P.S. I totally love it.



Ravenas said:


> Are you telling me that I am completely disappointed with my 8350? Ha ha ha. Please stop trying to speak for people on this thread.



Yeah, what He said. When You, @RCoon, say "and everyone else", You take on quite a bit of responsibility, Ya know...


----------



## RCoon (Jan 19, 2014)

Seems like there's a lot of people kidding themselves in this forum. Waste of my time.


----------



## TRWOV (Jan 19, 2014)

RCoon said:


> I own an 8350, I can 'opinion away' all I like. I was an amd fan through and through, then they messed up and completely disappointed me and everyone else.



I'm so disappointed that I went and bought _only_ 4 of them.






Vinska said:


> OH SNAP it appears that if we consider how much TDP is allocated to _a single core_, it doesn't look like AMD CPUs are inefficient – the power per core is quite low, relatively. Which is only possible if the cores are efficient enough.



Intel's TDP isn't calculated the same way as AMD's. Intel's TDP is rated higher than the actual value because it's a worst-case scenario; AMD's TDP is the _average_ power draw during a set of test workloads.


Let's not kid ourselves. Intel CPUs are more energy efficient than AMD CPUs, that's a given, but AMD CPUs aren't terribly inefficient considering their core count.


----------



## fullinfusion (Jan 19, 2014)

RCoon said:


> Seems like there's a lot of people kidding themselves in this forum. Waste of my time.


Then leave Mr downer!


----------



## librin.so.1 (Jan 19, 2014)

TRWOV said:


> Intel's TDP isn't calculated in the same way as AMD's TDP. Intel's TDP is rated higher than the actual value because it's a worst case scenario. AMD's TDP is the _average_ power draw during a set of test workloads.


LOL, I thought it was the other way around.
And my FX-8320 only goes close to its TDP when overclocked, where it hovers around 120-123 W at full load, as reported by internal sensors or whatever sh*t.
And when I overclock More Than I Should™, it appears to drop my voltage under load to zealously keep power consumption below 124.75 W no matter what, unless I disable lots of stuff and turn on several overrides (can't find a better word) that my previous mobo didn't even have. (So my previous mobo was zealously keeping the power draw like this _all the time_.)
So, from what I've seen with my own eyes, saying that AMD's TDP is "average power draw" must be very much false.



RCoon said:


> Seems like there's a lot of people kidding themselves in this forum. Waste of my time.


Real smooth. That just shows You are out of arguments and don't have anything genuinely useful to say. Aww well...


----------



## TRWOV (Jan 19, 2014)

AMD themselves say that it's an average:



> TEST CONDITIONS
> Given the goal of representing typical power usage in real-world conditions, environmental test conditions were chosen to reflect that aspect (room temp of 70°F, server's fan heat sink used, closed case, etc.) *The power for the cores, memory controller, and HyperTransport™ links was logged multiple times per second throughout the entire duration of the workload tested, and the time-averaged power consumption for that workload was calculated. *The results across the suite of workloads are used to derive the ACP number. The ACP value for each processor power band is representative of the geometric mean for the entire suite of benchmark applications plus a margin based on AMD historical manufacturing experience.



http://www.amd.com/us/Documents/43761D-ACP_PowerConsumption.pdf

EDIT: That being said, AMD tests with lower-binned parts, so you could say that its TDP is an average of the lowest-binned CPUs.



It could be that your motherboard is undervolting your CPU, my GA-880GM-USB3 undervolts my 8350 to 1.28v when I set voltage on AUTO.
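The quoted methodology (time-average the logged samples per workload, then take a geometric mean across the suite, plus a margin) can be sketched as follows. The sample values and the 5% margin here are made up for illustration; AMD's document does not publish the actual margin:

```python
import math

# Sketch of the ACP derivation as described in AMD's quoted document:
# log power samples during each workload, time-average each workload's
# samples, then take the geometric mean across the whole suite.
# The margin factor is a made-up stand-in for AMD's unpublished margin.
def acp_estimate(workload_samples, margin=1.05):
    # Time-averaged power per workload
    averages = [sum(samples) / len(samples) for samples in workload_samples]
    # Geometric mean across the suite of workloads
    geo_mean = math.exp(sum(math.log(a) for a in averages) / len(averages))
    return geo_mean * margin

# Made-up per-workload sample logs (watts) for a three-workload suite
suite = [
    [92.0, 95.0, 93.0],
    [88.0, 90.0],
    [101.0, 99.0, 100.0],
]
print(round(acp_estimate(suite), 1))
```

Because the geometric mean damps outliers, a single power-hungry workload pulls the ACP number up less than it would pull up a worst-case TDP figure, which is consistent with ACP coming in below TDP.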


----------



## librin.so.1 (Jan 19, 2014)

Isn't that just for Opterons?


----------



## micropage7 (Jan 19, 2014)

RCoon said:


> I sincerely doubt this will ever come to any other market besides Opteron. If it does, enjoy the 50000W TDP.



yeah, one major problem of AMD is power consumption
come on AMD


----------



## Dent1 (Jan 19, 2014)

RCoon said:


> Seems like there's a lot of people kidding themselves in this forum. Waste of my time.



So you're saying this forum only makes good use of your time when we share your views and agree with you?


----------



## TheoneandonlyMrK (Jan 19, 2014)

I feckin love the power consumption card, really funny.
On TPU the most ardent members in actuality smash efficiency to the kerb in favour of GHz or even MHz gains. This is not Eco Power Up, that's elsewhere, and to the likes of me a few watts means nothing, get over it.
SDP is coming to AMD parts soon enough, mark my words, as Intel's subterfuge seems to have blinded their fans.
Far too busy kidding myself to rise to RCoon's BS.


----------



## NeoXF (Jan 19, 2014)

As much as I'd like to see a (very plausible) 6M/12T/12-core Steamroller w/ L3 and PCI-Express 3.0 (which, admittedly, would currently be platform-less)... I'm still rooting for APUs, now more than ever. Software developers and open-source projects need to get their shit together and cook us some HSA magic tho. So far we've only got 2-3 actual previews and a handful of promises from developers X and Y.

IMHO, HSA needs to be pushed from the ARM front too if it wants to get full-scale traction.


----------



## lilhasselhoffer (Jan 19, 2014)

Allow me to make the performance arguments, so this thread doesn't become a pissing contest.

1) AMD chips have a higher TDP than Intel, so they must be more efficient.
No.  AMD and Intel do measure chips differently.  Between chips being measured differently, and completely differing architecture, efficiency cases can be made for both sides.  There is no clear winner here.
2) Intel chips don't clock as high as AMD ones, so AMD makes better chips.
This one is generally true.  If you're looking for bragging rights about the highest clock, then AMD wins.  The reality is that both manufacturers' chips take huge amounts of power to do this.  You don't run a CPU at peak frequencies constantly, unless you want a huge bill and rapidly deteriorating chip.  For every day use, either manufacturer produces a relatively solidly performing chip.
3) Intel and AMD don't measure cores the same.
Absolutely.  Intel has traditional cores, while AMD decided to share components between pairs of cores.  A four-core Intel chip doesn't match a 4-core AMD chip, a two-core chip with hyper-threading doesn't match a 4-core AMD chip, and none of this matters.  This is not a move for the consumer CPU market.  In that market only a handful of programs use more than a couple of cores.  People using more cores are doing server-related work, crunching, or running encoding software.


Now that the silly arguments have been made, can we get back on topic?  AMD looks to be firing for the server market, without any bashfulness.  Assuming this is the case, it seems like they are making a large step back into competing with Intel.  This bodes well for more reasonably priced servers, but more importantly could be parlayed into something interesting on the desktop CPU front.  Anyone care to comment on that, rather than on how much they think the current parts are either awesome or terrible?


----------



## Thefumigator (Jan 19, 2014)

arbiter said:


> AMD single thread is good enough? Yea really? How many games at this point in time for example uses more then 4 threads?....cause don't think to many games will span much past 4 cores on best top side which puts AMD behind a bit, on top of that they use 50% more power then the competitors cpu


As the PS4 and Xbox One are based on 8-core CPUs, there's the possibility that in the near future games will be using more than 4 cores.



arbiter said:


> Heck even a lot of programs don't really use much more then 2



If I open all the programs I use every day, I get 100% CPU on my FX-8320; that includes a lot of Excel, encoding audio and video, burning, browsing (50 tabs or so) and some calculus programs for other stuff. Sometimes I even do some gaming on this thing while doing all those kinds of things in the background.



arbiter said:


> Yea AMD cpu looks good cause initial cheaper cost but over year or 2 that cost even's out when it add's up in an eletric bill.



Disagree. An electric bill is composed of several electrical appliances: an oven alone may draw 3,000 watts unless you use propane, an air conditioner peaks around 1,200 watts, and then you have a microwave oven, all the light bulbs, and the water heater tank; 90% of the bill will be appliances. I don't believe the choice of an AMD CPU over Intel will really change the game on the bill enough to make it a useless choice after *just 2 years*....  If what you need is to browse the internet and visit Facebook, get an AMD C-60, it's just 8 watts, and you'll be choosing correctly. If you want to save the planet, and our wallets, we should stop gaming altogether; choosing an Intel over an AMD will not "save our wallets" in the long run, as the difference in power consumption and efficiency is not the catastrophe you described.


----------



## NeoXF (Jan 19, 2014)

Thefumigator said:


> as PS4 and Xbox One are based on an 8 core CPUs, there's the possibility that in the near future games will be using more than 4 cores.


I don't remember exactly what they said about Mantle, but it seemed it can scale beyond 8 cores as well, independently of how the game engine is coded. And it'd be about time; it might be tricky, but it makes a lot more sense to make game engines/APIs future-proof so as to take advantage of the tech of tomorrow as well, doesn't it? Anyway, here's a bit of a mind-twister... imagine, somewhere down the HSA line, having games and their APIs doing draw calls off the APU itself, rather than just the x86 cores... GPUception... O_O


----------



## BiggieShady (Jan 19, 2014)

lilhasselhoffer said:


> 1) AMD chips have a higher TDP than Intel, so they must be more efficient. No. AMD and Intel do measure chips differently. Between chips being measured differently, and completely differing architecture, efficiency cases can be made for both sides. There is no clear winner here.



Actually there is a clear winner here and it's Intel. AMD CPU efficiency comes close to Intel only for those kind of tasks that AMD architecture favors. Meaning, you would need a heavily multi threaded code with no floating point instructions at all (only integer SIMD and SISD instructions), to have somewhat comparable efficiencies.


----------



## Thefumigator (Jan 19, 2014)

BiggieShady said:


> Actually there is a clear winner here and it's Intel. AMD CPU efficiency comes close to Intel only for those kind of tasks that AMD architecture favors. Meaning, you would need a heavily multi threaded code with no floating point instructions at all (only integer SIMD and SISD instructions), to have somewhat comparable efficiencies.


Still not a clear enough difference to care about. Everybody is talking as if there were a massacre of a difference, and it's not that deep.


----------



## Blue-Knight (Jan 19, 2014)

Thefumigator said:


> Everybody is talking like if there was a massacre of a difference, and its not that deep.


For me, even 1 watt is crucial in choosing Intel instead of AMD. Why? Energy costs a lot where I live; a few watts less is better than a few watts more on my energy bill.

The less the better, no matter if 1 or 1/x. Just my stupid opinion.


----------



## Thefumigator (Jan 20, 2014)

Blue-Knight said:


> For me, 1 watt is crucial for me to choose Intel instead of AMD. Why? Energy costs a lot where I live, a few watts less is better than a few watts more in my energy bill.
> 
> The less the better, no matter if 1 or 1/x. Just my stupid opinion.



I don't find anything stupid in your opinion; it's just your opinion. However, take into consideration that if that is your reason for choosing one over the other, then I assume you do the same for the rest of your home appliances: LED TV over LCD, as an example to begin with. If you only care about the CPU and not the rest of your home, then your opinion wouldn't be honest enough. But I don't think it's stupid.


----------



## eidairaman1 (Jan 20, 2014)

Bring on the hexadecimal core for desktop


----------



## Melvis (Jan 20, 2014)

Blue-Knight said:


> For me, 1 watt is crucial for me to choose Intel instead of AMD. Why? Energy costs a lot where I live, a few watts less is better than a few watts more in my energy bill.
> 
> The less the better, no matter if 1 or 1/x. Just my stupid opinion.



You must live in Australia then?


----------



## Frick (Jan 20, 2014)

Blue-Knight said:


> For me, 1 watt is crucial for me to choose Intel instead of AMD. Why? Energy costs a lot where I live, a few watts less is better than a few watts more in my energy bill.
> 
> The less the better, no matter if 1 or 1/x. Just my stupid opinion.



I take it you remove all the LEDs from everything you have. You should get night-vision goggles so you don't need lights at all. Imagine how much you would save!


----------



## Aquinus (Jan 20, 2014)

Meanwhile, Intel is busy making things like low power SoCs with 8 cores. When it comes to servers, power efficiency can mean a lot if you're running a cluster, a lot of servers, or if your resources are limited. I guess the point I'm trying to make is that these are changes AMD should have started making a long time ago. What boggles me is why they didn't do it sooner. APUs have been designed this way since Llano.


----------



## Blue-Knight (Jan 20, 2014)

Thefumigator said:


> LED TV over LCD, as an example to begin with. If you only care about the CPU alone over the rest of your home then your opinion wouldn't be honest enough.


Ah, no. I can watch TV on the computer and eliminate the TV. But I am still using a tube TV.

Why? LED or LCD TVs cost a lot here; I can't invest in one at the moment. But I would buy one if I could, no doubt. Sometimes saving watts costs a lot too. It's not the same with computers, because I can build a low-power mini-ITX machine for much less instead of more. But I prefer micro-ATX, as mini-ITX has some disadvantages...



Melvis said:


> You must live in Australia then?


No.



Frick said:


> You should get night vision goggles so you don't need lights at all. Imagine how much you would save!


I do not live alone. Other people prefer light... What can I do?!


----------



## Thefumigator (Jan 20, 2014)

Aquinus said:


> Meanwhile, Intel is busy making things like low power SoCs with 8 cores. When it comes to servers, power efficiency can mean a lot if you're running a cluster, a lot of servers, or if your resources are limited. I guess the point I'm trying to make is that these are changes AMD should have started making a long time ago. What boggles me is why they didn't do it sooner. APUs have been designed this way since Llano.



AMD did a great job with its Kabini lineup; the A4-5000 is a good contender: 4 cores, Radeon GPU, 15 watts.
But that Atom (8 cores!) looks nice on paper. I would like to try it out; it seems a good one, but still at 20 watts.

I found on Amazon an ECS KBN motherboard with an A6-5200: quad core, Radeon GPU, 25 watts.



Blue-Knight said:


> I do not live alone. Other people prefer light... What can I do?!



get an intel...!!


----------



## librin.so.1 (Jan 20, 2014)

Blue-Knight said:


> I do not live alone. Other people prefer light... What can I do?!


change all Your lightbulbs 'n sh*t, including those "energy saving" ones, to LED lights and enjoy >10x lower energy consumption for lighting. Plus, these even cost less than those "power saving" ones, too. Thus, there's no excuse not to get those and start enjoying comically low power consumption.

If You don't already do this, I'd say Your "Intel because every single watt counts" is hypocrisy.

Back on topic:
8-core Atom? Please do enlighten me, because I kinda remember Atoms were friggin' slow compared to AMD's low-power CPUs AKA equivalents.


----------



## Aquinus (Jan 20, 2014)

Vinska said:


> change all Your lightbulbs 'n sh*t, including those "energy saving" ones into LED lights and enjoy your >10x energy consumption for lighting. Plus these even cost less than those "power saving" ones, too. Thus, no excuse not getting those and start enjoying comically low power consumption.
> 
> If You don't already do this, I'd say Your "Intel because every single watt counts" is hypocrisy.
> 
> ...


I agree, but Atoms used to be 32-bit dual cores with only a single memory channel. This thing has two memory channels running DDR3-1600, and 8 cores at 2.4 GHz with a 2.6 GHz boost, all wrapped into a 20-watt SoC. It also has 16 PCI-E lanes and support for 6 SATA ports and 4 NICs off the CPU. Clearly it's aimed at being a server product.

Someone has to tell me why this doesn't look awesome as a cheap server.
ASRock C2750D4I Mini ITX Server Motherboard FCBGA1283 DDR3 1600/1333


----------



## Blue-Knight (Jan 20, 2014)

Thefumigator said:


> AMD did a great job with its kabini lineup, the A4-5000 is a good contender, 4 cores, Radeon GPU, 15watts


I do not like processors with integrated graphics, so I will never use their integrated graphics; only in "emergency" cases.

And I prefer low-end NVIDIA chips, as I never had any problems with their drivers and I am sure they will work for my needs, so why change?!



Vinska said:


> "Intel because every single watt counts"


Not only every single watt but also every single instruction per second. Single-core performance is crucial, as I like to disable all the unnecessary cores to use even less power.

And maybe the opposite of overclocking in extreme cases. I did that already! 

EDIT:
And to mention another DECISIVE factor for me to choose Intel instead of AMD: CPU pins. I dropped my E2200 onto the ground in December 2013 and I was happy it was not an AMD. Otherwise, it would have bent many pins and given me a lot of headache and a non-working processor.


----------



## Melvis (Jan 20, 2014)

Blue-Knight said:


> No.



Then it's considered cheap. Australia is officially the hottest and most expensive place to live on the planet at the moment.


----------



## TRWOV (Jan 20, 2014)

Vinska said:


> change all Your lightbulbs 'n sh*t, including those "energy saving" ones into LED lights and enjoy your >10x energy consumption for lighting. Plus these even cost less than those "power saving" ones, too. Thus, no excuse not getting those and start enjoying comically low power consumption.
> 
> If You don't already do this, I'd say Your "Intel because every single watt counts" is hypocrisy.
> 
> ...



The newer Atoms are 64-bit OoO with HT. I don't know how they stack up against Jaguar, but they're not the same Atoms we knew (and loathed).


----------



## Blue-Knight (Jan 20, 2014)

Melvis said:


> Australia is officially the hottest and the most expensive place to live on the planet at the moment.


Maybe because it is isolated from everything. It makes sense.


----------



## Aquinus (Jan 20, 2014)

Blue-Knight said:


> And to mention other DECISIVE factor for me to choose Intel instead of AMD: CPU pins. I let my E2200 to fall onto the ground in December 2013 and I was happy it was not an AMD. Otherwise, it would bend many pins and give me a lot of headache and a non-working processor.



That's nothing. Back when I had a Phenom II 940, the stock cooler was stuck on so badly (the thermal paste had dried out) that I accidentally ripped the CPU out of the socket, bending a good 100-ish pins. I straightened them out, and while getting it into the socket took a little work, it eventually went in just fine and ran without an issue.

Ever bent a pin on a motherboard with LGA? It's a bitch to fix and more often than not you can't. While I like LGA in general because of how the CPU is secured, I'm less worried about bending a pin on a CPU than on an LGA motherboard.


----------



## Blue-Knight (Jan 20, 2014)

Aquinus said:


> Ever bent a pin on a motherboard with LGA?


No, that would require no care at all. Bending a CPU's pins is infinitely easier.


----------



## TheoneandonlyMrK (Jan 20, 2014)

Blue-Knight said:


> No, that would require no care at all. To bend CPU is infinitely easier.


For what purpose have you even replied in this thread.
Your a big telly owning amd hating fool who should stick to using a pda or pad imho, efficiency wtf and as for this last reply he was removing a heatsink harshly .

You dropped ya soddin chip er your worse.

And what the feck any of this has to do with the Op is beyond me.


----------



## Blue-Knight (Jan 20, 2014)

theoneandonlymrk said:


> Your a big telly owning amd hating fool


I hate nothing. I am just telling the facts. And I didn't understand ~90% of what you said.



theoneandonlymrk said:


> For what purpose have you even replied in this thread.


Because I am stupid and useless.


----------



## Aquinus (Jan 20, 2014)

Blue-Knight said:


> No, that would require no care at all. To bend CPU is infinitely easier.



No way. Over-tightening a cooler alone can cause LGA pins to bend. It's part of the reason why you hear people say to re-seat the CPU when memory or PCI-E is acting funky, and to not over-tighten it.


----------



## Frick (Jan 20, 2014)

Blue-Knight said:


> I do not like processors with integrated graphics, so I will never use their integrated graphics. Only in "emergency" cases.
> 
> And I prefer low-end NVIDIA chips as I never had any problems with their drivers and I am sure they will work for my needs, so why to change?!



Well, a high-end APU's GPU would be quicker than a low-end Nvidia card. It would be quicker than a low-end Intel chip plus a low-end Nvidia chip as well, with the added benefit of Hybrid CrossFire if you find a GPU that works with it. I mean, if you're happy with your current setup you shouldn't change it just because (penis measuring is a terrible thing), but it is a viable alternative should you ever need to upgrade.



theoneandonlymrk said:


> For what purpose have you even replied in this thread.
> Your a big telly owning amd hating fool who should stick to using a pda or pad imho, efficiency wtf and as for this last reply he was removing a heatsink harshly .
> 
> You dropped ya soddin chip er your worse.
> ...



Drunken Englishmen, they all sound so cute.


----------



## Blue-Knight (Jan 20, 2014)

Aquinus said:


> No way. Over-tightening a cooler alone can cause LGA pins to bend.


My CPU cooler will not allow over-tightening (not that it could bend the pins). And it was quite cheap.

I think extreme violence would be required for that to be possible with my cooler. Just follow simple rules and it'll be fine.


----------



## TheoneandonlyMrK (Jan 21, 2014)

Blue-Knight said:


> I hate nothing. I am just telling My facts. And I didn't understand ~90% of what anyone said.
> 
> 
> Because I am stupid and useless.



I've sorted that out for you


----------



## eidairaman1 (Jan 21, 2014)

Thefumigator said:


> AMD did a great job with its kabini lineup, the A4-5000 is a good contender, 4 cores, Radeon GPU, 15watts
> But that atom (8 cores!) looks nice on paper, I would like to try it out, seems a good one, but still at 20watts.
> 
> I found on amazon an ECS KBN motherboard with an A6-5200, quad core, Radeon GPU, 25watts.
> ...



I've used Atom in the past; it felt sluggish and underpowered compared to a Core 2 Celeron.


----------



## Aquinus (Jan 21, 2014)

Blue-Knight said:


> My CPU cooler will not allow over tight (not that it could bend the pins). And it was quite cheap.
> 
> I think extreme violence would be required if that is possible, with my cooler. Just need to follow simple rules and it'll be fine.


If you're using a cooler that just has push pins, then there is a good bet you won't over-tighten it. In fact, I've run into issues with push-pin coolers not being tight enough, but that's a different issue. A lot of sockets have adapters so coolers can be mounted by screwing them in. Some sockets, like LGA2011, have screw holes built into the socket on the motherboard, and screwing in a cooler is the only way to mount it. Sockets like 1150, 1155, 1156, and 775 just have holes. AMD sockets for a while (since skt939?) have had a clip-and-lever system that can be removed to expose 4 holes for mounting coolers with screws or some other bracket.

I guess the real point I'm trying to make is that it depends on the CPU and the cooler whether or not you can over-tighten it or exert enough force on the CPU to cause problems.


Thefumigator said:


> I found on amazon an ECS KBN motherboard with an A6-5200, quad core, Radeon GPU, 25watts.


I saw that too. I remember reading a review saying that the PCI-E slot only works with GPUs: they couldn't get a RAID card to work on it, and ECS replied saying that the slot was for GPUs only.


----------



## Blue-Knight (Jan 21, 2014)

Aquinus said:


> If you're using a cooler that just has the push pins


No, mine has screws (and springs).


----------



## Melvis (Jan 24, 2014)

Blue-Knight said:


> Maybe because it is isolated from everything. It makes sense.



No, not really. We are closer to China than the USA, but we still pay through the nose for everything, including computer parts; it's just a joke here. Also, we could easily make this country self-sustaining, but we don't, and that will be our downfall.


----------



## suraswami (Jan 24, 2014)

ZetZet said:


> not happening. just get the apu


 
OK, an 8-to-10-core APU. Happy?


----------



## Blue-Knight (Jan 24, 2014)

Melvis said:


> No not really, we are closer to China then the USA but still we pay through the noes for everything including Computer parts, its just a joke here.


Please, tell me the price of some computer hardware in Australia...

For example, one of these: R9-290X-ENFC, GV-N760OC-2GD, Core i5-3570K, Core i7-3770K. Or any other...


----------



## TheoneandonlyMrK (Jan 24, 2014)

How's about you start your own thread in general nonsense


----------



## Blue-Knight (Jan 24, 2014)

theoneandonlymrk said:


> How's about you start your own thread in general nonsense


Were you speaking to me?


----------



## TheoneandonlyMrK (Jan 25, 2014)

Blue-Knight said:


> Were you speaking to me?


Yeah, is there any chance you have something relevant to say on the OP?


----------



## Blue-Knight (Jan 25, 2014)

theoneandonlymrk said:


> Yeah


Thanks, now I'm very sad. Perhaps I shouldn't be here as everybody hates me.

I don't know what I am doing in this planet.

Sorry, this won't happen again. I'm very stupid (as always)...


----------



## Melvis (Jan 25, 2014)

Blue-Knight said:


> Please, tell me the price of some computer hardware in Australia...
> 
> For example, one of these: R9-290X-ENFC, GV-N760OC-2GD, Core i5-3570K, Core i7-3770K. Or any other...



Sure. These are prices from the place I buy most of my hardware from, since they're the cheapest.  This is not RRP either. AUS dollars.

290X = $650-$770 Depending on model/Brand

GTX 760 = $300-$830 depending on model/brand ($830 is for the Mars; otherwise a 2GB OC is around $330)

GTX Titan = $1200+

i5-3570k = $275

i7-3770k = $395

Electricity = 32 cents per kilowatt-hour in AUS, compared to, let's say, Madisonville, KY (where I stayed last year) at 8c per kilowatt-hour

Petrol/gas prices are $1.60-$2.50 per litre depending on where you live; for me around $1.70. Compare that to the USA at around 85c per litre, or $3.25 per gallon (3.8 litres) ish. ($6.10-$9.50 a gallon in AUS $.) That's without converting our dollar to theirs at 89c US to the AUS dollar, so add another 11% on top of that; I can't be arsed to figure it out at the moment lol
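The litre-to-gallon and AUD-to-USD arithmetic above can be sketched like this; the prices and the 89c exchange rate are the poster's approximate figures, not live rates:

```python
# Convert an AUD-per-litre fuel price to USD per US gallon,
# using the poster's approximate figures (not live rates).
LITRES_PER_US_GALLON = 3.785
AUD_TO_USD = 0.89  # ~89 US cents per AUS dollar, as quoted

def aud_per_litre_to_usd_per_gallon(price_aud_per_litre):
    aud_per_gallon = price_aud_per_litre * LITRES_PER_US_GALLON
    return aud_per_gallon * AUD_TO_USD

for price in (1.60, 1.70, 2.50):
    usd = aud_per_litre_to_usd_per_gallon(price)
    print(f"A${price:.2f}/L  ->  US${usd:.2f}/gallon")
```

At A$1.70/L this works out to roughly US$5.70 per gallon, which is consistent with the comparison against ~US$3.25/gallon in the post.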

On topic: 16 cores on a single die is pretty epic.


----------



## Nordic (Jan 25, 2014)

How do you Aussies live? We Americans moan and groan that the price is over $3 a gallon. My electricity is $0.0877/kWh. I couldn't imagine.


----------



## TheoneandonlyMrK (Jan 25, 2014)

Bluenight said:


> Thanks, now I'm very sad. Perhaps I shouldn't be here as everybody hates me.
> 
> I don't know what I am doing in this planet.
> 
> Sorry, this won't happen again. I'm very stupid (as always)...


Not so big a deal, but there is a time and place, is all.


----------



## Melvis (Jan 25, 2014)

james888 said:


> How do you aussies live? We americans moan and groan that the price is over $3 a gallon. My electricity is .0877kwh. I couldn't imagine.



We don't, we just exist to make the politicians richer lol. Wish I got paid $400,000 a year even after I retire; that'd be sweet.


----------



## Blue-Knight (Jan 25, 2014)

Melvis said:


> Sure, these are prices from the place I buy most of my hardware...


In terms of computer hardware, I think you are far from paying the highest prices in the world... some of the best prices in my country are as follows (USD):

i5-3570K: $294
i7-3770K: $420
GTX 760 = $392 (PNY 2GB, imagine others)
R9 290X = $960

But in energy and gas I have to agree, that's pretty high!


----------



## Melvis (Jan 25, 2014)

Blue-Knight said:


> In terms of computer hardware, I think you are far from paying the highest prices in the world... one of the best prices in my country are as follow (USD):
> 
> i5-3570K: $294
> i7-3770K: $420
> ...



I agree computer prices here aren't too bad; some items are cheap and others are expensive. Nvidia cards in general here are a lot more expensive than AMD GPUs, by a good $100 or more at times for the same performance. Your pricing is about the same as it would be for the RRP (recommended retail price), as you've got to add another $50 to each item (or thereabouts). So the 3570K turns into a $325 CPU if you bought it at a normal computer store or electronics store (online is always cheaper). So the pricing turns out to be pretty close for the most part, except your R9 290X is an insane price!!

But we still pay more than the USA, even though we are closer and have a better economy. Even when our dollar was worth more than the US dollar, prices for computer parts were still higher; close, but still higher -_-

Yeah, our energy and gas prices are a complete joke, and that's not including the insane cost just to have a car on the road here now. It costs about $1000 per car before you can even drive it out the gate, and I only earn 20 grand a year lol

My electricity bill comes every 3 months (quarterly). In summer I run a fridge, on average two computer systems (turned off at night), a hot water system that's turned off completely (as I save $100 on the bill from having it off), and an AC unit that runs about 12 hours a day depending on the temp (sometimes not at all, like today), plus a microwave/toaster/jug used every now and again (once or twice a day), and my bill is around $530 for the three months. In winter it's closer to $750-800 for the three months. Heating is expensive; my parents pay over $1000 for the electricity bill during winter.


----------

