# AMD Readies a Pair of Energy-Efficient Socket FM2 APUs



## btarunr (Jul 3, 2013)

AMD's A-series "Richland" APUs are great for HTPC builds, but their 65W to 100W TDPs can be a put-off for some. The company is working on a pair of energy-efficient socket FM2 APUs based on the same silicon, with TDPs rated at 45W. The first of the two is the A10-6700T, which features a significantly lower CPU clock speed of 2.50 GHz, an unknown TurboCore speed, and a GPU clock speed of 720 MHz. It's not known whether the chip is dual-core or quad-core, but given that the quad-core, non-K A10-6700 is rated at 65W, it's not improbable for the A10-6700T to be one as well. Also unknown is the stream processor count. Moving on, we have the A8-6500T. Its CPU clock speed is lowered further, to 2.10 GHz, while its GPU runs at 720 MHz like the A10-6700T's; its CPU core and stream processor counts are likewise under wraps.





*View at TechPowerUp Main Site*


----------



## Fourstaff (Jul 3, 2013)

Naming scheme similar to Intel's, K for overclocking and T for low TDP? I like it.


----------



## Jorge (Jul 3, 2013)

While 45W APUs are fine, it's absurd to conclude that a 100W or 65W APU consumes too much power. That's the TDP, which is rarely ever seen in actual use. These APUs combine the performance of a CPU and a GPU, so the TDP ratings are extremely good considering the application.


----------



## birdie (Jul 3, 2013)

Jorge said:


> While 45w APUs are fine it's absurd to conclude that a 100w or 65w APU consumes too much power. That's the TDP and rarely ever seen in actual use. These APUs combine the performance of a CPU and a GPU so the TDP ratings are extremely good considering the application.



Recent tests have shown that newer AMD CPUs consume significantly more power than their advertised TDP.

It's far from rosy.


----------



## arbiter (Jul 3, 2013)

birdie said:


> Recent tests have shown that newer AMD CPUs consume significantly more power than their advertised TDP.



Question is, is it really the CPU using that power, or is it something in the chipset? When power usage is checked, it's usually done at the wall, which takes in everything in the computer.
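(The "at the wall" caveat is easy to put into numbers. The sketch below is a back-of-envelope estimate only; the PSU efficiency and the wattage allotted to the rest of the system are assumptions, not measurements.)

```python
# Back-of-envelope: isolate CPU power from a wall-socket reading.
# The PSU efficiency and rest-of-system draw are assumed figures.

def estimate_cpu_power(wall_watts, psu_efficiency=0.85, rest_of_system_watts=40.0):
    """Strip PSU conversion loss, then subtract an assumed draw for the
    board, RAM, drives, and fans; what remains is roughly the CPU/APU."""
    dc_watts = wall_watts * psu_efficiency  # power actually delivered inside the case
    return dc_watts - rest_of_system_watts

# A 161 W wall reading with an ~85%-efficient PSU and ~40 W for everything
# else leaves roughly 97 W attributable to the CPU/APU itself.
print(round(estimate_cpu_power(161.0), 1))
```

Every term here is a guess, which is exactly the point: a wall reading alone can't tell you whether the CPU or the rest of the platform is to blame.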



Jorge said:


> While 45w APUs are fine it's absurd to conclude that a 100w or 65w APU consumes too much power. That's the TDP and rarely ever seen in actual use. These APUs combine the performance of a CPU and a GPU so the TDP ratings are extremely good considering the application.



Considering their power usage vs. Intel at the same TDP, AMD looks like a 500 lb guy at McDonald's. The only thing they beat Intel at is built-in graphics.


----------



## Fourstaff (Jul 3, 2013)

arbiter said:


> Consider their power useage vs intel at same TDP, AMD looks like 500 lb guy at McDonalds. only thing they beat intel at is built in graphic's.



To be fair, their built-in graphics are pretty powerful; discrete graphics of that caliber would need quite a bit of power.


----------



## Ephremius (Jul 3, 2013)

I don't understand the focus on energy efficiency these days, like European countries wanting to ban high-end GPUs for their energy consumption rates. Honestly, shouldn't you be going after large-scale corporate abuse of energy, and not us 0.1% contributors of greenhouse gases?

But it's whatevs.


----------



## TheoneandonlyMrK (Jul 3, 2013)

Ephremius said:


> I don't understand the focus on energy efficiency these days, like European countries wanting to ban high end GPU's for their energy consumption rates, I mean honestly, shouldn't you be going after large scale corporations abuse of energy and not us 0.1% contributors of green house gases?
> 
> but its whatevs.


The difference: big companies are already dodging the taxman, but we can't so easily.
The result: more green tax for us.
Very valid point though; put your 100W-TDP chip in power-save mode with all eco features on and it wouldn't use much more than these anyway.


----------



## de.das.dude (Jul 3, 2013)

yess 45W! finally we can get away with just a heatsink! no more fans!


----------



## torgoth (Jul 3, 2013)

de.das.dude said:


> yess 45W! finally we can get away with just a heatsink! no more fans!



These APUs are pretty toasty. I've seen someone on an OC forum with a 6800K at 80C, and it was hardly OC'd.


----------



## de.das.dude (Jul 3, 2013)

torgoth said:


> these apus are pretty toasty Ive seen someone on OC forum that had a 6800k at 80C, and it was hardly oc'd



That was 100W TDP; these are less than half that. Also, the temperature isn't determined by wattage alone; if you have poor heat convection, you will get higher temps.

I have a 95W CPU and mine maxes out at 45C. If I had a plain aluminium block for a heatsink, I would hit 100C.

BTW, 80C is impossible; it will shut off after 71C.


----------



## Ikaruga (Jul 3, 2013)

Jorge said:


> These APUs combine the performance of a CPU and a GPU so the TDP ratings are extremely good considering the application.



No, just look around the Internet at how they perform in tests power-consumption-wise. AMD CPUs are (sadly) leaking all over the place, and these new ones are probably from the same breed, just cherry-picked or undervolted/underclocked.


----------



## TheoneandonlyMrK (Jul 3, 2013)

Ikaruga said:


> No, just look around the Internet, how they perform in tests power-consumption wise. AMD CPUs are (sadly) leaking all over the place, and these new ones are probably from the same breed, just cherry picked or undervolted/clocked.



And what? That's the exact point of the selective binning that all chip makers do. Just look around the internet; you might try looking up Intel's new SDP spec for mobile chips while you're at it.

Temps mean nothing; my main rig's OC never breaches 62 degrees..... @ 1.48V CPU.
How do you call this leaky? Even 100W TDP is amazing given the total package performance. The CPU is fine for what it is, the GPU is great for what it is, and at typical idle values totaling 40-100W for the whole system, compared to Intel they are doing OK.

Imagine an AMD, or an ARM for that matter, at process-node parity with Intel, and it's clear: Intel is ahead by a nose/node.


----------



## Ikaruga (Jul 3, 2013)

theoneandonlymrk said:


> And what ,,, thats the Exact point of select binning that all chip mofos do, just look around the internet, might try looking up intels new SDP spec on chips for mobile while your at it.
> 
> temps mean shit my main rigs oc never breaches 62 degrees..... @1.48CPUV
> how do you call this leaky even 100 watts tdp is amazeing given the total package performance , the cpu is fine for what it is the gpu is great for what it is and at typical idle values totaling 40-100watt for the whole system compared to intel they are doing ok
> ...



The GPU is great indeed, but the CPU is far from it.  It's not even close to Sandy Bridge level, let alone the rest.


----------



## drdeathx (Jul 3, 2013)

arbiter said:


> Question is, is it really cpu using that power or is it something in the chipset doing it? When power usage is checked its done usually at the wall which takes on everything in the computer.
> 
> 
> 
> Consider their power useage vs intel at same TDP, AMD looks like 500 lb guy at McDonalds. only thing they beat intel at is built in graphic's.



There is not much difference. Actually, at idle, the A10-5800K draws less power overall. The A10-5800K is head to head with the 3220 at default and owns it in graphics. Multi-threaded it beats the 3220, and overclocked it owns the 3220...



Ikaruga said:


> No, just look around the Internet, how they perform in tests power-consumption wise. AMD CPUs are (sadly) leaking all over the place, and these new ones are probably from the same breed, just cherry picked or undervolted/clocked.



I want what he is smoking


----------



## TheoneandonlyMrK (Jul 4, 2013)

Ikaruga said:


> The GPU is great indeed, but the CPU is far from it.  It's not even close to Sandy Bridge level, let alone the rest.



Yet it's beyond what its typical use profile demands of its performance, and beyond what most will require of it, given the HTPC/parents'-PC nature of these chips.


----------



## newtekie1 (Jul 4, 2013)

torgoth said:


> these apus are pretty toasty Ive seen someone on OC forum that had a 6800k at 80C, and it was hardly oc'd



I find that hard to believe since my 5600K clocked to 4.0GHz on the stock cooler and temps didn't break 65°C.


----------



## drdeathx (Jul 4, 2013)

torgoth said:


> these apus are pretty toasty Ive seen someone on OC forum that had a 6800k at 80C, and it was hardly oc'd



You have a cooling problem; your cooler is most likely not properly installed. They do not run hot.


----------



## TheinsanegamerN (Jul 4, 2013)

Jorge said:


> While 45w APUs are fine it's absurd to conclude that a 100w or 65w APU consumes too much power. That's the TDP and rarely ever seen in actual use. These APUs combine the performance of a CPU and a GPU so the TDP ratings are extremely good considering the application.



The A10-5800K has been seen using as much as 161 watts at full load during gameplay.


----------



## newtekie1 (Jul 4, 2013)

TheinsanegamerN said:


> the a10-5800k has been seen using as much as 161 watts at full load during gameplay.



I'd like to see where you saw that.


----------



## cdawall (Jul 4, 2013)

de.das.dude said:


> that was 100W tdp. these are less than half. also the temp has nothing to do with the wattage, if you have poor rate heat convection, you will get higher temps.
> 
> i have a 95W cpu and mine maxes out at 45C. if i had an aluminium block for a heatsink, i would achieve 100C
> 
> ...



Not true; there are pics floating around of mine running at 80C+. They throttle at 72C; they won't shut down unless the BIOS is configured that way.

Also, AMD has been shipping 95W CPUs with the aluminum non-heatpipe coolers for a while. They don't "achieve 100C"; they run under 60C, but the fan is screaming in warmer climates.


----------



## newtekie1 (Jul 4, 2013)

cdawall said:


> Not true there are pics floating around of mine running at 80C+. They throttle at 72C they wont shut down unless the bios is configured that way.
> 
> Also AMD has been shipping out 95W CPU's with the aluminum non-heatpipe coolers for a while. They don't "achieve 100C" they run under 60C, but the fan is screaming in warmer climates.



Yep, my A8 came with the aluminum-block-style heatsink, and even overclocked it stayed under 65°C, and it's rated at 100W.


----------



## cdawall (Jul 4, 2013)

newtekie1 said:


> Yep, my A8 came with the aluminum block style heatsink and even overclocked it stayed under 65°C, and its rated at 100w.



Weird, that. I have noticed a certain someone seems to have a habit of shooting from the hip with no real-world experience or knowledge of the matter, just personal assumptions on his part.


----------



## de.das.dude (Jul 4, 2013)

drdeathx said:


> There is not much difference. Actually at Idle, A10-5800K runs less power overall. The A10-5800K is head to head with the 3220 at default and owns it in graphics. Multi-threaded it beats the 3220 and overclocked it owns the 3220...
> 
> 
> 
> I want what he is smoking



you should put that article in your signature.


----------



## torgoth (Jul 4, 2013)

http://www.overclock3d.net/reviews/cpu_mainboard/amd_a10-6800k_richland_overclocked_review/3
scroll down to CPU load temps


----------



## de.das.dude (Jul 4, 2013)

torgoth said:


> http://www.overclock3d.net/reviews/cpu_mainboard/amd_a10-6800k_richland_overclocked_review/3
> scroll down to CPU load temps



I call BS on that. Probably not a professional reviewer, and definitely one of those AMD haters.

Max temp before thermal shutdown is 74C on that chip. So either he exaggerated, or he was using onboard sensors and software to monitor temperatures, which are never really 100% accurate.


----------



## torgoth (Jul 4, 2013)

Anyway, that's not the source I was speaking of initially; it was someone's personal temps on OC.net. Either way, the only reason I said it is so that maybe you double-check before you use a heatsink without a fan, so you don't damage your CPU.


----------



## de.das.dude (Jul 4, 2013)

Umm... passive cooling doesn't necessarily mean using any heatsink without a fan.
Passive heatsinks are specially designed to create their own airflow, or to utilize existing airflow.


----------



## cdawall (Jul 4, 2013)

torgoth said:


> http://www.overclock3d.net/reviews/cpu_mainboard/amd_a10-6800k_richland_overclocked_review/3
> scroll down to CPU load temps



Interesting; this one review must be correct, not the hundreds of users on this forum alone. Interestingly enough, his load temps with an OC are lower than stock, even lower than the lower-clocked 5800K's. That reviewer either had something terribly wrong with his setup or is a bald-faced liar.


----------



## de.das.dude (Jul 4, 2013)

I googled his name; seems he reviews everything from computers to airsoft guns.



cdawall said:


> Interesting enough his load temps with an oc are lower than stock even lower than the lower clocked 5800K



lol i didnt even see that XD


----------



## torgoth (Jul 4, 2013)

de.das.dude said:


> yess 45W! finally we can get away with just a heatsink! no more fans!





de.das.dude said:


> umm.. passive cooling doesnt necessarily mean using any heatsink without a fan.
> passive heatsinks are specially designed to create their own air flow, or utilize existing air flow.



yeah passive cooling, glad u cleared it up

***






http://www.xbitlabs.com/articles/cpu/display/amd-a10-6800k_9.html#sect0
related info > second paragraph after the 1st screenshot...


----------



## unholythree (Jul 4, 2013)

*IOMMU, my friends.*

My interest in Richland is mainly the cheap access to new(ish) virtualization tech. Intel's chips are faster, there is little doubt of that, but you have to pay through the nose for any premium features.

The only thing I'm pissed about is that AMD seems to have broken its tradition of almost-universal ECC support with these newfangled APUs.

In any case, if I can build a whitebox VM server dirt cheap, and with low heat/power, I'm happy to give AMD my money.


----------



## de.das.dude (Jul 4, 2013)

torgoth said:


> yeah passive cooling, glad u cleared it up
> 
> ***
> http://www.xbitlabs.com/images/cpu/amd-a10-6800k/05.png
> ...



And your point being? I still see 61C for the CPU...

If you are comparing the GPU temp, know that GPUs always run hotter than CPUs; 100C to a GPU is like 80C to a CPU.

A lot of GPUs sit at 90C under load with stock coolers.


----------



## torgoth (Jul 4, 2013)

de.das.dude said:


> and you point being? i still see 61C for the CPU...
> 
> if you are comparing the GPU temp, know that GPUs always run hotter than CPUs. 100C to a gpu is like 80C to a CPU..
> 
> a lot of GPUs stay at 90C load, with stock coolers.



I see 118C.



			
xbitlabs.com said:

> By the way, the peak permissible temperature for the A10-6800K is 120°C, so our overclocking was limited by the cooler's performance.


----------



## cdawall (Jul 4, 2013)

torgoth said:


> I see 118



First off, AMD is known for bad on-die thermal sensors. The 118C you see is the on-die sensor; the socket temp is only 61C on the Asus board. It is pretty obvious the on-die sensors are wrong. 118C would crash, no question, and if you look, his normal temp is 56C while pulling only 84 watts. Yes, that software is 100% believable.

Don't worry, though: the on-die temp sensor for the video card reads as low as 2C and peaks at 29C. So somehow, a couple of nanometers away, the CPU is running at triple that temperature without bleeding over to the at-times almost-frozen GPU core.


----------



## de.das.dude (Jul 4, 2013)

torgoth said:


> I see 118



As already stated, it would crash. Google the system specs of the 6800K; its max is 74C.
Check the CPU-World database.


----------



## Ikaruga (Jul 4, 2013)

drdeathx said:


> I want what he is smoking



As a grown-up man, I do not smoke anything, and since you are continuously insulting and disrespecting me here, I'm seriously thinking about feeding your name to one of my favorite scripts here, vbulletin:totalignore. I know you don't give a damn; just saying that you might find yourself talking to a wall one day.

The 5800K eats about 85W at stock under full load, and power consumption just skyrockets when you start OC-ing (reaching 130-150W is a piece of cake, really). I would go for the slowest i5 + GTX 650 any time, tbh.


----------



## Nordic (Jul 4, 2013)

newtekie1 said:


> I find that hard to believe since my 5600K clocked to 4.0GHz on the stock cooler and temps didn't break 65°C.



My 5800K hit 55C, but that was in an open-air environment with the stock cooler.



TheinsanegamerN said:


> the a10-5800k has been seen using as much as 161 watts at full load during gameplay.



My 5800K system uses 155W; that is the whole system under load.


----------



## cdawall (Jul 4, 2013)

Ikaruga said:


> As a grown up man, I do not smoke anything, and since you are continuously insulting and disrespect me here, I'm seriously thinking about telling your name to one of my favorite script here, vbulletin:totalignore. I know you don't give a damn, just saying that you might find yourself talking to a wall one day.
> 
> The 5800K eats about 85W on stock at full load, and power consumption just skyrockets when you start OC-ing (achieving 130-150W is a piece of cake, really is). I would go for the slowest i5 + GTX650 any time tbh.



That's nice, but this is a thread about AMD APUs; no one cares that you would spend more money and get an i5 + GTX 650.


----------



## newtekie1 (Jul 4, 2013)

torgoth said:


> http://www.overclock3d.net/reviews/cpu_mainboard/amd_a10-6800k_richland_overclocked_review/3
> scroll down to CPU load temps



I'm not sure what was wrong with his setup; he probably didn't attach the cooler correctly, because:

4.0GHz Stock Cooler





4.4GHz TRUE Cooler


----------



## TheinsanegamerN (Jul 4, 2013)

james888 said:


> My 5800k had 55c, but that was in an open air environment with the stock cooler.
> 
> 
> 
> My 5800k system uses 155w. That is the whole system under load.



So was the 161W: that system has the CPU, heatsink, ASRock A85X motherboard, and an SSD. Of course, every chip is different, and different brands of SSD use different amounts of power. Even so, everything else in that system pulls at most 35 watts.


----------



## TheinsanegamerN (Jul 4, 2013)

Now, the real question: will the 45W A10 have a lower-clocked A10-5800K GPU? And how does it compare to the 65W A10-6700?


----------



## newtekie1 (Jul 5, 2013)

TheinsanegamerN said:


> so was the 161w. that has the cpu, heatsink, asrock a85x motherboard, and ssd. of course, every chip is different, and different brands of ssd use different amounts of power. even so, everything else in that system pulls at most 35 watts.



Why talk about full system load and try to estimate how much of that is actually the CPU? There are reviews that have measured just the CPU's power consumption, so you don't need to estimate:

There you go: under load, 85W. The APU is consuming 85W. Either the 161W you saw was for an overclocked chip, or the rest of the system is consuming ~75W.


----------



## de.das.dude (Jul 5, 2013)

Of course his cooler wasn't properly attached; his overclocked load temps are lower than his stock load temps.


----------



## NeoXF (Jul 5, 2013)

Ikaruga said:


> As a grown up man, I do not smoke anything, and since you are continuously insulting and disrespect me here, I'm seriously thinking about telling your name to one of my favorite script here, vbulletin:totalignore. I know you don't give a damn, just saying that you might find yourself talking to a wall one day.
> 
> The 5800K eats about 85W on stock at full load, and power consumption just skyrockets when you start OC-ing (achieving 130-150W is a piece of cake, really is). I would go for the slowest i5 + GTX650 any time tbh.



So what you're saying is, a system that's twice as expensive is a more powerful option? Oh, would've never thought of that...

OK, for anyone who doesn't know or hasn't figured it out yet: the A10-6700T is actually a desktop version of the mobile Richland A10-5750M, but with a 10W-higher TDP... 2.5-3.5 GHz CPU clocks, 533-720 MHz GPU clocks, and 1866 MHz DDR3 support.



----------



## Bjorn_Of_Iceland (Jul 5, 2013)

meanwhile enjoying some energy efficiency on my proc since day 1.


----------



## de.das.dude (Jul 5, 2013)

e 350 XD


----------



## Bjorn_Of_Iceland (Jul 5, 2013)

heheh XD


----------



## Ikaruga (Jul 5, 2013)

cdawall said:


> That's nice this is a thread about AMD APU's no one cares that you would spend more money and get an i5+GTX650.





NeoXF said:


> So what you're saying is, a system that's twice as expensive is a more powerul option? Oh, would've never thought of that...
> 
> 
> OK, for anyone who doesn't know or hasn't figured it out yet... A10-6700T is actually a desktop version of the mobile Richland A10-5750M... but with a 10W higher TDP... 2,5-3,5GHz CPU clocks, 533-720MHz GPU clocks and 1866MHz DDR3 support.
> ...



I love that everybody takes only the very last sentence of my comment(s) and runs with that. I was talking about the subject in my previous post, namely that AMD CPUs are simply leaking crap. They eat a ton of electricity and also heat up quite easily (though the IGP parts are decent). They are basically selling stuff in 2013 that is on par with what Intel had five years(!) ago; that's a lot of time in computer-technology terms, tbh, and it should be half the price at least. The problem is that people keep buying from them only because they are cheaper, so they are not really forced to start producing better chips.

*ps.:* It also must be noted that the i5+GTX650 combo is not really twice as expensive, because you have to OC the AMD chip while you can undervolt/underclock the Intel+NV combo at the same time, and if you add up the difference in, say, three years of electricity costs, you might end up not that far apart.
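(The three-years-of-electricity argument boils down to simple arithmetic. Everything in the sketch below, the wattage gap, the 4 h/day usage, and the 0.20-per-kWh price, is an illustrative assumption, not a measured figure.)

```python
# Illustrative three-year running-cost comparison between two builds.
# All wattages, hours, and the per-kWh price are assumed for the sketch.

def three_year_cost(avg_watts, hours_per_day=4, price_per_kwh=0.20, years=3):
    """Energy drawn over the period, in kWh, times the unit price."""
    kwh = avg_watts / 1000.0 * hours_per_day * 365 * years
    return kwh * price_per_kwh

oc_apu = three_year_cost(150)   # overclocked A10-5800K under load (assumed)
i5_650 = three_year_cost(110)   # undervolted i5 + GTX 650 (assumed)
print(round(oc_apu - i5_650, 2))  # cost gap over three years
```

Whether that gap actually closes the up-front price difference depends entirely on the assumed usage hours and local electricity rates, which is why the argument cuts both ways.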


----------



## drdeathx (Jul 5, 2013)

Ikaruga said:


> As a grown up man, I do not smoke anything, and since you are continuously insulting and disrespect me here, I'm seriously thinking about telling your name to one of my favorite script here, vbulletin:totalignore. I know you don't give a damn, just saying that you might find yourself talking to a wall one day.
> 
> The 5800K eats about 85W on stock at full load, and power consumption just skyrockets when you start OC-ing (achieving 130-150W is a piece of cake, really is). I would go for the slowest i5 + GTX650 any time tbh.



Here are some power consumption numbers. Keep in mind the 5800K is at 3.9GHz and the Intel 3220 is at 3.3GHz....

These numbers were done with a dedicated 7970. Core for core, you're probably looking at a 30-40 watt difference at 3.3GHz, but you cannot. The difference really is not much. Don't get so bent out of shape, BTW.










This is also at full load on both GPU and CPU, and these numbers will never be hit in daily use; thus the smoking comment... LOL


----------



## Ikaruga (Jul 5, 2013)

drdeathx said:


> Here is some power consumpton numbers. Keep in mind the 5800K is at 3.9GHz and Intel 3220 is at 3.3GHz....
> 
> These numbers we done with a dedicated 7970. Core for core your probably looking at 30-40 watt difference at 3.3GHz but you cannot. The difference really is not much. Don't get so bent out of shape too BTW.
> 
> ...



I have no idea (none whatsoever, tbh) how this relates to this thread (or where it's from), but an i3 PC that eats 95W when it's idle? Seriously..... 95W at idle? Are you trolling?


----------



## newtekie1 (Jul 5, 2013)

Ikaruga said:


> I do not have any idea (none whatsoever tbh), how this comes to this thread (and from where?), but an i3 PC which eats 95W when it's idle? Seriously..... 95W when it's idle? Are you trolling?



It has a 7970 in the system, did you even bother to read?


----------



## arbiter (Jul 5, 2013)

Ikaruga said:


> I do not have any idea (none whatsoever tbh), how this comes to this thread (and from where?), but an i3 PC which eats 95W when it's idle? Seriously..... 95W when it's idle? Are you trolling?



Well, my guess is that it's because the 7970 GPU was paired in: AMD does things in their chipsets to help performance and power usage between their GPUs and CPUs, and this is probably the reason idle power usage is higher. The reviewer should have dropped in an Nvidia card to see whether the results are in the same range or not.

That being said, NO ONE would pair one of the LOWEST-end i3 CPUs with a top-of-the-line GPU. It's an oxymoron.


----------



## drdeathx (Jul 5, 2013)

Ikaruga said:


> I do not have any idea (none whatsoever tbh), how this comes to this thread (and from where?), but an i3 PC which eats 95W when it's idle? Seriously..... 95W when it's idle? Are you trolling?



Ummm, no, I did a head-to-head with the i3. Those are MY results....

Granted, if you read my response, it was done with a 7970..... Just showing the difference between the i3 3220 and the 5800K. Now, the APUs AMD is releasing have a lower core speed and lower voltage, so the differences between the i3 and the ones posted in this thread will be minimal; thus, again, the smoking comment.



arbiter said:


> Well what i guess is since 7970 gpu was paired in, AMD does do things in their chipset to help performance and power usage with their gpu's and cpu's. This is probably reason why power usage idle is higher. The Reviewer should dropped in an Nvidia card to see the results are in same range or not.
> 
> With that being said, NO ONE would pair one the LOWEST end i3 cpu's with a top of the line gpu. its an Oxymoron



Agreed. I only had a 7970 at the time of the head-to-head, and I was looking only at CPU-vs-APU performance, not graphics, in real-world benchmarks... The 5800K performed pretty much head to head with the i3 3220, being slightly slower in single-threaded benchmarks and ahead in multi-threaded ones. Overclocked, the 5800K pretty much owned the 3220. Now, that was Ivy; Haswell will surely be a bit more efficient, and as we recently saw, its single-threading is the same as Ivy's, but its multi-threading is a bit better.




newtekie1 said:


> It has a 7970 in the system, did you even bother to read?




Thanks, Teckie... obviously the answer is that he didn't.


----------



## Ikaruga (Jul 5, 2013)

drdeathx said:


> Ummm no, I did a head to head with the i3. Those are MY results....
> 
> 
> Granted, if you read my response, it was done with a 7970..... Just showing the difference between 13 3220 and 5800K. Now the APU's AMD is releasing is a lower core speed and lower voltage so the differences between i3 and the ones that are posted in the thread will be minimal thus again, the smoking comment.


I obviously saw that you used a 7970; I still don't get why you replied that to me, or why you would post it here (or anywhere, tbh)... and 95W is still too much.



drdeathx said:


> Agreed. I only had a 7970 at the time of the head to head and was looking only at CPU vs APU performance and not graphics in real world benchmarks... The 5800K performed pretty much head to head with the i3 3220 with the 5800K being slightly slower in single threaded benchmarks and ahead in multi threaded. Overclocked, the 5800K pretty much owned the 3220. Now that was Ivy, Haswell will surely be a bit more efficient and as we have recently saw, single threading is the same as Ivy but multi threading is  bit better thn Ivy with Haswell.


Again, as soon as you overclock that thing, you might as well go for an i5, since you will end up with the same cost in the long run. Hell, you actually made me Google things I was sure about and knew already :/






























----------



## cdawall (Jul 5, 2013)

Ikaruga said:


> I love that everybody takes only the very last sentence of my comment(s) and go on with that. I was talking about the subject in my previous post, namely that AMD CPUs are simply leaking crap. They eat a ton of electricity and also heat up quite easy (but the IGP parts are decent ones tho). They are basically selling stuff in 2013 which is on par with what Intel had 5 years(!) ago, that a lot of time in computer technology terms tbh, and it should be half the price at least. The problem is that people keep buying from them only because they are cheaper, so they are not really forced to start producing better chips.
> 
> *ps.:* It's also must be noted that the i5+GTX650 combo is not really twice as expensive, because you have to OC the AMD chip while you can undervolt/clock the Intel+NV combo at the same time, and if you add up the difference in (let's say) three years of electricity costs, you might end up not that far perhaps.



First off, the electricity cost will take well over a decade to even out at my current pricing. So that is moot.

Second, I don't think you actually know what high-leakage chips are. It has nothing to do with wattage; there are high-leakage chips from any and all manufacturers. You might want to sit down and do some more research about how a transistor works. You also may want to look into what an AMD chip is better at. There are people who use them (like the entire server industry) because in a highly multithreaded environment they perform better.

Not aimed at you, but I just want to kick it out there: "real-world benchmarks" is an oxymoron. Use both this and the i3 in the real world and try to spot the difference. There isn't an office program or internet browser that will be mystically faster on anything beyond a five-year-old Core 2 Duo. Remember, that's what most PC owners use their PC for.


----------



## Ikaruga (Jul 5, 2013)

cdawall said:


> First off the electricity cost will take well over a decade to even out with my current pricing. So that is mute.
> 
> Second off I don't think you actually know what high leakage chips are. It has nothing to do with wattage. There are high leakage chios from any and all manufacturers. You might want to sit down and do some more research about how a transistor woks. You also may want to look into what an AMD chip is better at. There are people that use them (like the entire server industry) because in a hughly multithreaded environment they are better performing.
> 
> Not aimed at you but I just want to kick it out there "real world benchmarks" is an oxymoron. Use both this and the i3 in the real world and try to spot the difference. There isn't an office program or internet browser that will be mystically faster on anything beyond a 5 year old core2duo. Remember that's what most pc owners use there pc for.



I know what "leakage chips" are, and I also know well how transistors work. I also know where AMD chips are better, but we are talking about APUs here (and to stay on topic, I do think AMD APUs have a great future, as I stated on this forum quite a long time ago). The thing is that these chips are just not good enough to replace the CPU + video card combo in desktops (well, almost there, but not yet), and, in my opinion, AMD seriously needs to work on its CPU cores, both IPC- and power-consumption-wise, while I also think its GPUs and IGPs are already great.

*ps.:* I know what you mean; for example, I'm typing this from a Socket 775 rig. It's not my main computer, of course, but it's what I have hooked up to the big screen; it's dead silent and looks awesome, so I love it. It's already more than enough for browsing the net, so I refuse to replace it while it still works.


----------



## de.das.dude (Jul 5, 2013)

cdawall said:


> Not aimed at you but I just want to kick it out there "real world benchmarks" is an oxymoron. Use both this and the i3 in the real world and try to spot the difference. There isn't an office program or internet browser that will be mystically faster on anything beyond a 5 year old core2duo. Remember that's what most pc owners use there pc for.



amen


----------



## cdawall (Jul 5, 2013)

Ikaruga said:


> I know what "leakage chips" are, and I also know well how transistors work. I also know what AMD chips are better at, but we are talking about APUs here (and to stay on topic, I do think that AMD APUs have a great future, as I stated on this forum quite a long time ago). The thing is that these chips are just not good enough to replace the CPU+video-card combo in desktops (well, almost there, but not yet) and - in my opinion - AMD seriously needs to work on their CPU cores, both IPC- and power-consumption-wise, while I also think that their GPUs and IGPs are already great.
> 
> 
> *P.S.:* I know what you mean; for example, I'm typing this from a socket 775 rig. It's not my main computer of course, but this is what I have hooked up to the big screen; it's dead silent and looks awesome, so I love it. It's already more than enough for browsing the net, so I refuse to replace it while it works.



Really, you know what a high-leakage chip is? Well, explain away. You have not used the term correctly in a single one of your posts. So what you know and what you think you know might need some realigning.

Also, I think you fail to see the reason behind HSA and AMD's new cores. You might want to go look into them before calling them bad designs. Realize that AMD has made a package it can add and remove x86, ARM, and GPU cores from willy-nilly to suit whatever needs it has. Welcome to the bigger picture. :shadedshu


----------



## Ikaruga (Jul 5, 2013)

cdawall said:


> Really, you know what a high-leakage chip is? Well, explain away. You have not used the term correctly in a single one of your posts. So what you know and what you think you know might need some realigning.
> 
> Also, I think you fail to see the reason behind HSA and AMD's new cores. You might want to go look into them before calling them bad designs. Realize that AMD has made a package it can add and remove x86, ARM, and GPU cores from willy-nilly to suit whatever needs it has. Welcome to the bigger picture. :shadedshu


I think I only said once that it's "leaking all over the place" or something similar, and I never called them a bad design, nor did I ever think such a thing. Having thought about and worked with/on computers every day since the '80s, I do understand all the aspects you mentioned, and moreover I think that its scalability is well beyond its competition. My point was that it's not good enough in its current state to replace the gfx-card+Intel combo in desktops (please note that this is a PC enthusiast site), and whoever says otherwise is just excited about its powerful IGP part, imo. 
Instead of improving them, AMD keeps selling these APUs, which consume too much power and lack single-threaded speed compared to the competition. Yes, I understand they "found" some bins now which don't eat that much power if underclocked, but that's not something I would call great "progress". I do root for AMD to make something which stomps Intel into the ground, because fierce competition would be the best thing to happen to us, but I just don't see it CPU-wise in Trinity/Richland. It's still not enough to run AAA titles, so an i3-3225 is more than enough anywhere you'd want an APU like this. Perhaps the next one will finally be much better, and believe me, I will be the first to praise and acknowledge a job well done.


----------



## drdeathx (Jul 5, 2013)

Ikaruga said:


> I know what "leakage chips" are, and I also know well how transistors work. I also know what AMD chips are better at, but we are talking about APUs here (and to stay on topic, I do think that AMD APUs have a great future, as I stated on this forum quite a long time ago). The thing is that these chips are just not good enough to replace the CPU+video-card combo in desktops (well, almost there, but not yet) and - in my opinion - AMD seriously needs to work on their CPU cores, both IPC- and power-consumption-wise, while I also think that their GPUs and IGPs are already great.
> 
> 
> *P.S.:* I know what you mean; for example, I'm typing this from a socket 775 rig. It's not my main computer of course, but this is what I have hooked up to the big screen; it's dead silent and looks awesome, so I love it. It's already more than enough for browsing the net, so I refuse to replace it while it works.



APUs are replacing desktops with GPUs ATM. Heck, a huge part of the gaming community now uses laptops to game. It will not be long for APUs to catch up to the graphics in high-end laptops, making the dedicated GPU market shrink further. The performance of these APUs is more than good enough, and their CPU cores are fine. This is what you were criticizing earlier...


----------



## Ikaruga (Jul 5, 2013)

drdeathx said:


> APUs are replacing desktops with GPUs ATM. Heck, a huge part of the gaming community now uses laptops to game. It will not be long for APUs to catch up to the graphics in high-end laptops, making the dedicated GPU market shrink further. The performance of these APUs is more than good enough, and their CPU cores are fine. This is what you were criticizing earlier...



Yes, it always happens like this, and we have talked about it many times, like we did more than a year ago in this other thread: 





Ikaruga said:


> This is just progression, and it has been happening for as long as I've known computers.
> As technology advances, they put more and more components into one chip to make it more efficient, but as those components get more and more complex (or when new components are introduced), they have to make them separate, because they get too big.
> And the cycle continues...





Ikaruga said:


> No, I meant something like the "Wheel of Reincarnation", the thing that happened with the coprocessors, the CPU instruction extensions, and the various controllers, or perhaps *what will happen (or is already happening?) with the graphics (GPU)* or audio *chips* in the future.



That's why I think that AMD is on the right track with the APUs, which are really great; I just want them to finally make good cores, and I get a little sad when they come out with things like these. And their CPU cores are not "fine" by my standards.


----------



## drdeathx (Jul 5, 2013)

Ikaruga said:


> Yes, it always happens like this, and we have talked about it many times, like we did more than a year ago in this other thread:
> 
> 
> 
> That's why I think that AMD is on the right track with the APUs, which are really great; I just want them to finally make good cores, and I get a little sad when they come out with things like these. And their CPU cores are not "fine" by my standards.



The Piledriver cores are pretty damn good. Heck, Trinity with Piledriver cores is pretty much on par with Phenom II performance, with an on-die GPU, for $140. That is a great milestone in my opinion. If AMD keeps getting 10-15% better performance and continues to shrink the die with more transistors, APUs may kill the CPU market, which I think they are doing. Intel does not call them APUs, but the GPU is on the die up through the 4770K.


----------



## Fourstaff (Jul 5, 2013)

cdawall said:


> First off, the electricity cost will take well over a decade to even out with my current pricing. So that is moot.



Consider yourself lucky; if you were paying less than £1/watt/year (based on a 12hr-per-day usage pattern) over this side of the pond, I'd think you had signed up with the cheapest electricity provider ever.


----------



## newtekie1 (Jul 5, 2013)

Fourstaff said:


> Consider yourself lucky; if you were paying less than £1/watt/year (based on a 12hr-per-day usage pattern) over this side of the pond, I'd think you had signed up with the cheapest electricity provider ever.



Currently I pay ~$0.12/kWh. So let's assume I'm using the computer for 12 hours a day and the power consumption difference is 50W. The money saved by using the lower-powered computer is about ~$26 a year. If I only use the computer 8 hours a day, the savings drop to ~$17.50 a year.
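For anyone who wants to rerun these figures with their own rate and usage pattern, the arithmetic can be sketched in a few lines of Python (the function name is just illustrative):

```python
def annual_savings(watt_diff, hours_per_day, price_per_kwh, days=365):
    """Yearly cost of a constant extra draw of `watt_diff` watts."""
    kwh_per_year = watt_diff * hours_per_day * days / 1000
    return kwh_per_year * price_per_kwh

# 50 W difference at $0.12/kWh
print(round(annual_savings(50, 12, 0.12), 2))  # 26.28 -> "~$26 a year"
print(round(annual_savings(50, 8, 0.12), 2))   # 17.52 -> "~$17.50 a year"
```

The same function reproduces the UK numbers further down the thread: at 16p/kWh and 12hr/day, a single watt costs `annual_savings(1, 12, 0.16)` ≈ £0.70 per year.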


----------



## Fourstaff (Jul 5, 2013)

newtekie1 said:


> Currently I pay ~$0.12/kWh. So let's assume I'm using the computer for 12 hours a day and the power consumption difference is 50W. The money saved by using the lower-powered computer is about ~$26 a year. If I only use the computer 8 hours a day, the savings drop to ~$17.50 a year.



16p/kWh here, for 70p/W/yr at 12hr a day. For a 20W difference it's £14/year, and judging by how long-lasting i7 920 rigs are, we are looking at at least 5 years of usage. So that gives you £70 in difference, or the price difference between getting an i5-3570K and a 5800K. Now, I know that the £70 is on the high side (and quite unrealistic given that the difference is less than 20W), but even at half that value I think people would appreciate a much faster chip for a bit more money spent overall. That is before overclocking.


----------



## cdawall (Jul 5, 2013)

Fourstaff said:


> 16p/kWh here, for 70p/W/yr at 12hr a day. For a 20W difference it's £14/year, and judging by how long-lasting i7 920 rigs are, we are looking at at least 5 years of usage. So that gives you £70 in difference, or the price difference between getting an i5-3570K and a 5800K. Now, I know that the £70 is on the high side (and quite unrealistic given that the difference is less than 20w), but even at half that value I think people would appreciate a much faster chip for a bit more money spent overall. That is before overclocking.



That doesn't include a GPU. The i5-3570K isn't even in the same league for onboard graphics, and that is the entire point of the A10-5800K/6800K.


----------



## Fourstaff (Jul 5, 2013)

cdawall said:


> That doesn't include a GPU.



If the IGP of the 5800K is powerful enough, and the IGP of the 3570K is not, then obviously the 5800K is the clear winner. However, if we are going for a discrete solution, I think it's only fair that both use the same card (and consequently should have roughly the same power consumption).


----------



## cdawall (Jul 5, 2013)

Fourstaff said:


> If the IGP of the 5800K is powerful enough, and the IGP of the 3570K is not, then obviously the 5800K is the clear winner. However, if we are going for a discrete solution, I think it's only fair that both use the same card (and consequently should have roughly the same power consumption).




http://images.anandtech.com/graphs/graph6347/50410.png

Looking at that graph from above, the 20W you are "saving" only applies under load. So let me ask: how long is your CPU under 100% load? Because at idle AMD is doing better. And you should be comparing the FX-6300 or FX-8320 if you want an actual CPU that competes with the 3570K; why you would compare it to an APU is beyond me. You do know the 5800K competes directly with the FX 4xx0 series, right?

So comparing on that side, the 3570K is $220, the 4300 is $100, and the 6300 is $120. Let's compare CPUs to CPUs and not APUs, since you are running a discrete card in your scenario.


----------



## Fourstaff (Jul 5, 2013)

cdawall said:


> http://images.anandtech.com/graphs/graph6347/50410.png
> 
> Looking at that graph from above, the 20W you are "saving" only applies under load. So let me ask: how long is your CPU under 100% load? Because at idle AMD is doing better. And you should be comparing the FX-6300 or FX-8320 if you want an actual CPU that competes with the 3570K; why you would compare it to an APU is beyond me. You do know the 5800K competes directly with the FX 4xx0 series, right?
> 
> So comparing on that side, the 3570K is $220, the 4300 is $100, and the 6300 is $120. Let's compare CPUs to CPUs and not APUs, since you are running a discrete card in your scenario.



Yes, you are right. How I arrived at the conclusion that the 3570K idles at less than 20W below the 5800K is beyond me, but it's probably from some obscure test which involved undervolting and underclocking (and consequently not an apples-to-apples comparison). Anyway, I was trying to highlight the fact that power consumption can indeed make a difference in price (and, situationally, cooling costs), something which people should be aware of rather than brush under the rug. That works perfectly fine in the States, where electricity is cheap and plentiful, but over this side of the pond electricity is quite a lot more expensive.


----------



## cdawall (Jul 5, 2013)

Fourstaff said:


> Yes, you are right. How I arrived at the conclusion that the 3570K idles at less than 20W below the 5800K is beyond me, but it's probably from some obscure test which involved undervolting and underclocking (and consequently not an apples-to-apples comparison). Anyway, I was trying to highlight the fact that power consumption can indeed make a difference in price (and, situationally, cooling costs), something which people should be aware of rather than brush under the rug. That works perfectly fine in the States, where electricity is cheap and plentiful, but over this side of the pond electricity is quite a lot more expensive.



Your CPU is still not under load that often; unless you are gaming/encoding 8 hours a day, I fail to see when the 3570K's power savings will pay for themselves. Even then, if you are encoding, there is a good chance the AMD chip will be faster at it, so it may be a moot point.


----------



## Fourstaff (Jul 5, 2013)

cdawall said:


> Your CPU is still not under load that often; unless you are gaming/encoding 8 hours a day, I fail to see when the 3570K's power savings will pay for themselves. Even then, if you are encoding, there is a good chance the AMD chip will be faster at it, so it may be a moot point.



No, the 3570K will not save enough power to justify the increase in cost against the 5800K; if anything, it takes more power. I was thinking about the hypothetical (and off-topic) 3770K vs. 8350, both overclocked.


----------



## cdawall (Jul 5, 2013)

Fourstaff said:


> No, the 3570K will not save enough power to justify the increase in cost against the 5800K; if anything, it takes more power. I was thinking about the hypothetical (and off-topic) 3770K vs. 8350, both overclocked.



There was a review done on that. It would take several years of running your PC at 100% load for 8-12 hours a day to make a difference.

Both have good idle states, which covers browsing the web and what most people do with their PC the vast majority of the time.


----------



## drdeathx (Jul 5, 2013)

Fourstaff said:


> Yes, you are right. How I arrived at the conclusion that the 3570K idles at less than 20W below the 5800K is beyond me, but it's probably from some obscure test which involved undervolting and underclocking (and consequently not an apples-to-apples comparison). Anyway, I was trying to highlight the fact that power consumption can indeed make a difference in price (and, situationally, cooling costs), something which people should be aware of rather than brush under the rug. That works perfectly fine in the States, where electricity is cheap and plentiful, but over this side of the pond electricity is quite a lot more expensive.



With all due respect, it doesn't idle less.


----------



## Fourstaff (Jul 5, 2013)

drdeathx said:


> With all due respect, it doesn't idle less.



With all due respect, I agreed with that in the first line of the post.


----------



## newtekie1 (Jul 12, 2013)

I finally managed to find my Kill-a-Watt:


http://img.photobucket.com/albums/v296/newtekie1/5600KPowerIdle.jpg~original


This is Rig2 in my sig so:
4.4GHz CPU Overclock
1013MHz GPU Overclock

Idle...50w!  And this is driving 3 monitors.


----------



## drdeathx (Jul 12, 2013)

Fourstaff said:


> Yes you are right. How I arrived at the conclusion that the 3570K idles less than 20w below the 5800K is beyond me, but its probably in some obscure test which involves undervolting and underclocking (and consequently not apples to apples comparison). Anyways I was trying to highlight the fact that power consumption can indeed make a difference in price (and, situationally, cooling costs), something which people should be aware of rather than brush it under. Works perfectly fine in the States where electricity is cheap and plentiful, but over this side of the pond electricity is quite a lot more expensive.



The 3570K is NOT the same apple, as you said... I have no idea why you're comparing power consumption from a chip that is over $200 on a more expensive platform. If someone is that concerned about power consumption, they can easily downclock and undervolt any CPU or APU. This discussion comparing the two is pointless.



newtekie1 said:


> I finally managed to find my Kill-a-Watt:
> http://img.photobucket.com/albums/v296/newtekie1/5600KPowerIdle.jpg~original
> 
> This is Rig2 in my sig so:
> ...



Very nice, Tekie. Every time the Trinity discussion comes up, those that argue against it don't own one... It is all conjecture on their part.


----------



## NeoXF (Jul 12, 2013)

arbiter said:


> With that being said, NO ONE would pair one of the LOWEST-end i3 CPUs with a top-of-the-line GPU. It's an oxymoron



I would. Given a budget of 500 bucks for CPU and GPU, exclusively for gaming (which is what I'd generally spend on), it'd be an i3-3240 + R7970, not an i5-3570K + R7870; I'm pretty sure the former would be a great deal faster than the latter setup in 90% of scenarios.


Anyway, as I've said before... the A10-6700T looks like a desktop version of the A10-5750M: same CPU clocks, same GPU clocks, same IMC speed support, just 10W more on the TDP (dunno why...).

This plus a nanoITX FM2 board and fast RAM would make a great SteamBox, LOL.


----------



## newtekie1 (Jul 12, 2013)

drdeathx said:


> Very nice, Tekie. Every time the Trinity discussion comes up, those that argue against it don't own one... It is all conjecture on their part.



Exactly, and a lot of them think that overclocking AMD APUs will cause the power consumption to go "through the roof", but since I left all the power-saving features enabled and just raised the multiplier and load voltage, the idle power consumption stays pretty reasonable. And since this machine sits idle 90% of the time, it just sits there sipping power; it isn't gobbling down power like some say it will.  

I figured it out: at that power consumption, running 24/7/365 (because this is my server), this machine will cost me ~$55 a year to run. Even _if_ the 3220 consumes marginally less power at idle (and I don't believe it does), it isn't going to make but a few bucks' difference. The power consumption argument is pretty pointless.
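As a rough sanity check on that figure, assuming the ~$0.12/kWh rate quoted earlier in the thread and an average draw near the 50W idle reading:

```python
def annual_cost_24_7(avg_watts, price_per_kwh):
    """Yearly electricity cost for a machine that never sleeps."""
    return avg_watts * 24 * 365 / 1000 * price_per_kwh

# 50 W average at $0.12/kWh, running around the clock
print(round(annual_cost_24_7(50, 0.12), 2))  # 52.56, i.e. roughly $55/year
```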

I went with the APU over the 3220 because it gives me the great power consumption while idle AND the much more powerful CPU and GPU when I need it.  Oh, yeah, and the AMD build was cheaper.


----------



## NeoXF (Jul 14, 2013)

newtekie1 said:


> I went with the APU over the 3220 because it gives me the great power consumption while idle AND the much more powerful CPU and GPU when I need it.  Oh, yeah, and the AMD build was cheaper.



I'm also guessing the FM2 motherboard is quite a bit more feature-packed than what you would have gotten on a crappy H61, or at best a B75, 1155 motherboard for the Intel setup...


----------



## newtekie1 (Jul 14, 2013)

NeoXF said:


> I'm also guessing the FM2 motherboard is quite a bit more feature-packed than what you would have gotten on a crappy H61, or at best a B75, 1155 motherboard for the Intel setup...



Definitely: all SATA 6Gbps ports, 6 USB 3.0 ports on the back and 2 more on the front panel, dual PCI-E x16, overclocking/voltage control options out the ass, a debug LED, onboard power/reset buttons... oh, and the board only costs $80!


----------

