# AMD FX-8150 3.60 GHz with Windows Patches



## Omega (Feb 4, 2012)

After settling on the market, with all the quirks and bugs supposedly fixed and all the hype and disappointment blown over, we put AMD's FX-8150 under the scope. Benchmarks are done with and without the Windows 7 hotfix, and in-depth overclocking should resolve any doubts you have about AMD's flagship processor.



----------



## Volkszorn88 (Feb 13, 2012)

This re-confirms that my 1090T @ 4 GHz is still 100000000x better. Kthxbye


----------



## nonkX3 (Feb 13, 2012)

I can only wish people would stop bashing it... this makes me sad, it's kinda like trying to eat your own head...


----------



## newtekie1 (Feb 13, 2012)

All this confirms is that I'll be sticking with my 875K and "1605T" for quite some time to come.


----------



## Super XP (Feb 13, 2012)

Good to see the FX-8150 outperforming the fastest Phenom II in 99% of the benchmarks. The gaming benchmarks once again prove to mean nothing. It's all up to the GPU, and these tests prove it.


----------



## Frick (Feb 13, 2012)

Volkszorn88 said:


> This re-confirms that my 1090T @ 4 GHz is still 100000000x better. Kthxbye



Because it's slower in most things or because it uses more power?


----------



## mtosev (Feb 13, 2012)

More proof that Intel is a better choice


----------



## Goodman (Feb 13, 2012)

mtosev said:


> More proof that Intel is a better choice



Faster yes!, better nope!


----------



## mtosev (Feb 13, 2012)

2600K costs less than the 8150 so how's it not a better choice?


----------



## Frick (Feb 13, 2012)

Goodman said:


> Faster yes!, better nope!



Kinda depends on what you do, the money available, and whether you find something on sale or not. If you want a bit of power and are about to do a total upgrade, it's almost stupid not to go SB imo. More power overall, not a whole lot more expensive, dreamy overclocking etc. If you already bought an AM3+ motherboard it could be worth it though, depending on what you do and what you had.

And for average users I just buy whatever is cheapest atm, which tends to be AMD.

EDIT:



mtosev said:


> 2600K costs less than the 8150 so how's it not a better choice?



Here the 8150 is like €50 cheaper than the 2600k.


----------



## claylomax (Feb 13, 2012)

What a sad reminder of how much of a disaster Bulldozer is.


----------



## johnnyfiive (Feb 13, 2012)

mtosev said:


> 2600K costs less than the 8150 so how's it not a better choice?



Just get the $199 8120 and then the majority of everyone's argument here is invalid.


----------



## mtosev (Feb 13, 2012)

Did the price of the 8150 go down or did the 2600K get more expensive? They were tied a few weeks back, a difference of maybe 10-20 EUR. Or maybe I was looking at the 2600.


----------



## Frick (Feb 13, 2012)

mtosev said:


> Did the price of the 8150 go down or did the 2600K get more expensive? They were tied a few weeks back, a difference of maybe 10-20 EUR



Don't know if anything has changed, but now the 2600K is about €260 and the 8150 about €210. If I was upgrading from, say, LGA775 or AM2 I would so go Intel.


----------



## Goodman (Feb 13, 2012)

mtosev said:


> 2600K costs less than the 8150 so how's it not a better choice?



What you wrote before was vague/general about Intel vs AMD, so I answered accordingly (as in quality products/innovations)


----------



## mtosev (Feb 13, 2012)

I don't buy entry-level hardware so I don't care who offers more in that price range. I know that AMD mostly dominates the entry and low midrange; the higher you go, the more quickly AMD starts to fade away


----------



## Daimus (Feb 13, 2012)

Thanks for the review.
This is the first full test with the patch installed that I've read. I see no performance degradation, except in Metro 2033.
I'm going to install the patches.


----------



## R_1 (Feb 13, 2012)

Actually Bulldozer is a very competitive server CPU. AMD should strip out all those server circuits and ramp up the clock speeds for the consumer market. They will do just that with the new Trinity. I wish that transition were happening way faster.


----------



## xenocide (Feb 13, 2012)

R_1 said:


> Actually Bulldozer is a very competitive server CPU. AMD should strip out all those server circuits and ramp up the clock speeds for the consumer market. They will do just that with the new Trinity. I wish that transition were happening way faster.



Yeah, that strategy worked real well for NetBurst.

I see no reason to buy the FX-8150 over an i7-2600K.  For just about anything heavily threaded, the 2600K is just as good or substantially better than the FX-8150, while using less power, and when you factor in the motherboards from the test setups, they cost about the same.  The 2600K is a superior product on just about all fronts.  AMD has a chance with Piledriver to catch up to Intel's offerings, but they need to bring power consumption down handily (that was originally a selling point with BD), bring thread performance up substantially, and keep the price reasonable.  The FX-8120 is a pretty solid purchase for people who use the threads, but I would say an i5-2500K or i7-2600K are still better purchases.


----------



## TheoneandonlyMrK (Feb 13, 2012)

ABstract from conclusion i disagree with

""AMD should have given us a competitive new architecture some time ago, and now that they finally did, they are playing the "architecture of the future" card and wants us to wait for full benefits of Bulldozer's architecture. That's not fair to AMD fans and users. A launched product should offer its end user everything it can, here and now, not in a year from now, when the whole world could end even before that. A future proof architecture should be of concern only to the company and it's management, and while they can be somewhat pleased with Bulldozer, desktop users cannot.""


Right so, anyone(intel powervr) with any ideas of bringing us ray traced graphics should stick it up their rear because we wont have any games or use for it and 4k resolutions that are being banded about as future (proof) tech should not be brought in, after all whos got a 4K screen ,yeh stick your inovations up your arse dev co's, we dont want them not unless they make quake quicker NOW  Ridiculouse on TPU fututre tech/proof counts for nought, really

and the fx8150 appears to sit on average between a 2500k and 2600k in games, where 98% of the world will actually use it ,that seems like a good cpu to me(i bought 960T awaiting PD) i mayhap shoulda bought it

IMHO new evaluation for ya intel2500-2700k are for peeps who only know how to multiplier oc(noobs) amd do decent chips for tinkerers



claylomax said:


> What a sad reminder of how much of a disaster Bulldozer is.







Frick said:


> Here the 8150 is like €50 cheaper than the 2600k.



both usefull posters in this thread ,mayhap you want the intel bummers thread though eh

stop chatin poya people


----------



## PopcornMachine (Feb 13, 2012)

Bulldozer was, is, and will continue to be a disappointment.


----------



## Omega (Feb 13, 2012)

theoneandonlymrk said:


> ABstract from conclusion i disagree with
> Right so, anyone(intel powervr) with any ideas of bringing us ray traced graphics should stick it up their rear because we wont have any games or use for it and 4k resolutions that are being banded about as future (proof) tech should not be brought in, after all whos got a 4K screen ,yeh stick your inovations up your arse dev co's, we dont want them not unless they make quake quicker NOW  Ridiculouse on TPU fututre tech/proof counts for nought, really



You're missing the point, or I didn't make myself clear enough on my opinion.
I hope Americans won't bust my ass for copyright infringement but here goes...

Let's say I've been supplying the US Army with F-16 fighter jets for a year now, and they've proven themselves to be cheap, easy to maintain and, most importantly, all-round performers.

Now, after a year you show up with an F-22 Raptor, and you're all like "I got stealth, a future-proof technology", and the US Air Force goes "Woooow".

But when they put our two fighter jets to the test, head to head, your F-22 Raptor is outmaneuvered, outgunned and outperformed as a platform in every way. Would you say that your future-proof technology justifies your product's failure?

If you deliver your plane to be used, it needs to make use of that future-proof technology as part of a balanced and complete package - a product for the end user. AMD has a new architecture that holds a certain potential for long-term growth and performance improvements. But that means little to us end users in the short term, because they delivered a product at the level Intel was at a year ago, and by the time AMD reaches Bulldozer's full potential, Intel will have Sandy Bridge-X, which will be X times faster. It's hard to see "future-proof tech" there

Edit:
The F16 vs F22 comparison was used just to make a point. Please don't troll about it


----------



## NC37 (Feb 13, 2012)

Still think it is funny everyone thought BD would be the best thing since sliced bread. I knew from the moment they released the preview of BD and Piledriver that initial BD tech would be so-so. Piledriver and beyond was shaping up to be much more interesting. Then again, first gens are like this.

Of course, this is what AMD needed. They needed a big change to set the course of their development for the future, not necessarily to be super right off the bat. BD does that. Intel has done the same thing with their tech too. Hyper-Threading was initially pretty piss poor. But now look how far it's come.


----------



## ensabrenoir (Feb 13, 2012)

*deja vu*

Bulldozer sucks
No it don't....its missunderstood
No its sucks
Un ha
Ah ha
Un ha
Ah ha......


----------



## xenocide (Feb 14, 2012)

theoneandonlymrk said:


> and the fx8150 appears to sit on average between a 2500k and 2600k in games, where 98% of the world will actually use it ,that seems like a good cpu to me(i bought 960T awaiting PD) i mayhap shoulda bought it
> 
> IMHO new evaluation for ya intel2500-2700k are for peeps who only know how to multiplier oc(noobs) amd do decent chips for tinkerers



How dare Intel use multipliers for OCing; thank god AMD hasn't done that with BE chips for the better part of the last 5 years.  Intel went with a stricter design that didn't allow for crazy overclocks to every element of the CPU, but offered amazing performance and efficiency at a great price.  I'd rather OC my i5-2500K to ~4.8GHz and have it consume 240W under load than get an FX-8120/50 at ~5GHz using over 400W.

If you could OC a chip to 20GHz but it performed as well as a Toaster would you still love it?  Because that's basically the logic you're implying.



theoneandonlymrk said:


> stop chatin poya people



wtf does that even mean???


----------



## ThE_MaD_ShOt (Feb 14, 2012)

Meh, I'll stick to what I have. It serves me very well.


----------



## damric (Feb 14, 2012)

Thanks. Been waiting for the TPU BD review for ages...

But it's an outstanding review, so it was worth the wait.

I've been extensively testing my FX-4100 for the last couple of months, but only last night did I really start juicing it to see what it can do at 5 GHz, as I'm not exactly impressed with what it can do at 4.5 GHz (compared to my old Phenoms at 4.0 GHz).

I really had to juice it to be stable, needing nearly 1.65 V to remain stable. There was obviously quite a bit of heat, with stressed temperatures leveling out at about 65°C (4.5 GHz on 1.5 V barely hit 40°C).

What was interesting was that the scaling seemed to improve dramatically at the higher clocks. I would normally get about 20 GFLOPS in LinX at 4.5 GHz, but GFLOPS increased to about 28 at 5 GHz. The built-in benchmark in AOD soared to 8400 at 5 GHz, compared to about 7500 at 4.5 GHz. System responsiveness in Windows "felt" much faster as well.
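For what it's worth, those quoted figures can be sanity-checked with a little arithmetic; this is just an illustrative script (the `per_ghz` helper is made up for the example), not part of LinX or any benchmark:

```python
# Normalise the quoted LinX results to clock speed to see whether
# throughput scaled faster than frequency between 4.5 and 5.0 GHz.
def per_ghz(gflops: float, ghz: float) -> float:
    """Throughput normalised to clock (GFLOPS per GHz)."""
    return gflops / ghz

low = per_ghz(20.0, 4.5)    # efficiency at 4.5 GHz
high = per_ghz(28.0, 5.0)   # efficiency at 5.0 GHz

clock_gain = 5.0 / 4.5 - 1      # relative clock increase
thruput_gain = 28.0 / 20.0 - 1  # relative GFLOPS increase

print(f"{low:.2f} -> {high:.2f} GFLOPS/GHz")
print(f"clock +{clock_gain:.0%}, LinX +{thruput_gain:.0%}")
```

An ~11% clock bump producing ~40% more measured throughput really is super-linear scaling, which is what makes the result stand out.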

So what I'm saying is that I think these BD chips were designed to run at much higher clocks than what we see right now. They seem to really like high frequency. I think that as the fab process improves we will see what these chips are really made of, without having to use obnoxious voltages and thermal solutions.

For now, I'm anxious to see what my current generation BD chip can do if I can improve upon my already excellent air cooling.

On a side note, I was also able to further tune my 2x4GB of Ripjaws to 1800 CL7 1T last night.


----------



## Jstn7477 (Feb 14, 2012)

damric said:


> Thanks. Been waiting for the TPU BD review for ages...
> 
> But it's an outstanding review, so it was worth the wait.
> 
> ...



I really hope that's 28 GFLOPS per core, as my Pentium Dual-Core E6600 @ 3.8 GHz gets ~24 GFLOPS, and my 4.5 GHz 2600K gets ~110 GFLOPS using AVX-enabled IntelBurnTest.


----------



## xenocide (Feb 14, 2012)

I'm pretty sure a Phenom II X4 can reliably beat an FX-4100, since the Phenom II X4 never runs into shared resources like the BD chip does.  Per-thread performance on Phenom II always seems to beat Bulldozer.


----------



## Super XP (Feb 14, 2012)

xenocide said:


> I'm pretty sure a Phenom II X4 can reliably beat an FX-4100, since the Phenom II X4 never runs into shared resources like the BD chip does.  Per-thread performance on Phenom II always seems to beat Bulldozer.


That's what Bulldozer was designed for: to share resources. Now what AMD needs to do is speed it up so it won't get bogged down in performance.

Picture a 4-lane freeway going into 2 lanes without warning. That is how Bulldozer seems to work in terms of sharing its resources. I believe Piledriver will resolve this issue, perhaps not as much as we would like, but enough to gain a nice 20% to 30% performance boost over the current Bulldozer IMO.
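The lane-merge idea can be sketched in code. This is a toy model, not a CPU simulation: the names (`N_MODULES`, `fpu`, `core`) and the semaphore-per-module scheme are purely illustrative of two cores contending for one shared floating-point unit:

```python
import threading

N_MODULES = 4                 # 4 modules = 8 "cores" on an FX-8150
# One shared FP unit per module, modelled as a binary semaphore.
fpu = [threading.Semaphore(1) for _ in range(N_MODULES)]
finished = []
finished_lock = threading.Lock()

def core(core_id: int) -> None:
    # Cores 2k and 2k+1 live in module k and share its FPU: when both
    # issue FP work at once, one of them waits (the "lane merge").
    with fpu[core_id // 2]:
        with finished_lock:
            finished.append(core_id)

threads = [threading.Thread(target=core, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(finished))  # all 8 "cores" complete; contention costs time, not work
```

The point of the model: throughput is still there, but latency appears whenever both halves of a module want the shared unit at the same moment.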

The new CEO is playing it safe and keeping everything behind closed doors. Once they achieve the desired performance, that is when they will release the big guns.


----------



## Thefumigator (Feb 14, 2012)

Great review, finally!!!

But I know what's going to happen with this thread. The expected.


----------



## xenocide (Feb 14, 2012)

Super XP said:


> That's what Bulldozer was designed for: to share resources. Now what AMD needs to do is speed it up so it won't get bogged down in performance.
> 
> Picture a 4-lane freeway going into 2 lanes without warning. That is how Bulldozer seems to work in terms of sharing its resources. I believe Piledriver will resolve this issue, perhaps not as much as we would like, but enough to gain a nice 20% to 30% performance boost over the current Bulldozer IMO.
> 
> The new CEO is playing it safe and keeping everything behind closed doors. Once they achieve the desired performance, that is when they will release the big guns.



I'm fully aware that's what it was designed for, and I consider it a flawed design.  I'm also curious why people keep saying PD will be a 10/15/20/30/50% performance increase over Bulldozer; other than AMD saying that's what they were shooting for, is there any concrete evidence or explanation as to how this will be accomplished?


----------



## Super XP (Feb 14, 2012)

xenocide said:


> I'm fully aware that's what it was designed for, and I consider it a flawed design.  I'm also curious why people keep saying PD will be a 10/15/20/30/50% performance increase over Bulldozer; other than AMD saying that's what they were shooting for, is there any concrete evidence or explanation as to how this will be accomplished?


A 32nm process revision along with modifications to the L2 & L3 caches, branch prediction, and so on.
It's not a bad design; AMD's only mistake was relying heavily on automation instead of the detailed hand workmanship they used in the past.


----------



## xenocide (Feb 14, 2012)

Super XP said:


> A 32nm process revision along with modifications to the L2 & L3 caches, branch prediction, and so on.
> It's not a bad design; AMD's only mistake was relying heavily on automation instead of the detailed hand workmanship they used in the past.



The L2 and L3 cache on BD CPUs is, I heard, awful by today's standards, but I have doubts there will be any real improvements on the manufacturing front.  Their real flaw was assuming that just because they made multi-threading a huge focus, it would happen overnight.  A huge number of tasks are still single-threaded, and thus most people will see a huge benefit from stronger per-thread performance.  If you are never using more than 4 threads, what good is being able to do 8 when each of the 4 you do use runs at only 66%?


----------



## Jiraiya (Feb 14, 2012)

Here

Higher is better, not lower


----------



## n0tiert (Feb 14, 2012)

I really can't understand why all those Intel fanboys use this thread for flames & beef against a totally different CPU arch;
it's the same as benching all the pro-Nvidia sponsored games with an ATI card......
If there weren't any competition... there wouldn't be any price drops or new innovations....


Get a life and bitch somewhere else

I'm running a fully AMD system, everything works fine in apps/games, and there might be a marginal perf benefit on Intel,
but you can't tell me that you feel or even see it.. (not bench related)

Note to the OP: "thx for the Review"


----------



## DOM (Feb 14, 2012)

I still don't get why you downclock the mem on the Intel side and only use 2 sticks on the 1366, makes it less realistic imo


----------



## Omega (Feb 14, 2012)

DOM said:


> I still don't get why you downclock the mem on the Intel side and only use 2 sticks on the 1366, makes it less realistic imo



I don't downclock anything. Intel supports DDR3 up to 1333 MHz and that's a fact. Anything above that is in fact overclocking and would not give valid results.

As for LGA1366 setup, please read again:


> LGA1366
> 3 x 2048 MB MUSHKIN BlackLine FrostByte PC3-12800 DDR3
> @ 1333 MHz 7-7-7-21 (limited to 4GB)



I'm using 3 sticks of 2 GB, equaling 6 GB in total. Since the other platforms can't have that kind of configuration, LGA1366 is limited in Windows to just 4 GB, to be on the same level as the other platforms. Triple channel is still used, with all of its benefits; it just means Windows can't address more than 4 GB.


----------



## xenocide (Feb 14, 2012)

n0tiert said:


> I really can't understand why all those Intel fanboys use this thread for flames & beef against a totally different CPU arch;
> it's the same as benching all the pro-Nvidia sponsored games with an ATI card......
> If there weren't any competition... there wouldn't be any price drops or new innovations....



I'm (if this was directed towards me) not flaming AMD because it's AMD.  I'm pointing out BD has a lot of very real shortcomings.  If you choose to ignore them and continue to support the product/company, more power to you, but I will always go where the best overall price/performance is.  In the Athlon XP/Athlon 64 days I bought AMD because it was the best bang for the buck.  These days Intel seems to offer that top-notch price/performance (in the realm of gaming and everyday use).

AMD and their marketing brought all of the criticism they receive on themselves.  You don't go back to using a prolific name like the FX series, known historically for being the most powerful CPUs in their class, release something that struggles to keep up with the competition, and expect people to just accept it.  If you bought a season ticket through the NBA, and when you got to the stadium it was just a bunch of high school kids playing, wouldn't you be a little steamed?

I am only discussing what AMD needs to do to remain competitive, and what would benefit consumers the most.  When BD was on the horizon, and the FX name was announced, people started jumping for joy because it was gearing up to be a game changer.  Then ES benches leaked, and they were _clearly_ fakes because they performed somewhere between awful and above average.  Then BD launched, and the early samples were pretty damn accurate, so it just _had_ to be that Windows was poorly optimized, or the BIOSes were wrong, or the scheduler was broken.  Then it became that BD was never _really supposed to be that good_, it was based on server architecture anyway.  Now that all of that has been debunked and proven inaccurate, it's Piledriver that's the _real product to look out for_; it's going to offer a 5-75% performance boost over its little brother!

The nonsense can go both ways.

I just hope AMD can figure it out and offer a product like Llano or Trinity that does an excellent job at what it's intended to do.  I am hoping to get a Trinity laptop when they come out for a reasonable price and get some light gaming done when I'm out of the house.  Should be good.

Oh, and I agree, it's an excellent review.


----------



## Yo_Wattup (Feb 14, 2012)

mtosev said:


> I don't buy entry level hardware so I don't care who offers more at that price range. I know that AMD mostly dominates the entry and low midrange, when you get higher and higher AMD starts to fade away quickly



Actually you'd be surprised by the Sandy Bridge Pentiums; think i3 minus Hyper-Threading. Great chips. Wish I'd gone with them over the Llano A8-3850 for my girlfriend's build. Add a cheapo discrete GPU and it's still cheaper than Llano. 



theoneandonlymrk said:


> Right



The only word I understood in your post. Seriously, English dude, we speak English here.



xenocide said:


> wtf does that even mean???



HAHAHAHAHAAA 



n0tiert said:


> I'm running a fully AMD system, everything works fine in apps/games, and there might be a marginal perf benefit on Intel,
> but you can't tell me that you feel or even see it.. (not bench related)



I can genuinely tell you with 110% certainty that I see and feel the difference with a 2500K as opposed to AMD's next best offering. Games, audio production, you name it. Not all games are GPU-restricted. *cough* Skyrim *cough*


----------



## DOM (Feb 14, 2012)

Omega said:


> I don't downclock anything. Intel supports DDR3 up to 1333 MHz and that's a fact. Anything above that is in fact overclocking and would not give valid results.
> 
> As for LGA1366 setup, please read again:
> 
> ...


I say it's not realistic cuz people that build their own rigs are not going to buy 1333 MHz RAM, and I'm sure overclocking isn't in AMD's or Intel's support, but I guess Intel does now with that RMA 

And who still uses a 32-bit OS when 8+ GB are so cheap.. I did not read what OS was used  lol 

So I'm just giving my opinion


----------



## Omega (Feb 14, 2012)

DOM said:


> And who still uses 32-bit OS when 8 gb are so cheap.. *I did not read *what OS was used lol



nuff said
don't give your opinions on something you didn't bother to read.


----------



## DOM (Feb 14, 2012)

Omega said:


> nuff said
> don't give your opinions on something you didn't bother to read.



I said the OS you used.. wow, go cry to your mommy, I guess I just won't look at your lame OEM reviews


----------



## n0tiert (Feb 14, 2012)

Yo_Wattup said:


> Actually you'd be surprised by the Sandy Bridge Pentiums; think i3 minus Hyper-Threading. Great chips. Wish I'd gone with them over the Llano A8-3850 for my girlfriend's build. Add a cheapo discrete GPU and it's still cheaper than Llano.







Yo_Wattup said:


> I can genuinely tell you with 110% certainty that I see and feel the difference with a 2500K as opposed to AMD's next best offering. Games, audio production, you name it. Not all games are GPU-restricted. *cough* Skyrim *cough*



wooohooo, now you're showing off with the benefit of mighty INTEL

I wouldn't wonder if all the stuff is based on books like e.g.:

http://www.intel.com/intelpress/programming.htm?iid=prodmap_tb+prog

or

http://software.intel.com/en-us/intel-sdp-home/


hmmmmm   FX-8150 Gameplay

http://www.hardocp.com/article/2011/10/11/amd_bulldozer_fx8150_gameplay_performance_review/2


----------



## de.das.dude (Feb 14, 2012)

Volkszorn88 said:


> This re-confirms that my 1090T @ 4 GHz is still 100000000x better. Kthxbye



http://www.techpowerup.com/reviews/AMD/FX8150/10.html

Can't tell if trolling or in denial


----------



## xenocide (Feb 14, 2012)

n0tiert said:


> hmmmmm   FX-8150 Gameplay
> 
> http://www.hardocp.com/article/2011/10/11/amd_bulldozer_fx8150_gameplay_performance_review/2



Clearly an inaccurate benchmark.  The i7 drops in fps when they overclock it; that literally makes NO SENSE.  I could understand if performance barely differed because it was GPU-restricted, but to go down substantially in fps makes no sense.


----------



## PaulieG (Feb 14, 2012)

*I don't like where this thread is going. The bickering and flaming stops now. Infractions will be handed out without warnings from this point forward. People getting so damn emotional over silicon. Keep it civil, people.*


----------



## TheoneandonlyMrK (Feb 14, 2012)

Omega said:


> The F16 vs F22 comparison was used just to make a point. Please don't troll about it



I put my opinion across; I'm not here to wind you or others up, unlike some in this thread


Though I appreciate what you're saying, and I agree it's not a get-out-of-jail card for AMD, I feel the single-threaded performance is not as bad or as important as some imply; it's not got a leg blown off, more a slight limp to me

And in games, where it matters most to most people, all the BS spouted is just that; it performs close to (mostly between) Intel's 2500-2600K, which isn't too bad for a whole new architecture, not a million miles below as many are implying. 

And as for future tech, your ideas, though reasonable, don't quite sound right to me. When PhysX was new it wasn't, and still isn't, used by many dev cos for much of anything, but that doesn't to me make it less valuable as an idea or not worth putting in, and some would argue, as Intel did in the beginning, that it was pointless since CPUs did enough work (that was then eh)

It's an actual fact that BD has ops that aren't and can't be used by most software yet, but that does not mean they are worthless or shouldn't be worked with; the 256-bit AVX extensions are an e.g. I'm not a software tech/writer, but I can appreciate when something will have a use, just as GPUs are now wielding their additional features; one day so might BD, and then it may gain another 2% or maybe even the 50% AMD stupidly claimed



Paulieg said:


> I don't like where this thread is going. The bickering and flaming stops now. Infractions will be handed out without warnings from this point forward. People getting so damn emotional over silicon. Keep it civil, people.



Why do I always see these after I post? I did try to keep a civil head on tho



xenocide said:


> If you could OC a chip to 20GHz but it performed as well as a Toaster would you still love it? Because that's basically the logic you're implying.


No, what I implied was: Intel's chips are easy to OC, a noob can do it, which some say is good, and I see that point, but AMD's have more to mess with and fine-tune (fact), so someone who can and enjoys OCing will have more fun with AMD.

Oh, and yeah, at least once just to see


As I said, I played it safe and got a 960T, but I'm not on here giving AMD shit (poya) permanently, despite them annoying me a bit with their BS PR etc


----------



## Omega (Feb 14, 2012)

I understand your opinion and respect it. But in the end it all comes down to facts.
And the fact is, the review is about a specific product (the FX-8150), and as such it didn't live up to either needs or expectations.

If I were to review just the Bulldozer architecture, presented on a piece of paper, I'd give it a round of applause 

Perhaps we will see more from Piledriver, perhaps we'll be disappointed again. I think it's futile to even begin a discussion about the future and what it will bring.


----------



## PopcornMachine (Feb 14, 2012)

I for one had great hopes for Bulldozer.  My early builds were all AMD, but when I got back into it after some time off I was saddened to see Intel was the better bang for the buck.

Bulldozer was my hope for a reversal of that.  I was disappointed.  And promises of fixes down the road were even more unpleasant. 

All you need to do is look at the chart in the link below and you will see that even a 2500K is a much better buy than an FX-8150.  Of course this is according to one reviewer's results, but it pretty much agrees with what I read elsewhere.

So I went with Sandy Bridge when I needed to upgrade, and got my 2500K for $180.  I think it was a real good buy considering the options.

http://techreport.com/articles.x/21813/19


----------



## repman244 (Feb 14, 2012)

> To make matters more complicated, there have been AM3 boards on the market for quite some time now, using 800 series chipset and claiming to have AM3+ processor support. That is true only if those boards have implemented an AM3+ socket layout, also known as "black colored" socket featuring 942 pin holes. They will run your Bulldozer processors with proper BIOS update, but without the use of HT 3.1 support. Older AM3 boards using "white" 941 pin sockets can't house AM3+ processors because of physical incompatibility. So before buying a new Bulldozer processor for your AM3 board, check your socket layout first.



I think this should be corrected, since it is now known that the 800 series does indeed support BD even with the old AM3 socket (the CHIV (E) with a BIOS update, for example). The only problem is that no one has tested the downsides of using it (apart from the huge strain on the PWM section when overclocking).

Anyhow, I think this is a nice performance increase for those who have a BD CPU (a free increase!), I hope they improve everything with Piledriver so that we have more products to choose from.


----------



## Omega (Feb 14, 2012)

@repman244
Thanks for the info, I'll update the article. 

@ZenZimZaliben
English is not my native language; I did the best I could.


----------



## dasa (Feb 14, 2012)

So if overclocking shows it's not the CPU/memory making the 8150 the fastest CPU in this test, why the difference between the FX-8150 and the 2500K/Phenom II?


Obviously it's been tested in a GPU-limited scenario, unlike this test


So what I want to know is why, when GPU-limited, the FX-8150 gets a lead in some tests by what in many reviews seems to be a greater amount than the typical margin of error, and, as shown by the overclocking test, can even be consistent.
It's just so damn inconsistent amongst reviews, yet it keeps happening here and there


----------



## Yo_Wattup (Feb 15, 2012)

n0tiert said:


> wooohooo, now you're showing off with the benefit of mighty INTEL
> 
> I wouldn't wonder if all the stuff is based on books like e.g.:
> 
> ...



Not sure on the point you're trying to get across...


----------



## xtremesv (Feb 15, 2012)

claylomax said:


> What a sad reminder of how much of a disaster Bulldozer is.



I don't get why people keep saying Bulldozer is a disaster. OK, I understand it's not the "divine all-mighty" CPU many expected, but it's not a bad chip either; the hype was too much for this new architecture. I could picture a former PII X6 1100T owner resenting having bought an FX-81X0, but I don't see it with someone upgrading from a PII X4 or a first-gen i5.

I'm not trying to defend my choice, but I'm very pleased with my FX-8120 compared to my old PII X4 965 @ 3.8. In my case, the FX was cheaper than the i5 2500K and I OCed it just fine (@3.9/4.4), staying at stock voltage with Turbo Core enabled.


----------



## alexsubri (Feb 15, 2012)

Glad to see that my AMD 965 Black Edition @ 3.7 GHz is still holding its ground!


----------



## Omega (Feb 15, 2012)

dasa said:


> So what I want to know is why, when GPU-limited, the FX-8150 gets a lead in some tests by what in many reviews seems to be a greater amount than the typical margin of error, and, as shown by the overclocking test, can even be consistent.
> It's just so damn inconsistent amongst reviews, yet it keeps happening here and there



There's a big difference in Metro 2033 settings between that chart and mine: I don't use PhysX.
From the chart results you provided, I think they enabled PhysX and let the CPU do the work, not the GPU. The difference between PhysX on/off is enormous, and in my experience it can lead to unstable test results, so I turned it off.

Also, in the TPU charts you pasted, there seems to be a bug.
The FX-8150 no-patch score should be 68, not 78.
Same thing in the overclocking table: the FX-8150 should be 68, not 78.
I'll fix it ASAP.


----------



## repman244 (Feb 15, 2012)

xtremesv said:


> I could picture a former PII X6 1100T owner resenting buying an FX-81X0 but I don't see it with someone upgrading from a PII X4 or first-gen i5.



Well, that is the problem: I and many others don't have an upgrade path from the 1090T/1100T. The 4-module BD is somewhat faster in multithreaded scenarios, which matters to only a few (it matters to me too, but I would need to spend money on a new board + CPU for a minor increase in speed).
And you can't stop wondering why they put more than 4 years of development and a ton of money into it, only to end up in a worse position core-per-core and with higher power consumption. This is probably the main reason to think of it as a fail.
I do agree with your point that for someone with a Phenom X4 it is an upgrade, and it also offers an improved IMC, which quite often holds the Phenom II back.


----------



## theonek (Feb 16, 2012)

Well, it's not bad at all, especially at its price; it's an excellent choice for cheap and fast. Glad to have it!


----------



## TheoneandonlyMrK (Feb 16, 2012)

D


----------



## dasa (Feb 17, 2012)

Omega said:


> There's a big difference in Metro 2033 settings between that chart and mine: I don't use PhysX.
> From the chart results you provided, I think they enabled PhysX and let the CPU do the work, not the GPU. The difference between PhysX on/off is enormous, and in my experience it can lead to unstable test results, so I turned it off.
> 
> Also, in the TPU charts you pasted, there seems to be a bug.
> ...



Thanks for the reply.
I wasn't trying to directly compare your results with Xbit Labs'; as you say, different test, different hardware.
I was just commenting that your test shows how fast the 5870 is, while the other shows CPU speed rather than the 6970.
But as you say, running PhysX without an NVIDIA GPU won't produce realistic results.

Looks like you missed a bit when fixing the overclocked results:
the non-overclocked result is now down to 68, but the overclocked results are still 78-79.


----------



## WarEagleAU (Feb 18, 2012)

Well, I for one am happy with my AMD FX-6100 (6120) processor and love it. It overclocks very well and runs everything smooth and stable, even Skyrim, a lot better than my 955 BE, which I loved. I love my new ASUS 990X board and the UEFI BIOS; everything seems to run smoother and faster, and I did install the patches. I was pleasantly surprised to see the 8150 leading or staying really close in a lot of the benchmarks, compared to what I was expecting or what I'd read around the net. While it definitely isn't what AMD hoped for, it could have been a lot worse.

One thing I wanted to ask: IF AMD could take the unused cores, for those processes and programs that aren't multi-core aware, and somehow combine them into one huge core, or use the unused cores to help work on the single-core program, would that help at all, and could it even be done? (Sorry for the run-on.) It seems like something that could theoretically be done.

Excellent review, Omega and BTA. Even if I don't fully agree with your assessment, I respect your opinion and views.


----------



## W1zzard (Feb 18, 2012)

WarEagleAU said:


> ....IF AMD would take unused cores for those processes and programs that are not multi-core used and took those unused cores and somehow combined them into one huge core



= holy grail of parallelism

even with the source code available this is nearly impossible so far. for binaries it's even more complicated

http://en.wikipedia.org/wiki/Automatic_parallelization
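To illustrate the point (a hypothetical sketch, not from the thread): a parallelizing compiler can only split a loop across cores if it can prove the iterations are independent, and even a simple running sum breaks that proof.

```python
# Hypothetical sketch: why automatic parallelization is hard.

def scale(xs):
    # Every iteration is independent, so a parallelizing compiler
    # could safely farm these out to separate cores.
    return [x * 2 for x in xs]

def prefix_sum(xs):
    # Loop-carried dependency: each result needs the previous one,
    # so the iterations cannot simply be split across cores.
    ys, total = [], 0
    for x in xs:
        total += x
        ys.append(total)
    return ys
```

Parallel prefix sums do exist (scan algorithms), but discovering that kind of transformation automatically, especially from compiled binaries, is exactly the hard part being described.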


----------



## GC_PaNzerFIN (Feb 19, 2012)

If this and if that. Let's face reality: not everything can be multi-threaded, and the design is poor for general use, since most programs are either single-threaded or load only one thread hard.

Which one is to blame, the CPU that is fast only on paper with software not in popular use, or everything else? AMD should have thought about what the CPU would actually be used for before forgetting about single-thread performance. A major error in design goals from day one.

Intel doesn't sacrifice single-threaded performance on the altar of multithreading. Very well-rounded performance. No wonder it is a success.
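The trade-off described above is usually quantified with Amdahl's law: the serial fraction of a program caps the speedup extra cores can deliver. A small illustration (the fractions are made up for the example, not benchmark results):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Upper bound on speedup when only part of the work scales with cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A program that is only 50% parallel gains little from 8 cores (~1.78x),
# while a 95%-parallel one gains far more (~5.93x).
half_parallel = amdahl_speedup(0.50, 8)
mostly_parallel = amdahl_speedup(0.95, 8)
```

This is why a chip that trades single-thread speed for more cores only pays off when the parallel fraction of typical software is high.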


----------



## Super XP (Feb 19, 2012)

Omega said:


> You're missing the point, or I didn't make myself clear enough on my opinion.
> I hope Americans won't bust my ass for copyright infringement but here goes...
> 
> Lets say I supply the US army with F-16 fighter jets for a year now, and they've proven themselves to to be cheap, easy to maintain and most importantly an all round performer.
> ...


You make an interesting point. Any company claiming "future-proof" tech on today's products is full of shit. It's a marketing gimmick to make you buy into it NOW. Then, 5 years from now, when this so-called future tech becomes useful, you end up buying something NEWER.

Intel is just as guilty as AMD, and so were NVIDIA and ATI in the past. Releasing a product now that should perform better tomorrow is pointless, because YOU WILL end up buying something newer by the time that so-called technology gets released/invented.


----------



## Thefumigator (Feb 19, 2012)

Super XP said:


> You make an interesting point. Any company claiming "future-proof" tech on today's products is full of shit. It's a marketing gimmick to make you buy into it NOW. Then, 5 years from now, when this so-called future tech becomes useful, you end up buying something NEWER.
> 
> Intel is just as guilty as AMD, and so were NVIDIA and ATI in the past. Releasing a product now that should perform better tomorrow is pointless, because YOU WILL end up buying something newer by the time that so-called technology gets released/invented.



It makes sense, but I think AMD wanted Bulldozer to be better than Phenom II; they just couldn't make it perform (much) better despite the extra time they took (also known as the delay). Some may think, "If they couldn't make Bulldozer perform (much) better, why didn't they just shrink Phenom II and add two extra cores?" Well, they had to do more than just shrink it; they had to add AVX and the rest of the new features. Did AMD have the time? Did they expect this would happen?

Whatever the case, FX was released anyway, on the assumption that it would sell as "the newer processor" and make people switch to AM3+, and in doing so buy "the newer socket" and even pair it with "the newer Radeon card".

Yet despite all the "new" stuff, you can still consider the Radeon HD 5000 or HD 6000 series good hardware, and likewise the Phenom II X6, which for some is a much better purchase than an FX.

But I still think Bulldozer is far from being an F-22 next to an F-16. To me, both processors perform decently; in some cases one is better than the other, and vice versa. The difference in percentage isn't anything to worry about, so it's not like an F-22 to me.


----------



## Winston_008 (Feb 20, 2012)

So, what I get from this article is that Bulldozer is great value if the only programs you use are WinRAR, TrueCrypt, or POV-Ray.


----------



## xenocide (Feb 20, 2012)

Winston_008 said:


> So, what I get from this article is that Bulldozer is great value if the only programs you use are WinRAR, TrueCrypt, or POV-Ray.



The FX-8120/8150 are great if you only use heavily threaded applications. Otherwise, either Phenom II or Sandy Bridge is a much better choice.


----------



## Super XP (Feb 20, 2012)

My FX-8120 @ 3.0 GHz (8 cores) blows away my last Phenom II X4 940 @ 3.0 GHz by far. Was Bulldozer a great upgrade for me? Damn right it was, and the cost of the mobo and RAM to accompany my 8120 was dirt cheap. You can put that in the bank...


----------



## Inceptor (Feb 20, 2012)

Maybe it blows it away in multithreaded operation, but there's no way for it to pull ahead clock-for-clock in 1-to-4 thread operation, unless you're willing to allow it to draw large amounts of power.  The deep instruction pipeline and server based design guarantees that a CPU of the Bulldozer family requires higher clock speeds to equal the performance of a Phenom II CPU with their shorter instruction pipelines, and more 'wrapped-up and complete' designs.  The BD architecture is like a puzzle piece in a two piece puzzle, where the other piece is the as yet to be fully realized GPGPU that will be fully integrated with it.

To parry the possibly inevitable errors in reasoning concerning "clock-for-clock":
On the one hand, from some imaginary uber-objective/absolute viewpoint, you can compare clock-for-clock, but it's ultimately a *synthetic* comparison.  You're comparing a short pipeline with a deep pipeline, and the different performance characteristics that derive from each of them; apples and oranges, so to speak.
On the other hand, by making the clock-for-clock comparison, what is wanted is to compare the technological sophistication or advancement.  The question is whether the advancement which is not necessarily favourable for current software usage, but which will be favourable later on, is a good advancement.  Obviously, there is a long term plan, and the plan is focused on APU development with its GPGPU and memory integrations.  
What's the point of the 'nothing' conversation?

Stop staring at the 'trees' and 'rocks' and notice the whole 'forest' around you with its winding paths into the distance.
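For the clock-for-clock point, a rough first-order model is single-thread throughput ≈ IPC × clock: a deep-pipeline design with lower IPC has to clock higher just to break even. A toy calculation with made-up numbers (not measured figures for either CPU family):

```python
def throughput(ipc, clock_ghz):
    # First-order model: billions of instructions retired per second.
    return ipc * clock_ghz

# Illustrative only: a design with 20% lower IPC needs ~25% more clock
# to match a shorter-pipeline design's single-thread performance.
short_pipe = throughput(1.0, 3.3)
deep_pipe = throughput(0.8, 4.125)  # same throughput, higher clock
```

This is also why comparing the two designs at equal clocks is, as argued above, a somewhat synthetic exercise: each architecture is built to occupy a different point on the IPC/clock curve.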


----------



## TheoneandonlyMrK (Feb 21, 2012)

mtosev said:


> I don't buy entry level hardware so I don't care who offers more at that price range. I know that AMD mostly dominates the entry and low midrange, when you get higher and higher AMD starts to fade away quickly



Both your PCs look pretty entry-level to me.

This kind of post, tut, shouldn't be necessary, but you made it.


----------



## NanoTechSoldier (Mar 10, 2012)

There Is Nothing Physically Wrong, With AMD FX-8150 Or AMD Processors, For That Matter..  They're Held Back, By Poor Coding In Windows..

There ARE Problems With Windows 7 x64 & BD... Windows, IS Scripted For Intel Hyperthreading & Intel Instruction Sets... Over AMD Instruction Sets etc..  

Basically, AMD CPUs, Have To Run Intel Instruction Sets In Programs, Before Their Own... 

Which, I Call A Slight Handicap...

A Decently Programmed OS, WILL Make The FX-8150 Scream... An AMD Tuned Linux OS, WILL Show AMDs True Potential, In High-End Multi-Core Coding & Stable Speeds, @ +5GHz...

AMD Is Way Ahead Of Intel, In Many Areas Too...  AMD Processor Development, Is Already @ 28nm In Size [GPU 7970]..  

AMD Is Waiting To Pounce On Intel & Rip Their Throat Out.. Metaphorically..

Guinness World Record, Is Held By An AMD FX 8150 & ASUS CHVF MB... @ 8.429GHz... 

Proving... That AMD Processors, Can Handle Extreme Temperatures Below 0 & Still Code @ Frequencies Over 8GHz..

How Long Will It Take Microsoft.. To Work Out All Their Bugs With AMD..??  It's Been Over 30 Years Already.. 

Microsoft Established on April 4, 1975 & AMD started Back In May 1, 1969. 

Intel Went Public In October 13, 1971... AMDs Got A Couple Of Years On Intel...

There Are A Few BIOS Tweaks On ASUS CHVF MB.. That CAN Dramatically Increase Performance In AMD FX 8150 CPU...


----------



## ensabrenoir (Mar 10, 2012)

Sniff... sniff... you smell that? I smell... employee... No, AMD never even tried to defend it, so it's got to be the smell of delusion... I thought we were past all of this... it is simply what it is.


----------



## repman244 (Mar 10, 2012)

NanoTechSoldier said:


> There Is Nothing Physically Wrong, With AMD FX-8150 Or AMD Processors, For That Matter..  They're Held Back, By Poor Coding In Windows..
> 
> There ARE Problems With Windows 7 x64 & BD... Windows, IS Scripted For Intel Hyperthreading & Intel Instruction Sets... Over AMD Instruction Sets etc..
> 
> ...



You can always use Linux...but the performance is the same.

So I guess it was Microsoft who made the Pentium 4 slow, hot and power hungry?


----------



## NanoTechSoldier (Mar 11, 2012)

repman244 said:


> You can always use Linux...but the performance is the same.
> 
> So I guess it was Microsoft who made the Pentium 4 slow, hot and power hungry?



MIcrosoft & Operating Systems, In General.. Have A BIG Part, In CPU Performance..  If "Microsoft" Hasn't Coded It Properly, For A CPU.. The CPU Won't Perform...  The Motherboard BIOS, Is A BIG Part Too & If It's Not Coded For Windows Properly.. It WILL Be Unstable..

If A CPU, Is Released, After An OS Is On The Market.. The OS, WILL Have To Be Patched &/or Rewritten..  

Seeing How, The AMD FX-8150, Is A Multi-Core CPU & Has 8 Cores etc... 
The Multi-Processor Kernel, Has To Be Rewritten In Windows, To Utilize All 8 Cores... 

Linux, On The Other Hand.. Is A OS That Uses Pure UNIX Coding & Is Easily Modified For Any Purpose Or CPU...  

If You Don't Know What You're Doing In Linux & Can't Code It... You Won't See Any Benefit Over Windows..

Linux, Will Always Be Better, Than Windows & Always Has...  

Windows 7, Still Has Issues From Win98..  They're Coding Old Problems, Into The New Operating Systems etc...

Another Thing.. AMD CPUs, Aren't Doing Too Bad... When You Take Into Account, They're Only Using Dual Channel Memory, At The Moment & Intel, Are Using Triple Or Quad Channels, To Gain Performance...  

Wait Until An AMD MB Uses Quad Channel Memory & Then See How The i7 Competes... Intel Will Fail BIG Time..

AMD CPUs, Have Half The Memory Throughput & They Still Compete With An Intel i7... The AMD FX, Is Waiting To Be Unleashed...


----------



## repman244 (Mar 11, 2012)

NanoTechSoldier said:


> Another Thing.. AMD CPUs, Aren't Doing Too Bad... When You Take Into Account, They're Only Using Dual Channel Memory, At The Moment & Intel, Are Using Triple Or Quad Channels, To Gain Performance...
> 
> Wait Until An AMD MB Uses Quad Channel Memory & Then See How The i7 Competes... Intel Will Fail BIG Time..



It was proven that the benefits are almost none (I'm talking about consumer desktop parts, not servers). AMD has a slow IMC and would not benefit from more channels while the IMC stays slow.


I also don't get your point about the OS; the patch was released and what's done is done. I don't get why people expect a magical 30% increase in performance: it's the chip at fault, not the OS.

And if the performance is the same in Win 7 as in Linux, I fail to see what exactly MS is doing wrong here.
I also don't get your point that the multi-core kernel has to be rewritten... 8-, 10-, and 12-core chips have been on the market for quite a while, and if I remember right, Windows Server 2008 R2 is almost identical to Win 7 and has no problems running 4 CPUs with 12 cores each.
There is no magical kernel waiting to make BD shine; the only one who can fix it is AMD.
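On the channel question, theoretical peak bandwidth is just channels × transfer rate × bus width, which is why quad channel doubles the on-paper figure; whether a given IMC can actually sustain it is the separate issue raised above. A quick back-of-the-envelope helper (a sketch for illustration, not from the thread):

```python
def peak_bandwidth_gbs(channels, transfer_rate_mts, bus_width_bits=64):
    """Theoretical peak DRAM bandwidth in GB/s.

    transfer_rate_mts: mega-transfers per second, e.g. 1600 for DDR3-1600.
    bus_width_bits: 64 bits per channel for DDR3.
    """
    bytes_per_transfer = bus_width_bits // 8
    return channels * transfer_rate_mts * bytes_per_transfer / 1000.0

# DDR3-1600 on paper: dual channel = 25.6 GB/s, quad channel = 51.2 GB/s.
dual = peak_bandwidth_gbs(2, 1600)
quad = peak_bandwidth_gbs(4, 1600)
```

If the memory controller can't come close to saturating the dual-channel figure in the first place, the extra on-paper headroom from more channels buys desktop workloads very little.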


----------



## nt300 (Mar 11, 2012)

xenocide said:


> Quote:
> Originally Posted by R_1
> Actually Bulldozer is a very competitive server CPU. AMD should strip all those server circuits and ramp up the clock speeds for the consumer market. They will do just that with the new Trinity. I wish that transition was done a way faster.
> 
> Yea that strategy worked real well for Netburst.


I had to come back to this comment. NetBurst was a pile of you-know-what; you cannot compare such a failure to Bulldozer. That FSB was a real bottleneck and a major design flaw, and AMD does not have this issue. AMD chose longer pipeline stages for future scalability: Bulldozer is something you can build on, whereas NetBurst was a dead end.


repman244 said:


> You can always use Linux...but the performance is the same.
> 
> So I guess it was Microsoft who made the Pentium 4 slow, hot and power hungry?


No, not true. The Pentium 4's problem was Intel and its useless, bottlenecked NetBurst design. You cannot compare Bulldozer to the P4, ever...


repman244 said:


> It was proven that the benefits are almost none (I'm talking about consumer desktop parts, not servers). AMD has a slow IMC and wold not benefit from more channels if the IMC is slow.


This I agree with: AMD has to speed up its IMC. This is the reason Intel's IMC is more efficient; it's faster. Bulldozer would have performed better, IMO, if they had done a better job with its IMC.


----------



## repman244 (Mar 11, 2012)

nt300 said:


> No, not true. The Pentium 4's problem was Intel and its useless, bottlenecked NetBurst design. You cannot compare Bulldozer to the P4, ever...



You misunderstood my reply; I didn't say it really was Microsoft, hence the question mark at the end.


----------



## nt300 (Mar 11, 2012)

repman244 said:


> You misunderstood my reply; I didn't say it really was Microsoft, hence the question mark at the end.


Woopsi


----------



## repman244 (Mar 11, 2012)

nt300 said:


> This I agree with: AMD has to speed up its IMC. This is the reason Intel's IMC is more efficient; it's faster. Bulldozer would have performed better, IMO, if they had done a better job with its IMC.



Indeed, though BD's IMC is a lot better than the Phenom II's (OCing the CPU/NB brings almost the same performance increase as raising the CPU frequency); it still lacks the speed of SB's IMC.

On the server side, BD isn't looking bad for the future, but right now its performance per watt is really poor, never mind comparing it to the 8-core SB-based Xeons (Anand did a very good review of them).

I remember a lot of people saying that AMD should have just kept shrinking Phenom II (or added 2 more cores to Thuban), but IMO that would have led nowhere: the gains would probably have been only 5-7%, and many people said 8 cores wouldn't be possible within a 130W TDP. The modular design gave AMD some room to try something new, and while BD didn't perform as everyone wanted, I think the future versions (Piledriver etc.) will bring a Phenom -> Phenom II sort of gain.


----------



## Super XP (Mar 11, 2012)

Great point, though I believe Piledriver has a much better chance of improving on Bulldozer, by a larger percentage, than Phenom II had over Phenom I. The issue for Phenom I was the process of the time: the chips would not clock high enough and performed poorly.

AMD revised/respun the silicon with some process modifications, which gained them higher default clock speeds. I believe AMD will genuinely refine the Bulldozer design for Piledriver, so we should easily see a 20% performance boost clock-for-clock over Bulldozer, IMO.

Remember, Trinity at this point is more important than non-APU desktop CPUs, which is why Piledriver has to be much better.


----------



## Aquinus (Mar 11, 2012)

Super XP said:


> Great point, though I believe Piledriver has a much better chance of improving on Bulldozer, by a larger percentage, than Phenom II had over Phenom I. The issue for Phenom I was the process of the time: the chips would not clock high enough and performed poorly.
> 
> AMD revised/respun the silicon with some process modifications, which gained them higher default clock speeds. I believe AMD will genuinely refine the Bulldozer design for Piledriver, so we should easily see a 20% performance boost clock-for-clock over Bulldozer, IMO.
> 
> Remember, Trinity at this point is more important than non-APU desktop CPUs, which is why Piledriver has to be much better.



BD was the first revision of a very new, very revolutionary architecture. You can't expect them to get it perfect on the first revision. With that said, never buy the first revision of a new architecture (hence why the Core Duo and Core Solo were crap)... Intel has been working off that design ever since, with its shorter pipeline; they replaced the FSB with QPI and an IMC, and since then have just kept improving the same architecture.

With all of this said, Piledriver has a lot of wiggle room to improve the architecture's IPC, whereas SB/IVB are starting to top out their IPC because of all the improvements made to the platform over the last 6 years.

I've always been pretty happy with both AMD and Intel as far as processor performance goes, and because Bulldozer is a first-revision chip, I didn't go with it. Also, for all of those who said "the 2600K is cheaper than Bulldozer": quite frankly, my SB-E 3820 is cheaper than the 2600K, and I was able to hit 4.75 GHz without much of an issue; it still has plenty of OC headroom, with load temperatures never exceeding 60°C.

AMD's goal is different from Intel's, and with all the architectural changes, even comparing SB/IVB to BD is almost a stretch.

Just keep in mind that while BD's single-thread performance is a bit on the poor side, each logical thread can at least do the same amount of work, which makes it scale extremely well. Intel's Hyper-Threading doesn't bring that much of a performance improvement, so as AMD improves the IPC and puts more modules on the CPU die, multi-threaded performance will go through the roof.

Finally, I will end my tirade by saying: software isn't always going to be single-threaded.


----------



## nt300 (Mar 11, 2012)

Aquinus said:


> but Intel has been working off that design ever since, with its shorter pipeline; they replaced the FSB with QPI and an IMC, and since then have just kept improving the same architecture.
> 
> With all of this said, Piledriver has a lot of wiggle room to improve the architecture's IPC, whereas SB/IVB are starting to top out their IPC because of all the improvements made to the platform over the last 6 years.


Now this is interesting. AMD's HyperTransport and IMC on the Athlon 64 are what pushed Intel into coming out with QPI and an IMC of its own.

The way I see it, AMD already had a first Bulldozer CPU several months before the launch which failed to impress them, so they postponed it in favour of the more refined version we have today. So Piledriver could well be the 3rd refinement. I also heard AMD was working on Piledriver to correct current Bulldozer deficiencies about 4 months prior to today's Bulldozer release; this is suggested by Bulldozer's timing and the 32nm manufacturing schedule.
AMD thought the Bulldozer they released was good enough for now and stands up well enough against the competition, and while feeding this version to people they continue to refine the design in time for Piledriver.

This is of course speculation on my part, from what I've taken from the press releases, awkward release dates, and the weird 32nm manufacturing timings.

Bulldozer is not at all a bad CPU, just hyper-hyped years before its launch. Before I upgrade I'm going to stick it out until Piledriver, then make a choice between AMD and Intel. But I like AMD's cheaper overall platform pricing, and they do just fine in gaming.


----------



## Aquinus (Mar 11, 2012)

nt300 said:


> Bulldozer is not at all a bad CPU, just hyper-hyped years before its launch. Before I upgrade I'm going to stick it out until Piledriver, then make a choice between AMD and Intel. But I like AMD's cheaper overall platform pricing, and they do just fine in gaming.



I agree, it's not a bad platform at all. It just benefits server-like applications more, which makes Interlagos a beast of a server CPU. Plus, with the acquisition of ATi, AMD has video card sales helping it out, which buys them time to improve BD without going bankrupt.

With all of this said, my 3820 is much faster than the 8150 in many cases and costs less, but on the flip side, I paid more for the motherboard than I did for the CPU. Cost-wise, AMD is still the option to go with, but if you want the latest and greatest, Intel is the *current* choice. That could change in the next year or two, though.


----------



## symmetrical (Mar 15, 2012)

Other than the performance issues, the problem with Bulldozer is its insanely high power consumption when overclocked. When I had the FX-8120 overclocked to a stable 4.6 GHz and played Battlefield 3 on my overclocked GTX 580, I think it overwhelmed my 650W power supply, and I kept getting constant crashes. I had already tested for stability, and no matter what I did, the crashes continued. Not until I ran both at stock settings did it run normally.

Now that I have an i7-2600K @ 4.4 GHz with the same GTX 580 OC, I can game for hours with no issues.


----------



## Daimus (Mar 19, 2012)

symmetrical said:


> Other than the performance issues, the problem with Bulldozer is its insanely high power consumption when overclocked. When I had the FX-8120 overclocked to a stable 4.6 GHz and played Battlefield 3 on my overclocked GTX 580, I think it overwhelmed my 650W power supply, and I kept getting constant crashes. I had already tested for stability, and no matter what I did, the crashes continued. Not until I ran both at stock settings did it run normally.
> 
> Now that I have an i7-2600K @ 4.4 GHz with the same GTX 580 OC, I can game for hours with no issues.



Did you not think about purchasing a more powerful PSU instead of changing the whole platform?


----------



## symmetrical (Mar 20, 2012)

Daimus said:


> Did you not think about purchasing a more powerful PSU instead of changing the whole platform?



Yes, I considered it. I was looking at some 850W PSUs, but I figured I'd spend about the same amount of money if I sold my mobo and the FX-8120 and just switched entirely to the Intel platform.

$134 for a Corsair TX850W

Or

Sell my FX-8120 setup with the mobo for $250.

i7-2600K - $279
P8Z68 - $129

Total: $408 - $250 = $158.

So yes, I went with the latter, and now I have the 2600K, which is a bit faster and more efficient than my old FX-8120.

I actually liked the performance of the FX-8120, but in my case it was better to just switch entirely.


----------



## Daimus (Mar 20, 2012)

You could have sold your PSU and bought a more powerful one with minimal loss of money; that would have been the logical move. But you wanted to change the entire platform. FX does consume more power when overclocked, and that had to be taken into account in advance.


----------



## D3MoNR3N0 (May 27, 2012)

*8150 works for emulators great*

I run Dolphin at full speed, plus ePSXe, PCSX2, MAME, NeoRAGEx, Project64, and many others, and all types of games on ultra: WoW, Diablo 3, Assassin's Creed, Aion, StarCraft 2, RE5, RE: ORC, Street Fighter X Tekken, Dragon Age, Second Life. I am in love with this CPU, especially for the price.


----------



## Aquinus (May 27, 2012)

symmetrical said:


> Other than the performance issues, the problem with Bulldozer is its insanely high power consumption when overclocked. When I had the FX-8120 overclocked to a stable 4.6 GHz and played Battlefield 3 on my overclocked GTX 580, I think it overwhelmed my 650W power supply, and I kept getting constant crashes. I had already tested for stability, and no matter what I did, the crashes continued. Not until I ran both at stock settings did it run normally.
> 
> Now that I have an i7-2600K @ 4.4 GHz with the same GTX 580 OC, I can game for hours with no issues.



Clearly you've never overclocked a 1366 or 2011 rig. My rig at full power (CPU + GPUs), with only a 4.5 GHz overclock on the CPU, will draw over 500 watts, but that doesn't mean I would trust a 550W or 650W PSU for it, even though it draws under the rated maximum. Also, just because your computer was crashing doesn't mean it was Bulldozer's fault rather than your overclock's. SB overclocking is honestly easy mode; with BD you have a lot more to consider, because most of the time you're not just bumping the vcore and adjusting the multi.

Now, I'm not saying all this to claim AMD has the better chip, I'm not, but I think you're exaggerating how bad BD is. I've never had an issue with a single AMD or Intel CPU I've owned. None at all, overclocking or otherwise.


----------

