# Why Bulldozer's spotty performance is good news.



## twilyth (Oct 15, 2011)

This is really pretty obvious if you think about it.  What was BD designed to do?  Run 8 threads in real time doing integer calculations.  Which benchmarks does it perform best at?  Multi-threaded integer applications.  What types of workloads do servers mostly deal with?  Yup . . . you guessed it.
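The kind of workload being described, integer-heavy work that splits cleanly across many threads, is easy to sketch. This is just a toy illustration in Python (the chunking scheme and the numbers are mine, not anything AMD ships): an integer job divided across worker processes, the embarrassingly parallel case an 8-thread chip is built for.

```python
# Toy sketch: an integer-heavy job split across worker processes,
# the kind of workload the post says Bulldozer targets.
# Hypothetical example for illustration only.
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum of squares over [lo, hi): pure integer ALU work."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_squares(n, workers=4):
    """Split [0, n) into one chunk per worker and sum the partial results."""
    step = n // workers
    chunks = [(w * step, (w + 1) * step if w < workers - 1 else n)
              for w in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    n = 100_000
    # The parallel split must agree with the serial answer.
    assert parallel_sum_squares(n) == sum(i * i for i in range(n))
    print("parallel and serial results match")
```

A chip with more hardware threads finishes a job like this faster roughly in proportion to its thread count, while single-threaded code sees no benefit at all, which is the whole argument of the thread.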

www.pcworld.com/businesscenter/article/241961/amds_bulldozer_disappoints_why_thats_good_news.html


> AMD's latest-and-greatest chip may lag slightly behind Intel’s competing Core i5, as initial PCWorld performance-testing indicates. But these disappointing results hide benefits that AMD's "Bulldozer" FX CPU will likely offer, especially for cost-conscious small businesses.
> 
> The issue is that most CPU-performance tests don't reflect the potential computational power offered by FX, which has up to eight cores, depending on the version. Sure, computationally-wise, preliminary synthetic tests, such as PCMark 7 and Cinebench, reflect real-world computing performance and indicate that the FX lags in comparison with Intel’s Core i5. That's what PCWorld's tests showed after running the four-core FX-4100 through the paces.
> 
> ...



More at link.

edit:  I guess this is just the sort of rationalization you would get from a fanboy, so maybe there's no point in denying it.  It does make sense to me but then the chick you pick up at last call never really looks all that bad either.


----------



## HTC (Oct 15, 2011)

twilyth said:


> speed from 3.6GHz to 3.6GHz.



There's a typo there, but it seems it's from the source.

Interesting.

Bulldozer and Interlagos are supposed to be the same processor with different targets: the former being desktop and the latter being server, correct? That being the case, if Interlagos turns out to be very good, what will it mean for Bulldozer?


----------



## twilyth (Oct 15, 2011)

Probably not much, I'm afraid.  People who are knowledgeable will look at benchmarks that reflect how they use their computers.  So until more software is intensively multithreaded, I don't think most users are going to see any reason to pick BD.

However, as the article points out, software will continue to move in that direction, so if you plan to hold on to a rig for 3 or 4 years it might be a consideration.  Right now, though, the main draw will be price/performance, which is about on par with the 2500.


----------



## lilhasselhoffer (Oct 15, 2011)

It can go from 3.6 GHz to 3.6 GHz.  Wow, that's a revelation...


Joking aside, duh.  Our processor designed for multi-threaded applications runs multi-threaded applications well.  Give that marketing chimp a cookie.

If I buy a consumer chip it should perform well in a consumer environment.  The converse is true in a server/work station environment.


The failure isn't the architecture, it's the chimps who didn't think about the target market before releasing the product...  All of this seems convenient, given the management shake-up and damage-control mode at the top of the company.


----------



## hat (Oct 15, 2011)

But BD isn't a server chip, it's a desktop chip. AMD has had server chips with loads of cores before BD, like the "Magny-Cours" chips. I think this is a moot point for the majority of users, since most software still doesn't take good advantage of multithreading yet.


----------



## Lionheart (Oct 15, 2011)

I feel like AMD bulldozer has hidden potential, I guess we just have to wait and see


----------



## twilyth (Oct 15, 2011)

lilhasselhoffer said:


> It can go from 3.6 GHz to 3.6 GHz.  Wow, that's a revelation...
> 
> 
> Joking aside, duh.  Our processor designed for multi-threaded applications runs multi-threaded applications well.  Give that marketing chimp a cookie.
> ...


Yup.  I'm sure that's part of it too.  Plus I read someplace that BD was really supposed to have launched a couple of years ago.  If it had, that would have been something special.  Now it's like buying a 2008 car and trying to pretend it's the latest and greatest.


----------



## hat (Oct 15, 2011)

Sure, pretty much any 2011 tech just out the door would have been something special back in 2008.


----------



## Goodman (Oct 15, 2011)

I couldn't give a rat's ass about the future. Would you still use a single-core A64 on Socket 754 or 939 (future-ready back then) just because now we can all use "good" software and Windows Vista/7 64-bit?

Sorry! But what comes out now has got to be good to use right now, not in 2 years when it will be close to obsolete or replaced by something better...

Besides, Interlagos is not doing much better than the Opteron-based CPUs, from what I've heard.
Anyhow, there's no point denying BD is 70% fail. Long live PII.


----------



## lilhasselhoffer (Oct 15, 2011)

hat said:


> Sure, pretty much any 2011 tech just out the door would have been something special back in 2008.



Development started in 2006, with a targeted release of 2008.  Delays, fighting Intel, and money crunches crippled AMD.

Now, in 2011, the product is released.  Pit a C2Q against Bulldozer and you've got a hell of a competition.  The threading issue is still there, but it would have been a level playing field.  SB eats C2, and you can understand where that leaves BD.

I really wanted BD to live up to what AMD said, but it doesn't.  What it is can be described in one word: disappointment.


----------



## Inceptor (Oct 15, 2011)

Well... pit a C2Q against an AM3 Phenom II x4, and you've got a hell of a competition, in fact, the Phenom comes out on top.  
Overclock an AM3 Phenom II x4, and it can match a first gen i5 in some benchmarks, not all, just a few.

So... pit a BD against a first gen quad core i7, and you've got a hell of a competition, in fact, the BD comes out on top or equal in many benchmarks.
Overclock a BD, and it can match or overtake a second gen i5 in many benchmarks.

Objectively, the only problems it has are its power draw when overclocked, and the instruction scheduling errors from the OS side.

Are you guys seeing a pattern here? I am.
It may be a public relations and marketing fiasco with enthusiasts, but it seems they're still just as far from Intel's performance as they were in the past.
This is obviously the fall-back position.  They were gunning for an earlier release, but something happened; whatever it was doesn't matter.  They had to delay, but, at least in hindsight, they could afford to delay, still release the thing, and not be any further behind Intel in performance than usual.

I don't know... that's just my guess as to what they were thinking.


----------



## theeldest (Oct 15, 2011)

Lionheart said:


> I feel like AMD bulldozer has hidden potential, I guess we just have to wait and see



Agreed. I'd actually like to see how this works in a VMware ESXi cluster. I've seen in a couple of reviews that BD got decent performance increases from Windows 8 because it has better thread management.

ESXi is all about resource scheduling in massively multithreaded environments. Historically Intel has won the VMware benchmarks, but I think BD can compete here.


----------



## Altered (Oct 15, 2011)

I must be confused as hell. I've read on this site that it needs Windows 8 to reach its full potential.
I also read it was supposed to be released at least a year ago.

Had the chip been released a year ago, we would have had to wait 2 years to see the chip run as it should? Just what was a guy to think?



Inceptor said:


> Well... pit a C2Q against an AM3 Phenom II x4, and you've got a hell of a competition, in fact, the Phenom comes out on top.
> Overclock an AM3 Phenom II x4, and it can match a first gen i5 in some benchmarks, not all, just a few.
> 
> So... pit a BD against a first gen quad core i7, and you've got a hell of a competition, in fact, the BD comes out on top or equal in many benchmarks.
> Overclock a BD, and it can match or overtake a second gen i5 in many benchmarks.



And then I read this, and the best I remember, the Core 2 Quad beat up the Phenom II x4 pretty good, unless you're only counting gaming, where most of these quad-core chips perform so similarly that it isn't worth factoring in the tenths of a frame.
As far as BD against an i7: AMD's fastest offering isn't able to match an i7 chip from what I have seen.

I really wanted to see this BD take the crown but I am either missing something or "spotty performance" is *not* good news.


----------



## Inceptor (Oct 15, 2011)

If you read carefully, you'll see I said _AM3_ Phenom II _x4_.  Not the older AM2+ models and not any x3 or x2 models.
And I said _first generation quad core i7_, not current second generation i7, not first generation Gulftown i7.

:shadedshu


----------



## John Doe (Oct 15, 2011)

Inceptor said:


> If you read carefully, you'll see I said _AM3_ Phenom II _x4_.  Not the older AM2+ models and not any x3 or x2 models.
> And I said _first generation quad core i7_, not current second generation i7, not first generation Gulftown i7.
> 
> :shadedshu



Regardless, your original statement is highly unlikely to be true. Besides, pretty much the only difference between AM3 and the AM2+ Phenom 2 was the DDR3 controller. As you can see, Deneb has worse per-clock performance than Yorkfield:

http://www.hardwarecanucks.com/charts/index.php?pid=61,76&tid=3

With that aside, Deneb OCs worse than Yorkfield. So what AMD did was target it towards budget buyers, unlike Intel, who milked the Core 2 Quads. So from here, you cannot expect BD to perform like a Nehalem. At most, I would expect it to perform between Yorkfield and Ibex Peak (old i5). As somebody who has owned both an i7 870 and an X4 975, I have not seen a big difference between the two. But if you were to talk about pure performance, then C2Q, i5, i7, and Sandy all beat the Phenom 2, and chances are most of them will beat BD, whether both chips are OCed or not.


----------



## billcat479 (Oct 15, 2011)

To be released in 2008, and now? What type of software does it work with today? Just because it wasn't released back then doesn't mean they quit working on it, or that something is just out of whack.

You'd have to take a time machine back to see what their roadmap really was in 2006, because they would have started the design work by then at the latest. I sure don't remember hearing a whole lot about it back then, but maybe that's just my bad memory. But anything like this design being posted back then? I doubt it very much. Please post a link so we can take a look.

From the testing, it's pretty clear it's been built, or totally reworked, for a new set of software specs that weren't out then. Or aren't out even now, so take your time machine back and figure out that perplexing line of logic, please. Past, present, or future, what's out is out; what it's built to do is still being done and is going to be done more in the future. So either Microsuck or AMD, or both, are totally out of phase with each other on software releases, compilers, and optimizations.

AMD has put out some head-scratching CPUs when there was no present need for what they did. It started with the 64-bit CPU, back when 64-bit home OS software wasn't out yet but is now. They just kept the 32-bit part working well until it started to come together, and kept adding more cores to their designs. And back then, when people had time to work with them, AMD was praised for their design. I think a bit of that has to be applied to this new CPU. It's really hard to know what it can really do. It's kind of like making a product to influence the future. Intel said "why bother," and then copied AMD while still saying "why bother," until their dual sort-of cores came out. Funny.

AMD's timing has a lot of question marks. They're stuck with a smaller R&D budget, and they have to wait on the fabs to get the projected dies working before they can make them. It's different when you've split off your own production and have to work with what came out of that. They don't have it as easy as Intel, so they try to predict ahead of time how computers will be used and have long-range plans worked out before they finish and produce. Risky in the short term, but if they're right, cheaper in the long run. They have to think over new designs a lot more carefully than Intel does; that part is obvious.

It has been stated in pretty much all the tests that this design doesn't fit today's software very well, but tomorrow's. Which doesn't sound that good to the people with a lot of money to waste buying a new system every year, just so they can say that, at least on paper, their computer is faster than yours now; why buy something when you couldn't care less whether it runs faster half a year or a year from now? But if people buy a computer to last 2 years, and most of them do, then AMD makes a lot of sense, MAYBE. The maybe part is the hard one to buy into unless you know how it will work with software optimized for its design. All CPUs depend on this aspect to do their best.

So do you think the folks at AMD are that stupid? Do they turn a blind eye to current and coming software uses? That's an easy no. In some respects they might have taken a better route before: it was their first Athlons that made the jump to shorter pipelines and more efficient CPUs, while Intel was still pushing the "MHz is best" super-long-pipeline CPU that couldn't be cooled. AMD has sort of taken the opposite path with parts of this new CPU, which is not good. Maybe this is why they have a new CEO now, while the last one couldn't sit for what they were doing? Who knows. Lots of maybes here and there. It can only be judged by what it can do, and why, and whether it's a CPU that will get faster as it ages, like their first Athlon run did.

The really funny part of all this is that, in a way, we're back in time: AMD has put out a CPU that needs newer software, which is already showing up in programs and future OSes, and which will continue to be coded with better enhancements that could, could maybe, make their CPU get faster as time catches up with it. It's a gamble. The sites that took this approach did show how good this CPU can be. With the right coding it's a pretty fast CPU. In a way, it's nice to see products besides wine that get better with age. It makes it easy to skip an upgrade for the next 2 years by buying into this new CPU. Most computer users don't hack up their computers every 6 months; only the benchmark-led fanatics do that, and that's a very small percentage of computer owners.

This is really what it boils down to, isn't it: what you see when you use the computer, versus what you THINK you should see because of the black-and-white world of benchmarks. Almost any current brand is so fast now that it doesn't really matter what CPU you're running; you can run any game or app at home and you'd never know the difference between them until you looked inside. That's the most funny and insane part of the computer benchmark world: it's more applicable to non-standard software use than to home systems. Saying "yeah, I bought this CPU because I only run 2 programs on it and it's twice as fast at them" is funny. And not realistic.

It's exciting to look at new things, except when you can't see everything the new thing can do, and a redesigned CPU can fit in that category. It's like watching a Ferrari being tested on a 100 ft chunk of roadway. Pointless. I'm going to give it more time and keep an open mind while the testing gets ironed out. It's way too early for this new a design to draw final opinions. So far it isn't that bad, but it needs more tests and more info on what software optimized for it can really do. So far they've only shown a small part of what it can do, or will never do well.

AMD's timing really sucks..


----------



## Fourstaff (Oct 15, 2011)

Two things to note:

Bulldozer excels at multithreaded loads; guess what server loads are?

Bulldozer sucks at single-threaded loads; guess what consumer loads are?

Bulldozer looks almost like it's geared to server use, with APUs like Llano for normal desktop use, leaving a rather big hole in the middle for people like us, who demand both excellent single-threaded and multithreaded performance, or at least not a bad compromise.


----------



## BarbaricSoul (Oct 15, 2011)

Inceptor said:


> Well... pit a C2Q against an AM3 Phenom II x4, and you've got a hell of a competition, in fact, the Phenom comes out on top.



A PII at the same clock speeds as my C2Q DOES NOT beat my C2Q. Oh, and don't mention OC'ing; my C2Q can do 4.5GHz on air. Let's see a PII beat a Q9*50 at 4GHz+. I'll dismiss the rest of your statement as fanboyism.


----------



## John Doe (Oct 15, 2011)

lol. Though, keep in mind that S775 is EOL with its still-jacked-up prices. Denebs and especially Thubans give solid price/performance. If you are on a tight budget, there is no option other than AMD for a quad on the cheap, well, unless you buy second-hand.


----------



## qubit (Oct 15, 2011)

Lionheart said:


> I feel like AMD bulldozer has hidden potential, I guess we just have to wait and see



I think it has the same kind of "potential" that the HD 2900 XT had against the 8800 GTX. Very good specs on paper, but it never did beat it.


----------



## BarbaricSoul (Oct 15, 2011)

John Doe said:


> lol. Though, keep in mind that S775 is EOL with it's still jacked up prices. Denebs and especially Thubans give solid price/performance. If you are on tight budget, there is no other option than AMD for a quad on the cheap, well unless you buy second hand.



I never suggested spending the money on a new Q9650 now. Only way I would suggest that would be to someone with a C2D system that doesn't have the money for a full upgrade, and then I would only suggest a used Q9650 for $200 or less.


----------



## Fourstaff (Oct 15, 2011)

BarbaricSoul said:


> A PII at the same clock speeds as my C2Q DOES NOT beat my C2Q. Oh, and don't mention OC'ing, my C2Q can do 4.5ghz on air. Let's see a PII beat a Q9*50 at +4ghz. I'll dismiss the rest of your statement as fanboism



Neither; comparing clock for clock isn't the way to do it. You have to measure twice, once at stock speeds and once at the average max sustained overclock. The former reflects the out-of-the-box experience felt by non-techies, the latter people like us. C2Q comes out on top in both metrics, I believe, but I don't like the way you jump straight to calling people fanboys while preaching a faulty metric. And then there's price/perf, which evens things out, depending on budget.


----------



## John Doe (Oct 15, 2011)

BarbaricSoul said:


> I never suggested spending the money on a new Q9650 now. Only way I would suggest that would be to someone with a C2D system that doesn't have the money for a full upgrade, and then I would only suggest a used Q9650 for $200 or less.



Yeah, I know. That is what I would do, too. The Yorkfield is still a great chip. But if you are buying new, then your only option for an OCable, cheap quad is AMD, since the non-K i5s are locked. The i5s still give decent performance, though, while coming in more expensive.


----------



## BarbaricSoul (Oct 15, 2011)

Fourstaff said:


> Neither is comparing clock for clock, you have to measure it twice, once at stock speeds, and another at average max sustained overclock. The former for out of the box experience felt by non-techies, the latter for people like us. C2Q comes out tops in both metric I believe, but I don't like the way you jump straight to calling people fanboys while preaching a faulty metric. And then there is Price/Perf, which evens things out, depending on budget.



How is my metric faulty? What I basically said is that a PII @ 3GHz does not beat my C2Q @ 3GHz; it also does not beat my C2Q with both clocked @ 3.6GHz, or both clocked @ 4GHz. The actual speed they are clocked at doesn't matter to what I said, as long as both are clocked at the same speed. The fanboy comment was from the impression he gave me by saying the PII does beat the C2Q, and that BD with OC'ing can beat a 2nd-gen i5 if the i5 isn't OC'ed. He isn't putting both platforms on equal terms. That's why I saw his comments as fanboyism.

And no, I'm not a fanboy myself. Like everyone else, I want BD to be great, and I was considering using one to replace my C2Q, but now that the real performance of BD is known, there's no way I would buy one over an i5 or i7.


----------



## Fourstaff (Oct 15, 2011)

BarbaricSoul said:


> How is my metric faulty? What I basically said is that a PII @ 3GHz does not beat my C2Q @ 3GHz; it also does not beat my C2Q with both clocked @ 3.6GHz, or both clocked @ 4GHz. The actual speed they are clocked at doesn't matter to what I said, as long as both are clocked at the same speed. The fanboy comment was from the impression he gave me by saying the PII does beat the C2Q, and that BD with OC'ing can beat a 2nd-gen i5 if the i5 isn't OC'ed. He isn't putting both platforms on equal terms. That's why I saw his comments as fanboyism.
> 
> And no, I'm not a fanboy myself. Like everyone else, I want BD to be great, and I was considering using one to replace my C2Q, but now that the real performance of BD is known, there's no way I would buy one over an i5 or i7.



Because you don't run both at the same frequency: stock, it's the Q9550's 2.83GHz vs the 955's 3.2GHz (for the non-overclocking crowd), and then ~4GHz for the Q9550 and about the same (or a bit less) for the 955 when overclocked. You don't go about electronically limiting your Mustang to 100mph and then claim that your Toyota Prius goes faster because it is not limited to 100mph: you have to take two readings, one with limiters on (whatever they are; in this case corresponding to stock), and then another without limiters (overclocked). Or limit the revs to 2000rpm, knowing that the Mustang's optimum revs are much higher than that, while the Prius is closer to its optimum.

Edit: I am not saying that you are wrong, just that your view is skewed towards better clock-for-clock performance, which died with the P4, when Intel needed 1.5x the frequency to equal AMD's offering.


----------



## BarbaricSoul (Oct 15, 2011)

Fourstaff said:


> Because you don't run both at the same frequency: stock, it's the Q9550's 2.83GHz vs the 955's 3.2GHz (for the non-overclocking crowd), and then ~4GHz for the Q9550 and about the same (or a bit less) for the 955 when overclocked. You don't go about electronically limiting your Mustang to 100mph and then claim that your Toyota Prius goes faster because it is not limited to 100mph: you have to take two readings, one with limiters on (whatever they are; in this case corresponding to stock), and then another without limiters (overclocked).



But that's my point: he's trying to compare a BD with OC'ing (no limits) to an i5 with no OC'ing (limited). How is what I said any more wrong than what he said? Also, from your earlier post about price/performance, shouldn't we also add in the increased electricity cost BD has to incur to even come close to i5 performance, as it will cost more to run a BD chip than a 2nd-gen i5 or i7?


----------



## Fourstaff (Oct 15, 2011)

BarbaricSoul said:


> But that's my point: he's trying to compare a BD with OC'ing (no limits) to an i5 with no OC'ing (limited). How is what I said any more wrong than what he said? Also, from your earlier post about price/performance, shouldn't we also add in the increased electricity cost BD has to incur to even come close to i5 performance, as it will cost more to run a BD chip than a 2nd-gen i5 or i7?



Yes, there is a lot more to add to that, but the fact remains that you are using a faulty measurement to prove your point, and in my eyes you are no better than him (and you get minus points for calling people fanboys, which is why I was annoyed at you and consequently attacked you).


----------



## John Doe (Oct 15, 2011)

Fourstaff said:


> Or limit the rev to 2000rpm, knowing that the Mustang's optimum rev is much higher than that, while Prius is closer to its optimum rev.



Hmm... a Mustang at 2000 RPM or a Prius? The Prius should take off after 3rd gear. Oh well, it is a Ford. It will mess the thang up anyway.


----------



## BarbaricSoul (Oct 15, 2011)

Well, I didn't take it as an attack; this isn't an argument to me. Just a friendly debate.

I do see what you're saying, and yes, I will concede that a PII stock-clocked at 3.7GHz does outperform a C2Q at 3GHz. The main point I was trying to make is that no matter what speeds a PII or a BD chip is clocked at, it does not outperform the equivalent Intel chip when compared on performance/GHz or performance/core. And that is the performance that matters to me as a consumer.


----------



## Dent1 (Oct 15, 2011)

Altered said:


> And then I read this and the best I remember the Core 2 Quad beat up the Phenom II x4 up pretty good unless your only counting gaming, where most of these quad-core chips perform so similarly that it isn’t worth factoring in the tenths of a frame.



Gaming aside, the original Deneb Phenom IIs with the C2 stepping (4MB L3) and the Core 2 Quads traded blows very evenly.  But as time progressed and the Phenom IIs got tweaked to 6MB of L3 cache/95W with the C3 stepping, the Phenom IIs pulled ahead a fair bit, although the Core 2 Quad still held its own.


----------



## BarbaricSoul (Oct 15, 2011)

Dent1 said:


> Gaming aside, the original Deneb Phenom IIs (4MB L3/125W) and Core 2 Quads traded blows very evenly.  But as time progressed and the Phenom IIs got tweaked to 6MB L3 cache/95W, the Phenom IIs pulled ahead a fair bit



That depends on which C2Q you're comparing the PII to. The only way the PII pulled ahead of the high-end C2Qs was higher clock speeds. Intel moved on to the i series, while AMD was stuck with the PII. If Intel had wanted to increase the clock speeds of the C2Qs instead of moving on to a newer architecture, the PII wouldn't have had a chance at beating the C2Q. Like I've said all morning long, at EQUAL CLOCK SPEEDS, a PII does NOT outperform a C2Q Q9*50.


----------



## John Doe (Oct 15, 2011)

Dent1 said:


> Gaming aside, the original Deneb Phenom IIs with the C2 stepping (4MB L3) and Core 2 Quads traded blows very evenly.  But as time progressed and the Phenom IIs got tweaked to 6MB L3 cache/95W with the C3 stepping, the Phenom IIs pulled ahead a fair bit, although the Core 2s still held their own.



If we're talking about Kentsfield, then yeah, they do perform similarly. However, Penryn (Yorkfield) outdoes any Phenom 2 in single- or quad-threaded performance. It also has better performance-per-watt than the top-end Denebs (95W against 140W). Still, things are, and have always been, cheaper on AMD's end. Also, mobos with nVidia chipsets blow on the Intel side, so AMD has an SLI mobo advantage as well. Bit of a toss-up.


----------



## Bo$$ (Oct 15, 2011)

I think we will see some sort of boost in apps like Sony Vegas that can really use multiple threads... I am actually liking Bulldozer quite a bit. But Intel is very strong; for me it is price rather than outright performance in one particular application.


----------



## Lionheart (Oct 15, 2011)

Ok, we get it, C2Q is better than PII :shadedshu. Who cares, I thought this was a Bulldozer thread.


----------



## Super XP (Oct 15, 2011)

Goodman said:


> I couldn't give a rat's ass about the future. Would you still use a single-core A64 on Socket 754 or 939 (future-ready back then) just because now we can all use "good" software and Windows Vista/7 64-bit?
> 
> Sorry! But what comes out now has got to be good to use right now, not in 2 years when it will be close to obsolete or replaced by something better...
> 
> ...


I disagree, though what you probably mean is that what comes out now should be good now and for the future; it's called balance.

I will say this again: look at benchmarks with retail copies of Bulldozer; they differ from the original benchmarks released on Oct 12, 2011. Why is this? Why is the retail Bulldozer performing better? Something is up here, and with more tweaking and some love, AMD's FX line should do what it was meant to do: Bulldoze the competition.


----------



## xenocide (Oct 15, 2011)

You can claim Bulldozer is a "Server CPU" all you want, but it just sounds like excuses to justify poor per-core performance.  Sure, it does great in heavily threaded applications, but those are so few and far between it's hardly worth mentioning when you consider they still marketed this as a Consumer (even Enthusiast) CPU.


----------



## Super XP (Oct 15, 2011)

xenocide said:


> You can claim Bulldozer is a "Server CPU" all you want, but it just sounds like excuses to justify poor per-core performance.  Sure, it does great in heavily threaded applications, but those are so few and far between it's hardly worth mentioning when you consider they still marketed this as a Consumer (even Enthusiast) CPU.


Bulldozer just needs the right benchmarks to show off its real performance. That said, AMD needs to release a revision ASAP to ensure it does what it was meant to do: perform like a Bulldozer. Hopefully they can iron out its issues and get back into the game.



----------



## de.das.dude (Oct 15, 2011)

With all these people buying Intel, why do you think software companies will design software around Bulldozer's architecture?


----------



## Dent1 (Oct 15, 2011)

BarbaricSoul said:


> That depends on which C2Q your comparing the PII to. The only way the PII pulled ahead of the high-end C2Qs is higher clock speeds. Intel moved on to the I series, while AMD was stuck with the PII. If Intel wanted to increase the clock speeds of the C2Qs instead of moving on to newer architecture, PII wouldn't have a chance at beating the C2Q. Like I've said all morning long, at EQUAL CLOCK SPEEDS, a PII does NOT out perform a C2Q Q9*50.



I was saying that the C3 Phenom IIs outperformed the C2Q on average as time progressed. Obviously, the tests where the Phenom II was triumphant were only slight and negligible wins, often within the margin of error, as the two opposing architectures were very similar performers.

Yes, Intel could have increased the clock speeds, but considering that the Phenom II X4 offerings were just as fast and sometimes faster, OC'd just as far, were cheaper, and had AM2/AM2+/AM3 backward compatibility, AMD would look more attractive to the educated buyer. Also, a Core 2 Quad with souped-up clocks wouldn't look attractive compared to a cheaper Phenom II X6 Thuban. Intel's move to the Core i series made Intel seem attractive again.


----------



## xenocide (Oct 15, 2011)

de.das.dude said:


> With all these people buying Intel, why do you think software companies will design software around Bulldozer's architecture?



Nobody gave NetBurst a break, or catered directly to Hyper-Threading.  Software companies aren't going to develop for obscure changes when 90%+ of the market is still using the same design.  What company is really excited to double the number of threads its software uses when a good number of its customers don't even use that many?



Super XP said:


> *Bulldozer just needs the right benchmarks to show off it's real performance*. That said, AMD needs to release a revision ASAP to ensure it does what it was meant to do, perform like a Bulldozer. Hopefully they can iron out it's issues and get back into the game.



If you make the test biased, of course they will win lol.  The problem is that overall they don't outperform similarly priced offerings from Intel in a majority of tasks.


----------



## cadaveca (Oct 15, 2011)

de.das.dude said:


> With all these people buying Intel, why do you think software companies will design software around Bulldozer's architecture?



Of course they will. AMD and Intel BOTH are going to be bringing CPUs with like 20 cores to the desktop space in the next couple of years.


I kinda gotta agree with the sentiment offered by the article writer in the OP... devs need good multicore chips with true cores to write and test software on, because, as mentioned, both Intel and AMD are increasing core counts. Bulldozer provides an affordable option for devs NOW, so that their future software can be written NOW.


All I can think is that a BD-based design with a good GPU in it, for consoles, would provide a core platform similar to what the PS3 offers, but in x86, and with truly powerful cores. This makes me get a bit excited for the next generation of gaming consoles.


----------



## xenocide (Oct 15, 2011)

cadaveca said:


> *All i can think of is a BD-based design with a good GPU in it, for consoles, would provide a similar core platform like the PS3 does*, but in x86, and with truly powerful cores. This makes me get a bit excited for the next generation of gaming consoles.



Any developer would hate that.  The PS3 had a more powerful CPU on paper, but it was such a nightmare to program for that most developers still have issues with it.  I guarantee most next-gen consoles will still use IBM CPUs, probably based on a more consumer-friendly version of the platform Watson's hardware used.


----------



## FordGT90Concept (Oct 15, 2011)

twilyth said:


> What types of work loads do servers mostly deal with?


FLOPs! The FPU is always the bottleneck, not the ALU.


----------



## cadaveca (Oct 15, 2011)

xenocide said:


> Any developer would hate that.  The PS3 had a more powerful CPU on paper, but it was such a nightmare to program for that most developers still have issues with it.  I guarantee most next-gen consoles will still use IBM CPU's, probably based on a more consumer-friendly version of the platform Watson's hardware used.



I understand, xenocide, but for me, something providing a challenge just makes me rise to the occasion, not bitch about it.

I think the nightmare of the Cell Processor was that it wasn't running x86 code, where most devs are far more comfortable.


You're probably right about the console hardware, but I think it'd be a mistake for them NOT to be running either x86 or x64 code, for compatibility reasons. It doesn't make sense for OEMs that are designing not just consoles, but lifestyles, to be using many devices all running different code when they do not have to.


----------



## Fourstaff (Oct 15, 2011)

BarbaricSoul said:


> I do see what your saying, and yes I will concede that a PII stock clocked at 3.7 ghz does out perform a C2Q at 3ghz. The main point I was trying to make is that no matter what speed's a PII or a BD chip is clocked at, it does not out perform the Intel equivalent chip when compared performance/GHZ or performance/core. And that is the performance that matters to me as a consumer.



I can see your point too, but it's lucky that the PII and C2Q have more or less the same clock ceiling. If the PII had a ceiling of about 8GHz, compared to the C2Q's 4GHz, then with the current work done per clock cycle per core the PII would absolutely demolish the C2Q and then some.


----------



## techtard (Oct 15, 2011)

Please take the fanboy slapfight to a Phenom vs C2Q thread. This is (yet another) Bulldozer thread.


----------



## Bo$$ (Oct 15, 2011)

Even if this isn't as good as expected, the overall platform is cheaper. I think it will be pretty good in a few years as games and apps get more threaded.


----------



## Disparia (Oct 15, 2011)

This is easy then... stop thinking like a bunch of single task consumers 

I can't be the only one here who puts a rip to transcode, opens up a game on monitor 1 while a show/movie plays on monitor 2. ...and that's just my instance on my multi-instance home mainframe for my wife and two little users 

The first part is true, the last part is where I'm heading. GPU virtualization is here and will continue to get better. That was really the last part needed to make it happen as CPU power, RAM density, PCIe lanes and storage I/O are plentiful/powerful.


----------



## Fatal1ty39 (Oct 15, 2011)

Found this video of the AMD FX-8 vs Intel Core i5 2500K:
http://www.engadget.com/2011/10/12/amd-fx-processor-brings-eight-cores-to-battle-we-go-eyes-on-vi/


----------



## cadaveca (Oct 15, 2011)

I love to see Macci.  Thanks!

Pretty funny to hear Sacha say that Bulldozer will be a catalyst for multithreaded programming for devs. I just said that myself!


----------



## Altered (Oct 15, 2011)

Bo$$ said:


> even if this isnt as good as expected, the price of the overall platform is cheaper. i think it will be pretty good in a few years as games and apps get more threaded



I'm not sure you added in the extra cost of Windows 8 for it to perform properly. 

And this is where the BD issue arises. I don't think buying anything, be it a car, boat, TV, or PC hardware, only to get partial use of it the day you take it home is what any consumer considers a deal. And I sure don't see any way it's justifiable for a company to ask you to pay top dollar for something performing below the other offerings in its price range, on the hope that something from an entirely different company may let it run as it should. I can see enthusiasts doing this, but AMD and Intel do not stay in business on that slim % of buyers.

Hopefully a revision or the retail version shows up stronger.


----------



## Super XP (Oct 15, 2011)

Well Said......
http://www.rage3d.com/reviews/cpu/amd_fx_8150/index.php?p=10


> Verdict? It's tempting to be disappointed and doom'n'gloom about FX's performance. The clean kill and win isn't here against Intel's top Sandy Bridge. But, AMD hasn't priced it against the 2600K, it's head to head with the 2500K, and in that context AMD's Scorpius platform is a solid buy. AMD's FX processor is enough to challenge Intel in a lot of areas, but AMD still has work to do on improving performance while they live in an x86 and low-thread count integer based workload world. The first generation Bulldozer's have slimmed down the 'big cores' as part of their ongoing strategy for shifting to APUs, and this has obvious consequences for performance. Despite that, there is a very real and impressive boost in performance for current real world applications vs. the previous generation. Well done, AMD.


----------



## Inceptor (Oct 15, 2011)

John Doe said:


> As somebody who has owned both an i7 870 and a X4 975, I have not seen a big difference between the two. But if you were to talk about pure performance; then C2Q, i5, i7, Sandy they all beat the Phenom 2, and changes are most of them will beat BD, whether both chips are OCed or not.



Those two sentences contradict each other.
Anyway, I'm not a rabid fan of either company.  Here's what I based my comments on:
http://www.anandtech.com/bench/Product/49?vs=80
Choose a Phenom II x4 965 or higher vs a C2Q Q6600 or higher.  Those are all stock speeds, which is what I was referencing.  I never said anything about overclock vs overclock.  And I never said anything about clock for clock, since stock speeds are different for Intel and AMD.
And I never reference game benchmarks.


----------



## Neuromancer (Oct 15, 2011)

Super XP said:


> Bulldozer just needs the right benchmarks to show off its real performance. That said, AMD needs to release a revision ASAP to ensure it does what it was meant to do, perform like a Bulldozer. Hopefully they can iron out its issues and get back into the game.




A large, heavy, gas-guzzling machine that sucks at digging holes but can tear down hills no problem.

Bulldozer is a perfect name for this chip.

The power consumption on this chip is absurd; if it contained a 6500-series GPU, I could see it needing as much power as it does. 

That said, I see a lot of people saying it eats multithreaded apps up, and that's not entirely true either. 

Consumer level multithreaded goodness...







I thought that Handbrake only uses up to 8 threads, although I never had a 12-thread CPU to test that out for myself.  And the incredible hexacore times from the 990X show that it probably was using all 12.


However, it should be noted that AMD announced (subtly) a LONG time ago that Bulldozer would have reduced per-core/per-clock performance compared to Deneb/Thuban/Zosma.

What they actually released was a multithreaded testing scenario that showed the BD chips performing (making up the numbers) like 80% faster than a four-core Deneb. They marketed it as a kudo, but the math showed lower performance per core.
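That per-core math is easy to redo. Using the post's made-up 80% figure (not a measured result), eight BD cores beating four Deneb cores by 80% overall still means each BD core is slower:

```python
# Hypothetical totals: Deneb X4 = 1.00, BD FX-8 = 1.80 (the post's
# invented 80% multithreaded lead, not a benchmark).
deneb_cores, bd_cores = 4, 8
deneb_total, bd_total = 1.00, 1.80

deneb_per_core = deneb_total / deneb_cores   # 0.250
bd_per_core = bd_total / bd_cores            # 0.225

# Each BD "core" delivers ~90% of a Deneb core's throughput here
print(round(bd_per_core / deneb_per_core, 2))
```

So the marketing slide and the "lower per-core performance" reading were both describing the same numbers.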


----------



## Inceptor (Oct 15, 2011)

billcat479 said:


> If people buy a computer to last 2 years and most of them do then AMD makes a lot of sense MAYBE. The maybe part is the hard one to buy into unless you know how it will work with optimized software for it's design. All cpu's depend on this aspect to do their best.
> So do you think the folks at AMD are that stupid?
> ...
> Most computer users don't hack up there computers every 6 months, only the benchmark led fanatics do that an it's a very small percent of computer owners that are in this category.
> ...



Very good points, and a perspective to which nearly everyone who posts on this forum is completely blind.
Most people hold onto their systems for 4 or 5 years, even some of the people who post on this forum.  In some cases, the systems are even older.  On that time scale, the 'original' BD would turn out to be acceptable even years down the line, just as the Core 2 Quads and Phenom IIs have been acceptable to many of the people in this forum, years after supposed obsolescence.
Don't think only about gaming and benchmarking, people, because neither AMD nor Intel is thinking about you as their main moneymaker and customer.  They may allocate some marketing dollars toward you, but you don't pay their bills.


----------



## thebluebumblebee (Oct 15, 2011)

I feel that expecting great things from BD in multi-threaded apps is some kind of hype.  Folding@Home is a multi-threaded app, and the BD only does about the same as the 2500K!  Linky


----------



## twilyth (Oct 15, 2011)

thebluebumblebee said:


> I feel that expecting great things from BD in multi-threaded apps is some kind of hype.  Folding@Home is a multi-threaded app, and the BD only does about the same as the 2500K!  Linky



Ah, but the catch is _integer_ apps.


----------



## Steevo (Oct 15, 2011)

hat said:


> But BD isn't a server chip, it's a desktop chip. AMD has had server chips with loads of cores before BD, like the "Magny-Cours" chip. I think this is a moot point for the majority of users, since most software still isn't highly compatible with multithreading yet.



On multi-user-account servers you don't need the software to be multithreaded; you allocate resources to each terminal user. More cores or threads effectively means more users per server. 

This is the only place where an OPTERON, not Bulldozer, is a win. 

Companies constrained by server space are using two- and four-way boards, not single-socket. Adding two more under-performing cores that suck up power is not the option they want.


----------



## repman244 (Oct 15, 2011)

Steevo said:


> Adding two more under-performing cores that suck up power is not the option they want.



Now that you mention it, can anyone imagine what the power consumption of an 8-module Opteron is? :shadedshu 

The power consumption could really be an issue (and a big one!); they really need to lower it if they plan on putting a GPU in there...


----------



## Zubasa (Oct 15, 2011)

repman244 said:


> Now that you mention that, does anyone imagine what the power consumption is of an 8 module Opteron is :shadedshu
> 
> The power consumption could really be an issue (and a big one!), they really need to lower it if they plan on putting a GPU in there...


Actually, there already is Interlagos, which is an 8-module, 16-core Bulldozer.
But no, it does not suck up a ton of power, because Opterons are made for parallel processing power and are thus clocked much lower than Zambezi.
For example, the Opteron 6276 is clocked at only 2.3GHz.


----------



## repman244 (Oct 15, 2011)

Zubasa said:


> Actually there already is the Interlagos which is an 8-module 16-core Bulldozer.
> But no it does not suck up a ton of power because Opterons are made for parallel processing power thus are clocked much slower than the Zambezi.



I know it exists.

It's not so much about the clocks; the whole chip is inefficient, and you can't take that away. I bet that per core it still consumes more power than Magny.
And AFAIK the BD-based Opterons also have the aggressive turbo and higher clocks than Magny.


----------



## Zubasa (Oct 15, 2011)

repman244 said:


> I know there are.
> 
> It's not so much about the clocks, the whole chips is insufficient, you can't take that away. I bet that per core it still consumes more power than Magny.
> And AFAIK the BD based Opterons also have the agressive turbo and have higher clocks than Magny.


Actually, I am pretty sure that Interlagos is much more efficient than you believe it is.
The reason Zambezi uses so much power is that it is clocked as high as it could be (within TDP) to get some single-threaded performance.

Edit: BTW, the 45nm Magny-Cours chips (Opteron 6100s) are actually clocked higher than the 32nm Interlagos.


----------



## Bo$$ (Oct 15, 2011)

Altered said:


> Im not sure you added in the added cost of Windows 8 to perform properly.
> 
> And this is where the BD issues arises. I dont think buying anything be it a car, boat, TV, or PC hardware to only get partial use of it that day you take it home is what any consumer considers a deal. And I sure dont see anyway its justifiable for any company to ask you too pay, at top dollar, for something that is performing below the other offerings in its price range on the hope something from an entire different company may allow it to run as it should.  I can see enthusiast doing this but AMD or Intel do not stay in business on that slim % of buyers.
> 
> Hopefully a revision or the retail shows up stronger.



Well, it is already no slouch in games; it keeps up with the similarly priced 2500K and outperforms it in certain apps.


----------



## FordGT90Concept (Oct 15, 2011)

Bulldozer was intended to run at 4GHz+, but because of the issues at GlobalFoundries, they can't get that high.  The low model number (8150) suggests that they anticipate getting the problem fixed, leaving headroom for higher-clocked processors down the road.

It is what it is, but more will eventually come.  The question is whether the problems will be resolved before Ivy Bridge.  Ivy Bridge will raise the bar another 10-20%, and I'm not so sure fixing the foundry issues will afford that much of a gain.  Raising the clock speeds will only be able to net them about a 25% increase max from where they are now.
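The 25% figure is simple arithmetic, going from the FX-8150's 3.6GHz base to a roughly 4.5GHz fixed-process target (the 4.5GHz number is my assumption for illustration), and even that assumes performance scales 1:1 with clock:

```python
current_clock = 3.6   # GHz, FX-8150 base clock
target_clock = 4.5    # GHz, assumed fixed-process target (illustrative)

# Best-case uplift if performance scaled perfectly with frequency
headroom = target_clock / current_clock - 1
print(f"{headroom:.0%}")
```

In practice memory and cache bottlenecks mean the real gain from clocks alone would land below that ceiling.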


----------



## Zubasa (Oct 15, 2011)

FordGT90Concept said:


> Bulldozer was intended to run at 4 GHz+ but because of the issues at Global Foundries, they can't get that high.  Their low model number (8150) suggests that they anticipate getting the problem fixed so they'll have headroom for higher clocked processors down the road.
> 
> It is what it is but more will eventually come.  The question is if the problems will be resolved before Ivy Bridge.  Ivy Bridge will raise the bar another 10-20% and I'm not so sure fixing the foundry issues will afford that much of a gain.  Raising the clockspeeds will only be able to net them about a 25% increase max from where they are now.


Here is the problem:
Performance per watt goes down the shitter as they push the clock speeds higher and higher.
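That follows from the standard dynamic-power relation: switching power scales roughly with C·V²·f, and reaching higher frequencies usually means raising the core voltage too, so power grows faster than performance. A toy sketch (the frequency/voltage pairs below are invented for illustration, not measured FX figures):

```python
# Dynamic CPU power ~ C * V^2 * f. Performance scales (at best) with f,
# so performance per watt falls as voltage rises with frequency.
points = [          # (frequency in GHz, core voltage in V), invented pairs
    (3.6, 1.25),
    (4.2, 1.40),
    (4.8, 1.55),
]
for f, v in points:
    rel_power = v ** 2 * f          # capacitance treated as a constant 1
    perf_per_watt = f / rel_power   # simplifies to 1 / V^2
    print(f"{f} GHz @ {v} V -> perf/W = {perf_per_watt:.3f}")
```

Each step up the curve buys clock speed at a worse perf/W ratio, which is also why server parts like Interlagos sit at low clocks.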


----------



## repman244 (Oct 15, 2011)

Zubasa said:


> Actually I am pretty sure that the Interlagos is much more efficient that you believe it is.
> The reason that the Zambezi uses so much power is because it is clocked as high as it could (within TDP) to get some single threaded performance.



I know that it has to be as efficient as possible.
But if we can believe the Opteron's "leaked" clock speeds: http://www.xbitlabs.com/news/cpu/di...ck_Speeds_of_AMD_Opteron_Bulldozer_Chips.html
You can see that they are really pushing it.
Also, I wonder how the performance is if you run multiple VMs on BD (let's assume you assign 1 core to each VM); there could be a performance hit due to resource sharing, like we see with Zambezi. I can't find any info about VM performance anywhere...
It's really hit or miss with performance. We also must not forget that, core for core, BD-based Opterons are slower than Magny Opterons (or, if we compare a BD Zambezi core with a Thuban/Deneb core at the same clock speed).

And I know for a fact that even with servers, nothing is always parallel...

I really hope that AMD can improve BD ASAP since with this kind of performance it's not really looking good, and I want AMD to stay competitive.


----------



## Zubasa (Oct 15, 2011)

Interlagos uses Turbo Core 2.0, which is nothing like the version on Thuban, so I don't really see that they are "pushing" the clock speed all that much in terms of sacrificing efficiency.
The Magny is basically the same chip as the Thuban (two in an MCM); the only reason they didn't use Turbo Core on it is most likely because of how TC 1.0 increases the voltage on the entire chip (even the cores at idle).
Using Turbo Core 1.0 on a server CPU would basically be a disaster in terms of power efficiency.


----------



## repman244 (Oct 15, 2011)

Zubasa said:


> The Interlagos uses Turbo Core 2.0 which is nothing like the one on the Thuban, so I don't really see that thet are "pushing" the clock speed all that much in terms sacrificing efficiency.
> The Magny is basically the same chip as the Thuban (2 in MCM), they only reason they didn't use Turbo Core on it is most likely because of how TC 1.0 increases the voltage on the entire chip (even the cores in idle).



If you look at the slide you will see a +500MHz turbo across all cores, with stock at 2.3GHz. That is quite a high turbo for a server chip...

You cannot deny that the power consumption for the % of performance increase (and sometimes even a decrease) is actually horrible. I just hope it's GF having issues with their chips, and that the power requirements will be improved with a new stepping. 

Turbo Core 1.0 wasn't really that good; like you said, it had the problem of not being able to control individual cores' voltages (plus it really depended on the board and the BIOS; it was all over the place). And you don't want to implement not-so-polished features in a server chip.

I guess we will need to wait for an Interlagos benchmark run. But IMO it won't shine all that much.


----------



## Zubasa (Oct 15, 2011)

repman244 said:


> If you look at the slide you will see a +500MHz turbo across all cores, stock is 2.3GHz. That is quite a high turbo for a server chip...
> 
> You cannot deny the fact that the power consumption for the % of the performance increase (and sometimes even a decrease) is actually horrible. I just hope it's the GF having issues with there chips and that the power requirements will be improved with new stepping.
> 
> ...


Indeed, that will depend on how well Interlagos performs in practice.
On the other hand, Bulldozer does have some aggressive power-saving features to help with its power consumption.

If this site is to be trusted:
http://www.cpu-world.com/CPUs/Bulldozer/AMD-Opteron 6276.html
Interlagos will have a significantly lower TDP than Magny-Cours
(115W TDP / 85W ACP vs 140W TDP / 105W ACP).

Also, the fact that AMD stated Interlagos will share the same platform as Magny-Cours means they should be within a similar power envelope.


----------



## repman244 (Oct 15, 2011)

Zubasa said:


> Indeed, that will depend on how well Interlagos performs in practice.
> On the other hand, Bulldozer does have some aggressive power-saving features to help with its power consumption.
> 
> If this site is to be trusted:
> ...



Yes, that is correct; the G34 socket does have its limits when it comes to power delivery.

Don't get me wrong, though; I'm not saying that BD is a total failure on all fronts. It is actually a really good platform for the future and something AMD can build upon.
It's just that it's not consistently good (it's a beast in some cases, but in others it falls behind), and AMD needs a consistently good chip; it doesn't have to be the fastest CPU out there. I hope they polish out the problems with Piledriver and a better process from GF.


----------



## FordGT90Concept (Oct 16, 2011)

Zubasa said:


> Here is the problem.
> Performance per Watt goes down the shitter as they increase the clock speeds higher and higher.


They might be running at higher volts than intended because of the fab issues.  Higher voltage increases stability.


----------



## Steevo (Oct 16, 2011)

FordGT90Concept said:


> They might be running at higher volts than intended because of the fab issues.  Higher voltage increases stability.



I can't even pretend to care about their issues after a three-year wait for this release. 

I know they have had a hard time trying to come up with the next big thing, but they failed, and not just at one part. This processor missed everything except an option that is only used 10% of the time.

Power consumption? Fail
IPC? Fail
Clocks/Overclocks? Fail
Cost? Fail


The only market this is good for is servers, and not much else, considering a dual-socket board with two four-core prior-generation Opterons offers more performance for a lower price.


----------



## kid41212003 (Oct 16, 2011)

billcat479 said:


> AMD has put some head scratching cpu's out when there was not any present need for what they did. It started with the the 64 bit cpu when 64bit home OS software that wasn't out yet but is now. They just kept the 32bit part working well till it started to come together. Started adding more core's to their designs.
> 
> Kind of like making a product to influence the future. Intel said what and why bother and then copied AMD while still saying why bother until their duel sort of cores came out. Funny.
> 
> AMD's timing really sucks..



Agree.


----------



## Champ (Oct 16, 2011)

A bit off topic, guys, but is the Thuban, as I believe it's called, a true 6-core chip?


----------



## twilyth (Oct 16, 2011)

Champ said:


> a bit off topic guys, but is the Thuban, I believe it called, a true 6 core chip?



Yup.  All on one die.  No MCM.


----------



## Champ (Oct 16, 2011)

That's good.  I'm one of those people who builds a gaming rig and keeps it forever, maybe upgrading GPUs, until stuff starts to fail.  My last chip was an X2 5000+ BE.  I may get a Thuban, and it should keep me happy for years to come.


----------



## Steevo (Oct 16, 2011)

I just bought one to carry me over till next year, when I upgrade my system again.

$189 for an 1100T at the Egg was great. I can't wait to get it going and see what sort of clocks I can get.


----------



## Neuromancer (Oct 16, 2011)

I don't get why AMD would go MCM after they criticized Core 2 for not being a "true quad core".

Thubans will make you smile; even sitting next to my Sandy Bridge PCs, I still like using the AMD system better (better desktop feel).


----------



## Super XP (Oct 16, 2011)

Zubasa said:


> Indeed that will depend on how well Intelagos performs in practice.
> On the other hand, Bulldozer do have some aggressive power saving features to help with its power consumption.
> 
> *If this site is to be trusted:*
> ...


Yes, they can be trusted; they were one of the first to break the Bulldozer story, along with several new Intel CPUs. It's a good site.


Neuromancer said:


> I dont get why AMD would go MCM after they criticized core2 for not making "true quad cores"
> 
> Thubans will make you smile, even sitting next to my sandyBridge PCs, I still like using the AMD system better (better desktop feel).


Truthfully, it's called pure innovation. AMD's 8-core CPU is like a 4-core but with something akin to Hyper-Threading. AMD didn't want to do Hyper-Threading itself, so they chose to do what they did with Bulldozer. I say they've pushed innovation quite strongly with this CPU; now all they need to do is learn from it and make it more efficient and faster.


----------



## John Doe (Oct 16, 2011)

Inceptor said:


> Those two sentences contradict each other.
> Anyway, I'm not a rabid fan of either company.  Here's what I based my comments on:
> http://www.anandtech.com/bench/Product/49?vs=80
> Choose a Phenom II x4 965 or higher vs a C2Q Q6600 or higher.  Those are all stock speeds, which is what I was referencing.  I never said anything about overclock vs overclock.  And I never said anything about clock for clock, since stock speeds are different for Intel and AMD.
> And I never reference game benchmarks.



Tell me how they contradict each other. What you do not see is that a Yorkfield or even a Deneb beyond 4GHz (my X4 975 was at 4.3) is sufficient for most games. From there, a faster CPU does not make a world of difference. Instead, you should spend the extra on a stronger GPU. And yes, you do sound like a fanboy; that is how you sounded on the first page. With that aside, you are changing the subject from games. Further, a Q6600 does not compare to a Deneb. A Q9550/Q9650 etc. are all faster compared to their AMD equivalents, both at stock _and_ OC'ed. See the HWC link; the Phenom loses either way.

Anyway, Mustang against a Silverado... 

http://www.youtube.com/watch?v=mJmum0N4Mxc&feature=related


----------



## de.das.dude (Oct 16, 2011)

Neuromancer said:


> I dont get why AMD would go MCM after they criticized core2 for not making "true quad cores"
> 
> Thubans will make you smile, even sitting next to my sandyBridge PCs, I still like using the AMD system better (better desktop feel).



Exactly!! Even I don't know why, but Intels don't feel as fast in everyday applications.


----------



## John Doe (Oct 16, 2011)

de.das.dude said:


> exactly!! even i dont know why but intels dont feel as fast in everyday application.



It is a placebo effect, like how lowering your memory timings makes you think it is faster. At the end of the day, the end result is the same no matter the instructions. The only difference between different chips is per-MHz performance. It is still the same image on your screen. BTW, I found a Silverado a few months ago. A 350, black with red stripes. Wish I had use for a pickup, though; they don't make those in hardtop.


----------



## de.das.dude (Oct 16, 2011)

John Doe said:


> It is a placebo effect. Like how lowering your memory timings make you think it is faster. At the end of the day, the end result is the same no matter your instructions. The only difference between different chips is per-Mhz performance. It is still the same image on your screen. BTW, I found a Silverado a few months ago. 350. black with red stripes. Wish I had use for a pick-up though, they don't make those in hardtop.



Nope. It's different all right.


----------



## John Doe (Oct 16, 2011)

de.das.dude said:


> nope. its different all right.



Well, IDK. Have a look at Windows benches then, like WinZip or PCMark. You can't tell exact performance without numbers; it's like trying to guess how long an obstacle is just by looking at it. There is no way to tell which one is smoother without considering other factors.


----------



## Zubasa (Oct 16, 2011)

Steevo said:


> The only market this is good for is servers, and not much else considering a dual socket with two four core prior generation Opterons offers more performance for less price.


Are you sure?


----------



## John Doe (Oct 16, 2011)

Zubasa said:


> Are you sure?



A pair of Harpertown chips beats a Nehalem in MaxxPi. On top of that, server environments are even more multithreaded; parallelism is the key. You have thousands of I/O operations every second, and the more cores you have, the better things get. But the older platform pulls more power, so single-CPU racks would be easier to cool; the power and cooling savings could make up for the price difference. Depends on the scale of your environment.


----------



## Steevo (Oct 16, 2011)

Zubasa said:


> Are you sure?





John Doe said:


> A pair of Harpertown chips beat a Nehalem in MaxxPi. On top of it, server environments are even more multi threaded. Parallelism is the key. You have thousands of Input/Outputs in every second. The more cores you have, the better things would get. But the older platform pulls more power so it would be easier to cool single CPU racks. It could make up for the price difference from power and cooling. Depends on the scale of your environment.



1 of ASUS KCMA-D8 ATX Server Motherboard Dual Socket C3...

 2 of AMD Opteron 4122 Lisbon 2.2GHz 4 x 512KB L2 Cache ...

What do you get? 8 threads on real cores, 12MB total of L3 cache, insane amounts of memory, and you still have a PCI-e slot; and the chips will run cooler, since there are two of them with more total surface area.


----------



## Super XP (Oct 16, 2011)

Steevo said:


> 1 of ASUS KCMA-D8 ATX Server Motherboard Dual Socket C3...
> 
> 2 of AMD Opteron 4122 Lisbon 2.2GHz 4 x 512KB L2 Cache ...
> 
> What you get? 8 threads on real cores, 12MB total of L3 cache, insane amounts of memory, and still have a PCI-e slot, and the chips will run cooler as there are two of them running with more surface area.


Wouldn't this be a better CPU?
http://www.newegg.com/Product/Product.aspx?Item=N82E16819105276


----------



## Neuromancer (Oct 16, 2011)

de.das.dude said:


> nope. its different all right.



Yeah, I agree; I prefer my AMD systems to both my Sandy Bridge systems. It's that "desktop feel"; can't bench it, unfortunately, but lots of AMD users notice it.


----------



## v12dock (Oct 16, 2011)

Neuromancer said:


> Yeah I agree, I prefer my AMD systems to both my Sandy Bridge systems. Its that "desk top feel" cant bench it unfortunately, but lots of AMD users notice it.



I can vouch for this also


----------



## xenocide (Oct 17, 2011)

v12dock said:


> I can vouch for this also



And I can vouch for the opposite.  I've used several systems with Phenom IIs, and my SB setup feels a lot more responsive.  The Phenom IIs ran a tad better than my Q6600, but SB definitely felt smoother.  Opinion cannot be used as fact, sorry.


----------



## Super XP (Oct 17, 2011)

O.K., I am revisiting this issue with regard to how well Bulldozer would scale if you slap Tri-CrossfireX and/or SLI into the mix. Looking at the benchmarks, from my estimation, Bulldozer IMO could possibly have WON at least 7 out of 10 gaming benchmarks, due to the FACT that it was for some reason clocked about 500MHz slower. 

If the clocks were even, then clock for clock, Bulldozer would IMO have won in the benchmarked games listed below:
1) Lost Planet 2 (Very High Image Preset)
2) Unigine Heaven Benchmark (v2.5/DX11/Shaders High/Tess Normal)
3) Aliens vs. Predator (Very High Image Preset) 
(Right now they were even on this one, but at even clocks Bulldozer would have killed it big time.)
4) Tom Clancy's H.A.W.X. 2 (Very High Image Settings)
5) Mafia II
6) Metro 2033 (High Image Settings)
7) Dirt 3 (Very High Image Preset)
-----------------------
Total system power consumption at IDLE and LOAD:
Bulldozer = 136W / 723W
Core i7 2600K = 186W / 703W

So who's saying Bulldozer sucks back a lot of power? I believe it all depends on your overall system, as you can see from the numbers above.

The issue I have with this review is that Bulldozer is about 500MHz slower, yet the reviewer had the balls to call Bulldozer a disappointment 

http://www.tweaktown.com/articles/4353/amd_fx_8150_vs_intel_i7_2600k_crossfirex_hd_6970_x3_head_to_head/index3.html

Any issues Bulldozer has right now should be resolved with Piledriver (Q1 2012). I cannot wait for this; it's much-needed competition.


----------



## John Doe (Oct 17, 2011)

You have to consider the OC'ing headroom and operating temps of a Sandy chip against BD. You can OC SB up to 5GHz on a Zalman flower cooler... also, from the benches I saw (most of them were from crap sources, but yeah), there are times BD performs worse than even a Phenom. It's oriented toward multi-threading, not gaming. This has been hashed out endlessly; have a read back through the thread. New games care about clock-for-clock performance, and older ones are all about single-threaded performance. Those, by themselves, aren't what BD was designed for. I'd also much like to see AMD redesign the chip, but that's going to be hard at this rate. They can decrease leakage, though. GF is behind TSMC.


----------



## Inceptor (Oct 17, 2011)

John Doe said:


> And yes, you do sound like a fanboy.



Good lord 
The system I have is the first time I've ever bought AMD 

You know, I sat there thinking a few months ago; I was all ready to buy a 2500K rig with a 560 Ti or 570.  All components, since I hadn't had a desktop in years.  I buy good quality for the price, and it would have set me back $1700-1800.  I was all pumped about it too.  Then I thought to myself, "Am I really going to use it for a lot of gaming?  It's really tempting just to get it for the hardware, but can I justify it?"  So I started looking around: what did AMD have, and what did they have on the horizon?  First thing I noticed was the price difference; cheaper, which suited me fine.  Second thing I noticed was that CPU performance was lower than the high-end Intel CPUs.  Then I thought about what I was going to do with the system...
How much power does it consume?
How much power does it consume when overclocked? (I was going to tinker, after all)
What is the optimal balance between system performance and power consumption, for me?
Taking that into account, what is the most cost effective choice, regardless of performance?
Is there any novelty in it? (no fun going down a well trodden path, if you ask me)
Is there any upgrade path if I take up the choice of novelty?



I chose AMD, for the first time ever.
I knew I would not see 'ultimate' performance, but I would have an interesting experience trying to make it go faster.
I had no real knowledge of what BD performance would be, but I knew that it was likely to be in the 2500k range for the FX-8xxx, perhaps not at launch but with a revision or two.

It's likely my next rig will be Intel based, just for the contrast.


----------



## Inceptor (Oct 17, 2011)

Here's an interesting post from Overclock.net.  It gives some perspective.

http://www.overclock.net/amd-cpus/1141188-asus-crosshair-v-formula-board-may-11.html#post15299874


----------



## John Doe (Oct 17, 2011)

That seems like an uninformed fanboy post, TBH. If any company were to use multiple cores as a single, more powerful core, it'd be Intel. Intel has more resources than AMD; they always have. They're a bigger company to begin with. Add that games today barely scale past 4 cores. So, as said many times before, AMD's target is server environments. Intel, on the other hand, rebadges their quad/hex core chips for servers; they have the per-clock performance lead, though, so things are easy on the desktop side. I'm sure my older Westmere would trump any BD at 4.2. Not to mention it's the slowest of its line and is a server chip (dual QPI). And while doing this, it still pulls less power.


----------



## Inceptor (Oct 17, 2011)

Well, the observations concerning Intel's and AMD's resources are common knowledge.  
AMD's target being server environments... hmm, well, they might want to increase their minuscule market share there, but I can't see it being their target.  
Intel rejigs their core architecture for Xeons, and then in the next generation takes the Xeon socket and rejigs the current-gen core architecture for it to create their enthusiast high-end platform, a market segment in which they have no competition.  This is also common knowledge.

I have to say, I'm perplexed by your need to inflame argument where there is none.


----------



## John Doe (Oct 17, 2011)

Why? You quoted some random guy with no idea as a reliable source... honestly, what was up with those smilies? You certainly come off like an AMD fanboy. No offense, but it's obvious. And yes, AMD's target is servers. They had the lead with their quad-socket platforms, Istanbul chips were the first 6-cores, and so on. BD was designed with servers in mind. Hell, even Mr. Fruehe (AMD's server guy -- JF-AMD) was prancing around on this platform last year, saying how it would have solid single-thread performance. What happened? He lied, then it blew up.


----------



## alexsubri (Oct 17, 2011)

> In the worst case scenario, AMD’s FX launch is disappointing in that the chip doesn't trounce competing Intel devices in performance. Regardless, as the two chip giants battle, they continue to attempt to outdo each other, which benefits consumers.



best point in the article


----------



## hat (Oct 17, 2011)

All in all, it seems to me that AMD remains the cheaper option: slower, yet still quite fast. And hell, you can still overclock any AMD chip... I expect to see lots of us here at TPU buying lower-class BD chips and making them fly faster than the fastest stock BD chip... and the fastest stock Intel chip. That's still crazy performance.


----------



## Neuromancer (Oct 17, 2011)

Inceptor said:


> Here's an interesting post from Overclock.net.  It gives some perspective.
> 
> http://www.overclock.net/amd-cpus/1141188-asus-crosshair-v-formula-board-may-11.html#post15299874



Thanks, I edified that post, since Gigabyte counts paralleled phases even for VRMs.  ASUS's 8+2 = Gigabyte's 16+4.

I recently reviewed the 990FX UD5, and the Intersil chip it uses for VRM distribution is only a 6-phase unit run in parallel (i.e. high end).

Funny that the Chinese site calls it a 10-phase, but it's not; it's 6 phases paralleled, providing 5×2 phases for the CPU and 1×2 phases for the NB.

I get where they went wrong, the nomenclature is close, but Intersil does not make a 10-phase VRM.


----------



## Inceptor (Oct 17, 2011)

John Doe said:


> Why? You quoted some random guy with no idea as a reliable source...



I'm scratching my head here.
I quoted someone who put up an interesting perspective on the situation.  I made no claim about his veracity or expertise.  I quoted a comment...
You on the other hand are looking for an argument with someone, preferably someone who will take the exact opposite viewpoint to you.
Unfortunately, I can't give you that, because I agree with you, _in general_, about BD.
I'm sorry, but who cares how Opterons have done in the past? AMD does not have a large share of the server and workstation markets... I point you back to your statement regarding AMD vs. Intel resources.  I don't even need to reference anything to know I'm right, and no one else here needs those references either.
AMD wants server market share, yes.   Will they get much more of it? Maybe, if Interlagos turns out to be something special.  A whole lot more of it? Doubtful.

Anything an executive says, to media, or even much of what they say to shareholders, is marketing bull@#$@.  No point being bitter about that kind of stuff.  That's the way things are.

I think you missed out on the fanboy rage exchanges.  You should have been here a couple of weeks ago, you would have enjoyed yourself with them.

Let's see fanboy accusation list, what have people accused me of?
Intel fanboy? check.
AMD fanboy? check.
Next on the list NV and Radeon fanboy badges, then I think I'll try for mobo company fan badges...
This could be fun...


----------



## John Doe (Oct 17, 2011)

hat said:


> All in all it seems to me that AMD remains the cheaper, less yet still quite fast performance option. And hell, you can still overclock any AMD chip... I expect to see lots of us here at TPU buying lower class BD chips and making them fly faster than the fastest stock BD chip... and the fastest stock Intel chip. That's still crazy performance.



They don't unlock now, do they?  The Phenom X4 gave solid performance, especially in games, considering its price. Once AMD matures these chips, yeah. Until then, I doubt those on the higher-end side would pick BD over SB. Unless they're a fanboy, of course...


----------



## Neuromancer (Oct 17, 2011)

I agree that AMD rocks, but guys lets stop the animosity. (wow did I just say that?) I am having a hard time reading the facts from the love/hate relationship.

Here is my animosity:


You know what I think is stupid? SLI on AMD. F-bomb on nVidia and their crappy-ass products that can't even game on one screen and play HD video on another like my 6850 can. It's why my 580 GTX sits idle until I need to bench or review something.  Why?  Cuz that is ALL nVidia is good for. 

There you go, a hate message for you that is guaranteed to drop some bombs.  It happens to be fact too. But nVidia fans will argue that no gamer would have an HD video playing on a second monitor. (I play slow RTS/RPG-type games.)

DX11 breaks on both cards BTW... not sure why yet. Still, the 6870 is in my gamer and the 580 sits on the shelf (both 580s).

EDIT: the only reason I upgraded from the 6850 to the 6870 was the heatpiped heatsink; the 6850 is getting reused, don't you fret

EDIT EDIT: Sorry, got so caught up in this stuff.


----------



## John Doe (Oct 17, 2011)

Neuromancer said:


> You know what I think is stupid? SLI on AMD.



Uhm, yeah. That's why I had a pair of 480s on a Crosshair 2, yet it kicked S775 ass. Both the 780a and 980a (which is a rebadged 780) were good chips without the issues of S775 SLI systems. They were rare and made a good combo.


----------



## hat (Oct 17, 2011)

I would, because it would be cheaper and still hellishly fast, even at stock, and we can overclock any BD chip. The current 6-core BD model is $190, a bit cheaper than the $220 Intel wants for the 2500K, with two more cores to spare... making it a bit less helpless in heavily multithreaded apps, although it would still lag behind in most things. Then you can overclock the 6-core BD beyond stock 2500K performance... of course, then you could clock the 2500K to outclass the overclocked BD, but even though the 2500K is faster, the BD is still pretty damn fast, and cheaper, factoring in overall system cost.

The point I'm making is that AMD is slower and cheaper, although slower doesn't really matter because even the slower performance is still pretty fast. In short, nothing really changed... I pretty much expected BD to turn out like this.

I wonder why AMD hasn't put a similar feature to Intel's turbo boost in BD, though. This makes me wonder about the overclockability of BD...


----------



## Frick (Oct 17, 2011)

de.das.dude said:


> exactly!! even i dont know why but intels dont feel as fast in everyday application.



What the eternal heck are you guys on? This makes no sense, and it is not true.


----------



## John Doe (Oct 17, 2011)

Neuromancer said:


> My point was nvidia sucked and you just named another reason. 780 to 980 chipsets.
> 
> thanks.
> 
> I was actually only referring to nVidia GFX and their inability to multimon well but that is a great point, they sucked at chipsets too for the most part (NF4 rocked) oh and NF200 snicker snicker



So that makes them "suck"?  I had HD 4870 CrossFire and it was garbage. Such a driver nightmare. Before 2010 there were no application profiles, so CrossFire was a PITA. It usually didn't work, and even when it did, you had to force it by changing the application name, which kicked you out of Steam servers. Ever since the X1950 cards, ATi (AMD) has lacked either GPU power or drivers.


----------



## Neuromancer (Oct 17, 2011)

hat said:


> I would, because it would be cheaper and still hellishly fast, even at stock, but we can overclock any BD chip. The current 6 core BD model is $190, a bit cheaper than the $220 Intel wants for the 2500k, with two more cores to spare... making it a bit less helpless in heavily multithreaded apps, although it would still lag behind in most things. Then you can overclock the 6 core BD beyond stock 2500k performance... of course then you could clock the 2500k to outclass the overclocked BD, but even though the 2500k is faster, the BD is still pretty damn fast, and cheaper, factoring overall system cost in.
> 
> The point I'm making is AMD is slower and cheaper, although slower doesn't really matter because even the slower performance is still pretty fast. In short, nothing really changed... I pretty much expected BD to turn out like this.
> 
> I wonder why AMD hasn't put a similar feature to Intel's turbo boost in BD, though. This makes me wonder about the overclockability of BD...



Um, AMD did put a turbo boost equivalent in (Turbo Core). 

Also, a 6-core BD = a 3-core Phenom II with hyperthreading, but a better version of hyperthreading. 

I do agree that "slower" does not matter with more cores. Or slower overall, if BD has the same desktop feel Thuban and Zosma had. 


(I have not actually used Zosma, just guesstimating)

BD requires too much power and has a lower IPC than Thuban/Zosma. Extensions are great, and Intel lives on them, but they're generally not supported on AMD, let's face it. Where is the AMD multi-core optimizer download?

EDIT: Why talk about the 6 (lol) core Bulldozer when the 8 (lol) core only matches the 2500K Intel processor?


----------



## hat (Oct 17, 2011)

When I see "Multi-Core: Six-Core" on Newegg's spec list, I'm led to believe it actually has 6 cores... not that it's a 3-core with hyperthreading. Either someone's spewing bullshit about BD, or AMD's being really misleading with their spec sheets. Where can I read in detail about this supposed hyperthreading that's going on here? I don't see anything about any kind of turbo boost on Newegg's spec page for the BD chips either... you'd think whoever's in charge of that would be making damn sure to blow the marketing horn about having that feature.


----------



## Frick (Oct 17, 2011)

John Doe said:


> So that makes them "suck"?  I had HD 4870 CrossFire and it was garbage. Such a driver nightmare. Before 2010, there was no application profiles so CrossFire was a PITA. It usually didn't work, even when it did, you had to force it by changing the application name, which kicked you out of Steam servers. Ever since X1950 cards, ATi (AMD) either lacked on GPU power or drivers.



The only time they lacked in power was probably the HD 2xxx series. Drivers I can agree with, though. I'm willing to put up with them, as they have always offered the best price/performance ratio, and that's what I'm after.


----------



## seronx (Oct 17, 2011)

Opinions?

I LOL'ed hard when I saw this okay


----------



## laszlo (Oct 17, 2011)

AMD made a mistake by launching this chip too soon, as almost no software is capable of using all the cores and the newly added instructions. This chip will be obsolete by the time the majority of daily-used programs and new games are able to use 'dozer at full capacity. So I think it's a good product that arrived in the wrong market at the wrong time; they should have focused on quads and improved them, or created something the market demands. Right now they just show "we can do it, we launched the 1st 8-core desktop CPU", and... so what? The average Joe buyer will ask himself: can I use it fully? Nope. When can I use it? ...after 3-5 years. Then why invest in it? Just to have it?

Someone at AMD made a very bad marketing decision by forcing this chip out, and it will affect them.


----------



## scooper22 (Oct 17, 2011)

hat said:


> When I see on Newegg's spec list Multi-Core: Six-Core, I'm led to believe it actually has 6 cores... not that it's a 3 core with hyperthreading. Either someone's spewing bullshit about BD, or AMD's being really misleading with their spec sheets. Where can I read in detail about this supposed hyperthreading that's going on here? I don't see anything about any kind of turbo boost anything on Newegg's spec page for the BD chips either... you'd think whoever's in charge of that would be making damn sure that they're blowing their marketing horn about having that feature.



A BD FX-6xxx is:
- a 6-core for integer/FP128
- a 3-core for FP256
at the same time.

As long as FP256 ops are somewhat sparse and integer/FP128 ops make up the majority, I think it is fully legitimate to call it a six-core, since six fully independent threads can be processed. So 6 threads run in parallel (6T) on six cores (6C).

4C/8T Hyper-Threading (Intel only) is a FOUR-core where every core can hold two threads. These are, however, not processed at the same time. Therefore it's a four-core (4C) with eight threads (8T) where half of them are stalled at any given time. This is a true 4-core.


Schematics of one BD module (the FX-6xxx has three of these, the FX-8xxx four):
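In lieu of the diagram, here's a toy cycle-count model of the sharing (my own sketch, purely illustrative; the op streams and one-op-per-cycle timing are made up, not real hardware behaviour): the two integer cores in a module proceed independently, while FP256 ops from both threads queue for the single shared FPU.

```python
def module_cycles(thread_a, thread_b):
    """Toy model of one BD module. Each thread is a list of ops:
    'int' ops retire on that core's own integer pipeline, while
    'fp256' ops contend for the single shared 256-bit FPU."""
    # The two integer pipelines are independent, so integer work overlaps:
    int_cycles = max(
        sum(1 for op in thread_a if op == "int"),
        sum(1 for op in thread_b if op == "int"),
    )
    # FP256 ops from both threads serialize on the shared unit:
    fp_cycles = sum(1 for op in thread_a + thread_b if op == "fp256")
    return int_cycles + fp_cycles

# Mostly-integer workload: near-perfect module scaling
print(module_cycles(["int"] * 8, ["int"] * 8))      # 8 cycles, not 16
# FP256-heavy workload: the shared FPU is the bottleneck, no overlap
print(module_cycles(["fp256"] * 4, ["fp256"] * 4))  # 8 cycles
```

Which is why the integer/server benchmarks scale well per module while FP256-heavy code sees it behave like a 3-core.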


----------



## John Doe (Oct 17, 2011)

seronx said:


> Opinions?
> 
> I LOL'ed hard when I saw this okay



lol. That's PassMark, right? IIRC, they use some sort of browser tests and stuff to come up with that score... it's nonsense, not a real benchmark at all.


----------



## nt300 (Oct 17, 2011)

laszlo said:


> amd make a mistake by launching this chip too soon as almost no soft is capable to use all cores and the new added instruction;this chip will become obsolete till we'll have majority of daily used programs/new games, will be able to use dozer at full capacity so i think is a good product but arrived in the wrong market and time;they should focus on quads and improve them or create something what the market demand;now they just show "we can do it we launched the 1st 8 core desktop CPU" and... so what? not average joe buyer will ask himself :can i use it fully? nope;when can i use it? ...after 3-5 years...;than why to invest in it? just to have it?
> 
> someone at amd has taken a v.bad marketing decision by forcing this chip and this will affect them


AMD had no choice but to launch. It was the massive rumours that made Bulldozer seem like a killer CPU. 

I don't see why people don't put its current performance aside and look at the design as pure innovation on AMD's part. They did something unique and completely different, and hopefully in time this will pay dividends. 

With enough tweaking, cache latency fixes and the like, the upcoming Piledriver should look and feel like a monster in performance. Mark my words, Piledriver will be no slouch. Hopefully by then MS will have patched Win 7.


----------



## Super XP (Oct 18, 2011)

Interesting, though it does feel like beating a dead horse over and over again.    


> Hopefully Piledriver will be tweaked and fixed in time for an aggressive Q1 2012 release on the Socket AM3+ platform. I expect at least a 35% to 50% performance boost in multi-threaded performance and a 10% to 15% boost in single-threaded performance.
> 
> Bulldozer currently suffers from various issues including, but not limited to, branch prediction, pipeline flushing, cache thrashing, a decode unit that isn't wide enough, and so on. If and when AMD fixes these, Bulldozer should have the ability to bulldoze as it was meant to.
> 
> ...


----------



## AphexDreamer (Oct 18, 2011)

http://www.tweaktown.com/news/21168...x.html?utm_source=dlvr.it&utm_medium=facebook

BD to be Fixed with a B3 Revision.


----------



## erocker (Oct 18, 2011)

AphexDreamer said:


> http://www.tweaktown.com/news/21168...x.html?utm_source=dlvr.it&utm_medium=facebook
> 
> BD to be Fixed with a B3 Revision.



Nice, I'll be sure to get on AMD about giving me a B3 as a replacement to my "broken" B2.


----------



## Steevo (Oct 18, 2011)

AphexDreamer said:


> http://www.tweaktown.com/news/21168...x.html?utm_source=dlvr.it&utm_medium=facebook
> 
> BD to be Fixed with a B3 Revision.



Not at all true.

The rumor mill, based on a BIOS document for motherboard makers, shows a B3 revision, and the site assumes they fix the scheduling issues ("if they do"). The B3 could be lower-power parts due to fixed leakage, they could be lower-speed actual die-cut 3-module X6 parts, they could be a birthday cake in a box. 


Somehow you magically read "it will be fixed"?

What do you smoke so I can get some?

The flaw is in the design. AMD apparently never spent any real time on branch prediction, so they made up for it with a huge cache; that huge cache slowed the whole chip down and sucked up power, which drove the clocks even further down and forced AMD to put a thermal limit on the chips. 

It was and has been a horrible cascade of one failure leading to another in design, implementation, and finish. If they had started with a known-good CPU, asked the simple questions "what works, and why? what doesn't, and why?", and come up with the ten best answers, they would have a monster. But they didn't. They have a Netburst clone. 

K8, with higher IPC, die shrinks, microcode updates, and even having to run large in-place calculations as 2- or 4-clock procedures, would be faster for most users. 

Look at what Intel did for Core: back to the P III design, with the improvements they learned on the P4, plus better branch prediction. If AMD had done this instead of "real men use real cores", we would be looking at this from a whole different angle. Unfortunately, AMD got lost somewhere before they bought ATI and haven't been the same since.


----------



## AphexDreamer (Oct 18, 2011)

Steevo said:


> Not at all true.
> 
> The rumor mill, based on a BIOS paper for motherboard makers shows a B3 revision, and the site assumes they "(if the do)" fix the scheduling issues. The B3 could be lower power parts due to fixed leakage, they could be lower speed actual die cut 3 module X6 parts, they could be a birthday cake in a box.
> 
> ...



Don't get your panties in a bunch. I was merely posting the fact that a B3 revision will be coming out, and the article states that they hope it will provide better performance, hence the "fix". "Fixed" is a loose term here; I personally don't define what gets fixed, but surely the revision fixes something, else why do a revision? Both I and the article remain optimistic about the B3 revision, as it is still too soon to say anything about it with certainty. 

And no, I cannot provide you with illegal narcotics, sorry.


----------



## Dent1 (Oct 18, 2011)

^ 
This B3 stepping stuff is starting to seem more credible.




> After many were left disappointed by the performance of AMD's FX-Series processors based on the Bulldozer architecture, the chip maker is now working on a new revision of its CPUs that will carry the B3 stepping.
> 
> This information was uncovered by Hardcore-Hardware in a public AMD document entitled “BIOS and Kernel’s Developers Guide (BKDG) for AMD Family 15h Models 00h-0Fh Processors.”
> 
> ...




http://news.softpedia.com/news/AMD-Bulldozer-B3-Revision-Is-in-the-Works-228347.shtml


----------



## Steevo (Oct 18, 2011)

You guys want to buy an AMD Extreme Edition FX? Yeah, it still performs slower than the Intel chip running hundreds of MHz per core slower, but it's EXTREME!!!!!

The flaw is the design, the chip, the whole thing. 

Our only hope is that they learn something from it and perhaps move the best parts to a design that more closely reflects their K8 designs, with a good, fat, fast interconnect between cores.


----------



## Dent1 (Oct 19, 2011)

Steevo said:


> You guys want to buy a AMD Extreme Edition FX? Yeah it still performs slower than the Intel chip running hundreds of Mhz per core slower, but it EXTREME!!!!!



When you say "you guys" whom is it directed at specifically?




Steevo said:


> The flaw is the design, the chip, the whole thing.
> 
> Our only hope is the learn something from it and perhaps move the best parts to a design that more reflects their K8 designs, with a good fat fast interconnect between cores.



Or software developers can stop being lazy and start writing code to utilise multi core processors in general.
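The kind of change I mean is often not even hard. A minimal sketch (my own toy example, with a made-up CPU-bound task; nothing here comes from any real app) of fanning work out across every core with Python's standard library:

```python
from multiprocessing import Pool
import os

def heavy(n):
    # stand-in for a CPU-bound task
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [200_000] * 8
    # serial: one core grinds through everything
    serial = [heavy(n) for n in jobs]
    # parallel: the same work spread across all available cores
    with Pool(processes=os.cpu_count()) as pool:
        parallel = pool.map(heavy, jobs)
    assert serial == parallel  # same results, all cores busy
```

Independent work items like these are exactly what an 8-thread chip is waiting for.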


----------



## Super XP (Oct 19, 2011)

Dent1 said:


> When you say "you guys" whom is it directed at specifically?
> 
> Or software developers can stop being lazy and start writing code to utilise multi core processors in general.


The issue here is that some people can't give credit where credit is due. Bulldozer may not have been what people expected, but it's in no way a bad design. I've said this once, and I will say it again: AMD has balls of iron for creating something completely new and innovative. Now they need to learn from it and make Piledriver better. 

It would also help if developers would start writing code for multi-core that would benefit both AMD and Intel.


----------



## bpgt64 (Oct 19, 2011)

This community represents the 5% of the computer-buying world that cares about who takes the performance crown.  Whenever I build a new system for work, it's AMD-based: more cores, more threads, more virtual machines.  Having access to a Microcenter and their insane AMD deals helps as well.


----------



## Steevo (Oct 19, 2011)

Dent1 said:


> When you say "you guys" whom is it directed at specifically?
> 
> 
> 
> ...



Anyone who wants one because they are convinced, despite benchmarks and real-world testing, that it is somehow better than a higher-clocking, higher-IPC, and overall faster 2600.


So much like calling out "bastard" in a crowded bar, I would assume only people who know deep down they are bastards will be offended or answer.

Credit due for a 3+ year bad design. 


Would you also like to buy a Yugo, that looks good on paper, but runs like shit?


----------



## Deleted member 67555 (Oct 19, 2011)

+1000 for Microcenter and AMD deals... They make them so hard to pass up...


----------



## lilhasselhoffer (Oct 19, 2011)

Super XP said:


> The issue here is some people can't credit where credit is due. Bulldozer may have not been what people expected, but it's in no way a bad design. I've said this once, and I will say it again, AMD has Balls of Iron for creating something completely new and innovative. Now they need to learn from it and make Piledriver better.
> 
> It would also help if Developers would start writing code or Multi-Cores that would benefit both AMD and Intel.



A quick look at all of your other posts leads me to believe you would defend AMD if their next chip was a Dorito.  "It's a bold new flavorful processing option."

Hyperbole aside, balls of steel and a head of iron are two separate things.  I give props to AMD for moving away from the tried-and-true CPU designs.  All of those props are negated by one simple fact: performance does not come directly from changing everything.

If Bulldozer were a step in the right direction, this thread would be drooling fanboys and Intel hold-outs trying to say their processors were still awesome.  What we get is Intel being completely unfazed, because BD is a shambling step in the wrong direction.

Before you say it, I concede that the new architecture has the possibility of success.  This all hinges on an enormous amount of work that should have been done in the five years it's taken to develop BD.  


Blame MS, they haven't done enough to push a 64-bit OS.  Blame software writers, they haven't written 64-bit programs or included multi-thread support.  Blame everyone else but AMD for a half-baked CPU architecture that isn't what the consumer market is demanding.  Just no.

Consumers drive software.  Software drives OS development.  OSs drive hardware.  If you want to make BD a good option, then refuse to buy software that isn't multithreaded and OSs that aren't 64-bit.  I ask you this: what are you left with?  No Apple programs, no older games, and minuscule media player choices.  Bemoaning the system that consumers define is like blaming a parrot for repeating what it hears.  Zero change comes from moaning.


As I have stated before, wake me when BD becomes relevant.  Until AMD can extract sensible performance from their little experiment, I'm going to state the obvious: BD has been an experiment that did not go as expected, and it has drawbacks that are unacceptable.


----------



## CDdude55 (Oct 19, 2011)

lilhasselhoffer said:


> a quick look at all of your other posts leads me to believe you would defend amd if their next chip was a dorito.  "it's a bold new flavorful processing option."
> 
> hyperbole aside, balls of steel and a head of iron are two separate things.  I give props to amd for moving away from the tried and true cpu designs.  All of these props are negated by on simple fact.  Performance does not come directly from changing everything.
> 
> ...



+1000


----------



## Dent1 (Oct 19, 2011)

Steevo said:


> Anyone who wants one as they are convinced, despite benchmarks and real world testing, that it is somehow better than a higher clocking, higher IPC, and overall faster 2600.



Very few people are saying the Bulldozer is faster overall than the i7 2600K.

From what I've read, most people are saying the Bulldozer is faster than or as fast as the i7 2600K in multithreaded environments, and this is correct.




Steevo said:


> Credit due for a 3+ year bad design.



The design isn't bad. In theory, the design should have yielded better performance than the i7 overall. Obviously that didn't happen, so the shortfall was probably attributable to other factors, whether hardware or software (slow cache? latency? scheduler issues? OS issues? sloppy multithreaded apps? who knows!)



Steevo said:


> Would you also like to buy a Yugo, that looks good on paper, but runs like shit?



What has that question got to do with the article that claims the arrival of the B3 stepping?


----------



## Steevo (Oct 19, 2011)

There is no mention of any changes for B3; again, it might be that a Doritos chip gets put inside each one to bake, since they run so hot. You might get a free handjob with it to make you feel better.


There is no mention of any "fix" from AMD; it is pure assumption from TweakTown. Assumptive rumors are like diarrhea of the mouth. So TweakTown spewed shit out of their mouth; are you sure you want to eat and repeat that?


http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/8

A 3.4GHz (200MHz per core SLOWER) 2600 ruins its shit.

http://www.overclockersclub.com/reviews/amd_fx8150/5.htm

No difference.

http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-17.html

Ohes no, a trend is emerging.........

There are a select few places it gets a higher "score", unfortunately two of those are power consumption, and heat output.


----------



## Dent1 (Oct 19, 2011)

Steevo said:


> There is no mention of any "fix" from AMD, it is pure assumption from TweakTown. Assumptive rumors are like diarrhea of the mouth. So TweakTown spewed shit out of their mouth, are you sure you want to eat and repeat that?




Nobody is talking about a fix. I was merely confirming that a B3 stepping might be on the way, according to sources. 

I did not claim any performance enhancements, or whether it was true.  I was just posting a link I found.




Steevo said:


> http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/8
> A 3.4Ghz ( 200Mhz per core SLOWER) 2600 ruins its shit.
> http://www.overclockersclub.com/reviews/amd_fx8150/5.htm
> No difference.
> http://www.tomshardware.com/reviews/fx-8150-zambezi-bulldozer-990fx,3043-17.html



What has any of this got to do with the rumoured B3 stepping?


----------



## Steevo (Oct 19, 2011)

Dent1 said:


> Nobody is talking about a fix. I was just merely comfirming that B3 stepping might be on the way according to sources.
> 
> I did not state any performance enhancements or whether it was true.  I was just posting a link I found.



Notice how I didn't quote you, but just asked an open question?

And yes, people were talking about it being a "fix".

Nothing wrong with your link, and nothing wrong with the discussion of what it COULD mean. 


Your idea about the design being fine is wrong, though. That is the point of a design: to get expected, controlled results. If you relied on AMD to design and build a bridge with this same outcome, no one would drive on it.



Dent1 said:


> Very few people are saying the Bulldozer is faster overall than the i7 2600K.
> 
> From what I've read, most people are saying the Bulldozer is faster or as fast as the i7 2600K in multithreaded environment, and this is correct.
> 
> What has any of this got to do with the rumored B3 stepping.



It's still slower. And that is the point of this thread.


----------



## Dent1 (Oct 19, 2011)

Steevo said:


> Your idea about the design being fine is wrong though. That is the point of a design, to get expected controlled results. If you relied on AMD to design and build a bridge with this same outcome no one would drive on it.



I never said the design was fine. I said the design isn't bad. And performance _should_ have been good in theory.

Reading between the lines I'm praising the architecture but clearly not praising the performance.



Steevo said:


> Its still slower. And that is the point of this thread.



No, the original point of the thread was Bruce Gain from PCWorld concluding that single-threaded performance is poor and desktop consumers are disappointed! The author later blames benchmarks, saying most support only two or fewer cores. It's sad you didn't even READ the original post.




> Most of these tests are largely geared for CPUs with two or fewer cores. Software makers also have yet to bring to market applications that will take advantage of FX multi-core design for multi-threading tasks.



http://www.pcworld.com/businesscent...ulldozer_disappoints_why_thats_good_news.html


----------



## Damn_Smooth (Oct 19, 2011)

It was stupid of AMD to attempt a die shrink and an architecture change at the same time. I think it can be fixed though, and with Intel focusing more on graphics with Ivy, they can narrow the gap. Now it will be nice to watch if AMD catches up with CPU before Intel catches up with GPU or vice versa. The next few years will be interesting.


----------



## Steevo (Oct 19, 2011)

I believe the actual point is/was: we have millions of pieces of software, and AMD made a chip that only effectively runs 2% of it better than the competition. 

So, reading between the lines, the CPU sucks for everyone and everything but terminal servers, and even there it's the same or lower price to purchase an older board that supports two quad-core chips.

I'm not attacking you BTW. I am just having a conversation in my style.


----------



## Dent1 (Oct 19, 2011)

Steevo said:


> So, reading between the lines, the CPU sucks for everyone and everything but terminal servers, and even there its the same or lower price to purchase a older board that supports two quad core chips.



That may be true, but it has nothing to do with the PCWorld article or the B3 stepping. We've already covered that the AMD FX sucks for traditional desktop use in comparison to Intel's i5 & i7 offerings. You are not telling us anything useful, just rehashing what has been said/discussed already.




Steevo said:


> I believe what the actual point is/was we have millions of pieces of software. AMD made a chip that only effectively runs 2% of it better than the competition.



Where does it say that in the article? Quote it please 

The article clearly says most software benchmarked doesn't support its multi-core/multi-thread approach. I quoted mine, now quote yours.



> Most of these tests are largely geared for CPUs with two or fewer cores. Software makers also have yet to bring to market applications that will take advantage of FX multi-core design for multi-threading tasks.


----------



## eidairaman1 (Oct 19, 2011)

I wouldn't doubt that AMD reimplements Turbo Core, or whatever it is, in two steppings or in Piledriver



hat said:


> I would, because it would be cheaper and still hellishly fast, even at stock, but we can overclock any BD chip. The current 6 core BD model is $190, a bit cheaper than the $220 Intel wants for the 2500k, with two more cores to spare... making it a bit less helpless in heavily multithreaded apps, although it would still lag behind in most things. Then you can overclock the 6 core BD beyond stock 2500k performance... of course then you could clock the 2500k to outclass the overclocked BD, but even though the 2500k is faster, the BD is still pretty damn fast, and cheaper, factoring overall system cost in.
> 
> The point I'm making is AMD is slower and cheaper, although slower doesn't really matter because even the slower performance is still pretty fast. In short, nothing really changed... I pretty much expected BD to turn out like this.
> 
> I wonder why AMD hasn't put a similar feature to Intel's turbo boost in BD, though. This makes me wonder about the overclockability of BD...


----------



## Super XP (Oct 19, 2011)

lilhasselhoffer said:


> A quick look at all of your other posts leads me to believe you would defend AMD if their next chip was a Dorito.  "It's a bold new flavorful processing option."
> 
> Hyperbole aside, balls of steel and a head of iron are two separate things.  I give props to AMD for moving away from the tried and true CPU designs.  All of these props are negated by on simple fact.  Performance does not come directly from changing everything.
> 
> ...


We will agree to disagree. 
AMD took a chance with Bulldozer and it does not perform as well as the hype. 
I will say it again and again, AMD's got Balls of Iron for doing something completely different and forcing themselves to innovate. Despite its performance they deserve a break IMO, and hopefully they will learn from Bulldozer and make Piledriver better. Fair?  


Damn_Smooth said:


> It was stupid of AMD to attempt a die shrink and an architecture change at the same time. I think it can be fixed though, and with Intel focusing more on graphics with Ivy, they can narrow the gap. Now it will be nice to watch if AMD catches up with CPU before Intel catches up with GPU or vice versa. The next few years will be interesting.


The original 45nm Bulldozer was a fail, which is why they went to 32nm.


----------



## lilhasselhoffer (Oct 19, 2011)

Super XP said:


> We will agree to disagree.
> AMD took a chance with Bulldozer and it does not perform as well as the hype.
> I will say it again and again, AMD's got Balls of Iron for doing something completely different and forcing themselves to Innovate. Despite it's performance they deserve a break IMO, and hopefully they will learn from Bulldozer and make Piledriver better. Fair?



God, I hope so.  I would love to see AMD take a huge spiked plug and ram it into Intel's behind.  I was hoping, as you apparently still are, that AMD would give Intel a huge black eye.

Here's to hoping that Piledriver can pull out a win.  MS proved it was possible with Windows 7, and I really want AMD to do the same exact thing.


----------



## twilyth (Oct 19, 2011)

I think the bottom line on BD is that AMD is lucky to have survived after nearly choking on ATI.  When your very existence is in doubt, it tends to reorder your priorities.  I mean ultimately, somebody probably would have rescued them, but then again . . . .  And even then, what would they have looked like?  I doubt they would still be a company that could at least bat in the same league as Intel.

Another thing that I'm just going to toss out there since I know jack squat about chip design - had AMD not gone with an approach like this, what other options would they have had?  I mean, the basic architecture is pretty mature at this point isn't it?  Twenty years ago you could have dynamically addressable registers and some rudimentary branch prediction and you could walk around with the theme from Rocky playing as your own personal sound track.  All the low hanging fruit has been plucked.  So either you go with a hyperthreading approach which costs you in die real estate, or cut each core down to the bare minimum that can still handle 70-80% of the work most rigs see.  I mean what was the last really mind blowing chip design?  Everything that is being implemented today is shit that people have been talking about since computers had handcranks on the side and we called them adding machines.


----------



## eidairaman1 (Oct 19, 2011)

lilhasselhoffer said:


> God, I hope so.  I would love to see AMD take a huge spiked plug and ram it into Intel's behind.  I was, like you apparently still do, hoping AMD would give Intel a huge black eye.
> 
> Here's to hoping that Piledriver can pull out a win.  MS prooved it was possible with windows 7, and I really want AMD to do the same exact thing.



MS was literally suffering because of Vista. Most people wouldn't move to Vista, or would revert back to XP. MS tried forcing it down people's throats and their stock started going down. 7 fixed everything that was problematic with Vista. I still will never upgrade on top of a previous OS though.

We can only wonder what Piledriver brings to the table, since that CPU should be fully TSMC-fabbed; the GF fab is having teething problems. AMD needs to go back to what made K7 and K8 very powerful: bring back Athlon as the staple name with the Athlon FX as their top chip, and make the desktop and server sockets the same, supporting everything from APUs to server CPUs (APU function disabled in motherboards that do not have video output from the CPU or a traditional IGP solution, depending on 1-way to 8-way designs, like it was during the Athlon XP/MP era). Re-implement the dual-CPU platform that supports two Athlon FX chips (2-way capability, disabled in single-CPU boards, with high die and core temperature tolerance for overclocking) or Opteron CPUs (2- to 8-way capable).


----------



## Neuromancer (Oct 19, 2011)

Maybe AMD will do the same thing that MS did.

Put a new box on Bulldozer, drop the FX nomenclature and drop the price 

I never understood the Vista hate. I think most of it was based on the early-release pirate samples, which were BETA builds that still had the diagnostic driver function enabled and that really hampered performance. Granted, I started beta testing Vista back around build 4074 of Longhorn, but Vista is the first MS OS I had that lasted more than a year (7 has not even come close to that yet, surprisingly).


----------



## lilhasselhoffer (Oct 19, 2011)

eidairaman1 said:


> MS was literally suffering cuz of Vista, Most people wouldnt move to Vista or would revert back to XP. MS tried forcing it down peoples throats, their stocks started going down. 7 fixed everything that was problematic with Vista. I still will never upgrade ontop of a previous OS tho.
> 
> We can only wonder what Piledriver brings to the table since that CPU should be Fully TSMC Fabbed. GF Fab is having teething problems.



My point, in a slightly less eloquent structure.

MS took the underlying structure of Vista (analogous to the BD architecture) and pulled out a functional OS called Windows 7.  BD has, in my opinion, the same problems.  It is a good idea, but lacks performance where it counts.  Piledriver has the potential to use lessons from BD, and build something truly worthwhile.  

Hopefully, Piledriver will fix BD's errors.  This, ideally, will be exactly like MS and Windows 7.


----------



## Steevo (Oct 19, 2011)

twilyth said:


> I think the bottom line on BD is that AMD is lucky to have survived after nearly choking on ATI.  When your very existence is in doubt, it tends to reorder your priorities.  I mean ultimately, somebody probably would have rescued them, but then again . . . .  And even then, what would they have looked like?  I doubt they would still be a company that could at least bat in the same league as Intel.
> 
> Another thing that I'm just going to toss out there since I know jack squat about chip design - had AMD not gone with an approach like this, what other options would they have had?  I mean, the basic architecture is pretty mature at this point isn't it?  Twenty years ago you could have dynamically addressable registers and some rudimentary branch prediction and you could walk around with the theme from Rocky playing as your own personal sound track.  All the low hanging fruit has been plucked.  So either you go with a hyperthreading approach which costs you in die real estate, or cut each core down to the bare minimum that can still handle 70-80% of the work most rigs see.  I mean what was the last really mind blowing chip design?  Everything that is being implemented today is shit that people have been talking about since computers had handcranks on the side and we called them adding machines.



Each significant IPC increase, the 1GHz barrier, and now we are working towards 5GHz stock chips and OpenCL. x64 and APUs....

There really have been huge strides made in CPU performance and, more importantly, price. I remember it costing somewhere in the $1700 ballpark for a decent (for the time) Gateway my parents bought years and years ago (I was a teen).

Dent1 said:


> Where does it say that in the article. Quote it please


So you will read between the lines, but when I summarize the article, and the performance as a whole, I need to provide quotes for it? Riiiiight.

I'll get right on that boss.


----------



## Steevo (Oct 19, 2011)

AMD's Bulldozer Disappoints: Why That's Good News

By Bruce Gain, PCWorld

AMD's latest-and-greatest chip may lag slightly behind Intel's competing Core i5, as initial PCWorld performance-testing indicates. But these disappointing results hide benefits that AMD's "Bulldozer" FX CPU will likely offer, especially for cost-conscious small businesses. The cost is the same, and with the power consumption for running the processor, plus effectively cooling it for half the year (that 200 watts of heat has to go somewhere in the summer, so 200W of direct load on an A/C unit, times the number of processors and the efficiency of the whole machine), this actually costs more in ICO (Initial Cost of Ownership) and TCO (Total Cost of Ownership). So sorry, there is no savings. Besides, the fact is companies who are running enough of these to warrant worrying about thread/core count wouldn't be running this anyway; they run servers with two or four CPUs with multiple cores. So everything in this paragraph is untrue.


The issue is that most CPU-performance tests don't reflect the potential computational power offered by FX, which has up to eight cores, depending on the version. Sure, computationally-wise, preliminary synthetic tests, such as PCMark 7 and Cinebench, reflect real-world computing performance and indicate that the FX lags in comparison with Intel's Core i5. That's what PCWorld's tests showed after running the four-core FX-4100 through the paces. Actually, most of the high-end software they DIDN'T test with is capable of using more than one or two threads. A friend is attending the Art Institute in Portland for editing, and his father runs an editing studio with two- and four-way boxes for his editing with Premiere Pro: http://help.adobe.com/en_US/aftereffects/cs/using/WS9F936D13-E76A-41e4-BF8F-577132AB4723a.html Just in case you can't google it yourself.

Why can't the FX's multi-core design crank past Intel's Core i5 in these tests? Most of these tests are largely geared for CPUs with two or fewer cores. Software makers also have yet to bring to market applications that will take advantage of FX's multi-core design for multi-threading tasks. So we see above that software that actually USES multi-core/multi-threading could use all the cores, but the performance still sucks. http://www.tomshardware.co.uk/fx-8150-zambezi-bulldozer-990fx,review-32295-23.html Here they actually use the multi-core-capable software in both Windows 7 and 8 and... it slows DOWN. The games below do reflect some significant games, but with Windows 8 still in alpha, who knows what the actual performance will be in RTM.

The server equivalent of the FX, code-named "Interlagos"--meant to launch in a few weeks--already takes advantage of the eight cores to a greater extent than the desktop equivalent of the FX does, AMD says.

“AMD FX and Bulldozer CPU technology was optimized for multi-processing and multi-threaded applications,” Dina McKinney, corporate vice president, design engineering, for AMD said via email.

The eight cores also benefit from AMD’s Turbo Core feature, which automatically boosts the clock speed of different cores when others are not in use above and beyond their normal speeds. When Turbo Core kicks in, the standard clock speed of the FX-8150, the highest-end version of the FX, can speed from 3.6GHz to 3.6GHz. Turbo core would work, except it was active for those games where it still lost.

Turbo Core also does this while monitoring power consumption, and will lower the processing speed if overheating occurs (Intel's Turbo Boost has similar functionality).  Junk filler in an article that says nothing new, and means nothing.

So in the future, look out for potential video editing, engineering, and other software that might harness what eight cores and Turbo Core can offer in the desktop space. While it has yet to be proven, the FX with its eight cores could very well be ahead of its time. Except it has already been tested in the video editing, engineering and other software that people use, and it's still slower, despite the software using all the threads and cores available. In case you didn't know, second place in a race means you have lost. AMD lost with a chip that is 400MHz per core faster than its competitively priced chip. Lost.

For now, the FX-8150--the highest-end variation of AMD's FX--retails for $245, compared with $220 for its direct competitor, the Intel Core i5-2500K. So, if you're buying a new motherboard for a workstation and want to scrutinize the best value for your money, the Intel part will offer slightly better performance for most office applications for $25 less.

But, in the larger scheme of things, expect to see versions of the FX show up in future PCs that will at least compete against machines with Intel inside performance-wise, and may still beat them in price as well.

In the worst-case scenario, AMD's FX launch is disappointing in that the chip doesn't trounce competing Intel devices in performance. Regardless, as the two chip giants battle, they continue to attempt to outdo each other, which benefits consumers.

In the end, the fact that AMD has maintained market share in CPUs means that Intel has had to keep its prices in check to remain competitive. If Intel had a monopoly, as Microsoft has had with its PC operating system, then CPU prices would surely have been higher and Intel would have had less incentive to innovate. Without the competition, a pure Intel monopoly would have left workstation and server computing years behind what it is today.

Bruce covers tech trends in the United States and Europe. He can be reached through his Website at www.brucegain.com.

Bruce Gain needed an article to write that would get page hits to prove his worth at getting ad revenue; congrats on funding him. He has written an article without any actual tests being done, no hard data, no information. Just speculation that has been proven wrong.


All that being said, I'm sure it will be fun to play with, and yes, it is an upgrade from an X6 if you use software that will take advantage of it, but NO you CANNOT compare it to a 2600 for threaded performance. 

$314 for a 2600

VS

$279.99 for a 8150

About $34 less.

http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/10

At stock speeds under load it draws 82W more power, which means on average about $17 more per year in electricity for the AMD system, not including cooling costs.
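For what it's worth, that ~$17/year figure roughly checks out if you assume something like 5.7 hours a day under load at about $0.10/kWh; both of those inputs are my own assumptions, not numbers from the review. A minimal sketch of the arithmetic:

```python
# Rough annual electricity cost of an extra 82W load draw (the delta
# bit-tech measured at stock speeds). Hours/day and $/kWh below are
# illustrative assumptions, not figures from the review.

def extra_annual_cost(extra_watts, hours_per_day, dollars_per_kwh):
    """Extra electricity cost per year for a given additional power draw."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * dollars_per_kwh

# ~5.7 h/day under load at ~$0.10/kWh lands on the quoted ~$17/year.
print(f"${extra_annual_cost(82, 5.7, 0.10):.2f} per year")  # → $17.06 per year
```

A machine loaded 24/7 (think BOINC) at the same rate would be closer to $70/year, so the duty cycle dominates the estimate.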


----------



## eidairaman1 (Oct 19, 2011)

I added more to my comment



lilhasselhoffer said:


> My point, in a slightly less eluquent structure.
> 
> MS took the underlying structure of Vista (analgous to the BD architecture), and pulled out a functional OS called windows 7.  BD has, in my opinion, the same problems.  It is a good idea, but lacks performance where it counts.  Piledriver has the potential to use lessons from BD, and build something truly worthwhile.
> 
> Hopefully, Piledriver will fix BDs errors.  This, ideally, will be exactly like MS and windows 7.


----------



## xenocide (Oct 19, 2011)

@Steevo

I was enjoying the fact that it lacked any real evidence and made no god damn sense...


----------



## twilyth (Oct 19, 2011)

We made it 6 pages without that bothering anyone though.

Well . . . . anyone _else_ that is. :shadedshu


----------



## nt300 (Oct 19, 2011)

Steevo said:


> All that being said, I'm sure it will be fun to play with, and yes, it is a upgrade from a X6 if you use software that will take advantage of it, *but NO you CANNOT compare it to a 2600 for threaded performance.*
> 
> $314 for a 2600
> 
> ...


Which is why we hope Piledriver takes it out completely. The AMD FX-8150 does fare well against the 2600 and the 2500, but not in all benchmark tests.


----------



## Super XP (Oct 22, 2011)

"*AIDA64 Extreme Edition* is a streamlined Windows diagnostic and benchmarking software for home users. 

*FX-8150 (8 wins) vs. PII x6 1100T (4 wins)* 
Despite Bulldozer winning most of these benchmarks over the older x6 1100T, it still needs massive improvements in performance. 

*AIDA64 CPU AES (All Higher is Better)*
Core i5 2500K + z68 = 440,033
FX-8150 + 990FX = 346,134
PII x6 1100T + 990FX = 55,013

*AIDA64 CPU PhotoWorxx*
Core i5 2500K + z68 = 59,513
FX-8150 + 990FX = 48,200
PII x6 1100T + 990FX = 31,876

*AIDA64 CPU Queen*
Core i5 2500K + z68 = 35,087
FX-8150 + 990FX = 32,088
PII x6 1100T + 990FX = 32,256

*AIDA64 CPU Zlib*
Core i5 2500K + z68 = 217
FX-8150 + 990FX = 262
PII x6 1100T + 990FX = 245

*AIDA64 CPU JULIA*
Core i5 2500K + z68 = 11,689
FX-8150 + 990FX = 9,591
PII x6 1100T + 990FX = 12,659

*AIDA64 CPU MANDEL*
Core i5 2500K + z68 = 6,196
FX-8150 + 990FX = 4,793
PII x6 1100T + 990FX = 6,442

*AIDA64 CPU SinJulia*
Core i5 2500K + z68 = 3,272
FX-8150 + 990FX = 2,145
PII x6 1100T + 990FX = 3,215

*AIDA64 CPU HASH*
Core i5 2500K + z68 = 2,254
FX-8150 + 990FX = 3,672
PII x6 1100T + 990FX = 3,313

*AIDA64 CPU VP8*
Core i5 2500K + z68 = 3487
FX-8150 + 990FX = 3493
PII x6 1100T + 990FX = 3375

*AIDA64 Memory Copy*
Core i5 2500K + z68 = 20.361
FX-8150 + 990FX = 18.293
PII x6 1100T + 990FX = 10.93

*AIDA64 Memory Read*
Core i5 2500K + z68 = 18.892
FX-8150 + 990FX = 13.971
PII x6 1100T + 990FX = 8.773

*AIDA64 Memory Write*
Core i5 2500K + z68 = 18.853
FX-8150 + 990FX = 10.268
PII x6 1100T + 990FX = 7.08

*AIDA64 Memory Latency (Lower Better)*
Core i5 2500K + z68 = 46.65
FX-8150 + 990FX = 51.2
PII x6 1100T + 990FX = 51.2

http://www.bjorn3d.com/read.php?cID=2125&pageID=11106



> Conclusion
> 
> It is extremely refreshing to see a new architecture from AMD after such a long time without one. Bulldozer is the world's first eight-core consumer CPU, and is marketed towards the consumer and enthusiast gaming crowd. The new chip does have many attractive features: unlocked multipliers across all models, a base frequency of 3.6GHz (unparalleled even by Intel), and a whopping eight cores per CPU.
> 
> ...


----------



## CDdude55 (Oct 22, 2011)

> We also hope that AMD will change some of the Bulldozer core structure to better suit lightly-threaded applications and games, as this is unfortunately an area where Bulldozer falls flat.



Exactly, and that's why im waiting.


----------



## TheoneandonlyMrK (Oct 22, 2011)

No way, Intel's pricing is on the up, what a surprise! A kick in the balls for all, and for this thread 
At this rate BD will only have to compete with the i5s based on cost by Christmas anyway, and that's a fight it would win. Go Intel, lol... not


----------



## Super XP (Oct 22, 2011)

CDdude55 said:


> Exactly, and that's why im waiting.


I fully agree 100%. 
As for me I cannot wait, I need something to play with before I go nuts


----------



## twilyth (Oct 23, 2011)

Ars Technica had a great overview of BD's advantages and failings.  It's a little low-level in parts but is generally a very accessible article.  If you really want to get under the hood, they reference 2 other articles that are much more demanding.

http://arstechnica.com/gadgets/news/2011/10/can-amd-survive-bulldozers-disappointing-debut.ars/1

One thing of note on page 2 is that the rumors about a registry fix, while probably inaccurate if taken literally, do have some truth to them.



> On top of all this, the instruction cache within each module has a flaw that can inflict about a 3% performance penalty whenever threads from different processes are scheduled to the same module. A fix for the issue has been incorporated into the Linux kernel, but the status of other operating systems is currently unclear.


----------



## Inceptor (Oct 23, 2011)

What counts as an improvement in single-threaded performance?
How much?

Piledriver will obviously not catch up with Intel in that department, but how much is enough?
10% is not enough unless it's a low clocked PD that's benched first -- which may happen... or may not.
Then again, what is 'enough' depends on Ivy Bridge benchmarks.  
But, in my opinion, single threaded performance needs to increase 25%, and that only brings it on par with the current i3-21xx cpu  (the 2600K is 20% faster than that, single threaded), that is if we're talking fx-8150 to the PD equivalent.

10% is barely adequate, and downright dismal.  Any Phenom II with a 3.4GHz or higher clock speed is as fast or faster than the FX-8150 single-threaded.  The only way 10% is OK is if it's, let's say, a 3GHz PD that's 10% faster (single-threaded) than a 3.6GHz FX-8150.

From the enthusiast perspective, it has to be more than 10% single threaded increase, clock for clock.

From the long term perspective, maybe it's OK, if they've got more improvements down the line.
It's quite obvious the sh!tty single-thread performance delayed the FM2-socket 2nd-gen APU... imagine: two Bulldozer modules (4 integer cores) only being able to (at best) match the Athlon II cores in the FM1 APUs... so Piledriver modules are slotted in for FM2 now.
They need to EOL the Athlons and Phenoms because of this... the 2nd-gen APU will likely be the _performance equivalent_ of a current Phenom II + HD6850 overall, just with a better memory controller.
The APUs are potentially interesting, from a certain point of view, if they're unlocked.  Assuming the following:
1) the cpu power consumption problem is solved.
2) the 'power vs frequency vs performance' curve is fixed so that it doesn't top out at ~4.3GHz, leading to diminishing returns and absolutely huge power consumption.
3) Third or Fourth generation APUs (Steamroller or whatchamacallit modules) + more than 800 stream processors or their next gen equivalent.

Personally, in the short term (6-12 months), I'm interested to see reviews of the new Ivy Bridge Q77 & Z77 boards and processors.  In the long term, I'm interested to see the third and fourth gen APUs from AMD (especially unlocked versions)... unless Intel manages to somehow catch up in the GPU department.


----------



## twilyth (Oct 23, 2011)

Take a look at the article.  This is the sort of thing they talk about.  For example, K10 has 3 128-bit FPUs per core.  BD has 4 per module, which amounts to a reduction in the number per thread - well, sort of.  And this is what you see in the benchmarks.

The important thing for me was their explanation of how BD is an expression of AMD's philosophy.  They think that FP operations should be offloaded to the GPU.  That's why they had no problem forcing 2 cores to share FPUs.  Plus you need 2 FPUs for the new 256-bit AVX instructions.

Also BD was supposed to be much faster.  They made certain design sacrifices to achieve that but it didn't work.


----------



## Inceptor (Oct 23, 2011)

A nice article, it neatly sums up everything.  I don't dispute anything in the article.
But like I said, 10% is not enough, hence my interest in post-Piledriver APUs.
I can roll with a quad phenom II for a while, or a hexa core.
Or an Ivy Bridge.
I just don't see enough of a compelling reason to buy a Bulldozer cpu.  And at this point, even if AMD delivers the promised 10%, I don't see a compelling reason to buy a Piledriver either.  It's not looking to be a large enough performance boost in the 1 to 4 thread domain, which is essentially the core domain of typical PC usage.  

I wait, so that I may see.
Time will tell.


----------



## twilyth (Oct 23, 2011)

I think that's about right for the average user.  In fact, you're being pretty charitable considering the fact that on some benches it falls behind even a P2 X4 - not to mention X6's.

Had it come out at 4.4G, that would have helped, but you'd still have the same issues.

My main criterion is BOINC performance, and from the scientific benches in the Tech Report article, I have to say that I'd be hard pressed to go with BD.  

Of course I'll probably get one anyway - just like I still have my original Phenom 9600.  And for most users, the main issue will be price.  With decent performance and a catchy name, hopefully AMD can find a PR agency that can sell the shit out of those chips.


----------



## Goodman (Oct 23, 2011)

Inceptor said:


> A nice article, it neatly sums up everything.  I don't dispute anything in the article.
> But like I said, 10% is not enough, hence my interest in post-Piledriver APUs.
> I can roll with a quad phenom II for a while, or a hexa core.
> Or an Ivy Bridge.
> ...



Same here, I'll wait. 
There's nothing I can't do right now with my PII x4 for the next year or so. Although I don't really need more, I'm still going to buy a 1090T before Christmas, and I am not going to spend any money on a new board just to get the FX CPU... no thx!


----------



## eidairaman1 (Oct 23, 2011)

They will sell to big agencies, no doubt about it.



twilyth said:


> I think that's about right for the average user.  In fact, you're being pretty charitable considering the fact that on some benches it falls behind even a P2 X4 - not to mention X6's.
> 
> Had it come out at 4.4G, that would have helped, but you'd still have the same issues.
> 
> ...


----------



## Super XP (Oct 23, 2011)

And if the FX-8150 was to hit a price tag of $200, it would further fly off the shelves. Many current AM3 and AM3+ owners feel Bulldozer is a great upgrade despite its lackluster performance in many benchmarks. 

For me Bulldozer makes for a great performance boost over my current setup, and I already have 16GB of DDR3-1866 along with the ASUS Crosshair V Formula and Corsair H100. They are sitting on my desk.


----------



## Super XP (Oct 25, 2011)

*Core i7 2600 @ 4.70 GHz*
View attachment 44086

*AMD FX-8150 @ 4.70 GHz*
View attachment 44087


----------



## Disparia (Oct 25, 2011)

Damn! That i7 just got destroyed!


----------



## eidairaman1 (Oct 25, 2011)

Super XP what about the BD at 4.7GHz?


----------



## erocker (Oct 25, 2011)

Jizzler said:


> Damn! That i7 just got destroyed!



If getting destroyed equals achieving the same FPS while using way less power, cooler and more efficiently I don't know what destroyed means anymore.


----------



## LAN_deRf_HA (Oct 25, 2011)

Why are we looking at a single benchmark? When you have to try this hard to make something not look like a POS that should prod the logic center of your brain into giving up the ghost. Operative word there is "should".


----------



## Deleted member 67555 (Oct 25, 2011)

I'm seriously missing something here... like what y'all are talking about, because all I see is 4.7 for an Intel chip and 4.7 for an AMD chip


----------



## Damn_Smooth (Oct 25, 2011)

jmcslob said:


> I'm seriously missing something here...like what yall are talking about cause all i see is 4.7 for an intel chip and 4.7 for an AMD CHIP



You missed the thread that was deleted. Chew* posted a Dirt 2 benchmark from both of them at 4.7 and they were pretty much equal with BD winning by a few frames.


----------



## eidairaman1 (Oct 25, 2011)

So how do you determine if all cores were being utilized?


----------



## Damn_Smooth (Oct 25, 2011)

eidairaman1 said:


> So how do you determine if all cores were being utilized?



He had 4 cores disabled.


----------



## Super XP (Oct 25, 2011)

Damn_Smooth said:


> He had 4 cores disabled.


I believe he had 1 core disabled per module, so 1 core per module was enabled.
Pay close attention to the specs:
16GB of Dual Channel DDR3-1866



> AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.
> 
> What I'm about to deal with here is comparing 2CU/4C and 4CU/4C Bulldozers.
> (CU stands for Compute Unit, or equivalently 'Module')
> ...


http://www.xtremesystems.org/forums/showthread.php?275873-AMD-FX-quot-Bulldozer-quot-Review-(4)-!exclusive!-Excuse-for-1-Threaded-Perf.



Damn, who deleted my thread? I was in the middle of editing one of my posts.


----------



## Damn_Smooth (Oct 25, 2011)

Super XP said:


> I believe he had 1 core disabled per module, so 1 core per module was enabled.
> Pay close attention to the specs:
> 16GB of Dual Channel DDR3-1866
> 
> ...



Yeah, I know. He still needs more than a single Dirt 2 benchmark to prove anything, though.



> Damn, who deleted my thread? I was in the middle of editing one of my posts.



I don't know for sure, but I have a good guess.


----------



## nt300 (Oct 25, 2011)

*B3 Stepping Coming in Q1 2012*

Have you seen this? It seems only the high end will use the B3 stepping and its modifications and tweaks. If this CPU performs better than the 8150, I may pick it up.

AMD FX-8170
*B3 Stepping * 
3.90 GHz (Base Clock)
4.5 GHz (Turbo Core 3.0)
Q1 2012 (Release Date)

http://en.wikipedia.org/wiki/List_of_AMD_FX_microprocessors


----------



## Super XP (Oct 25, 2011)

Here's a small tool that can run an application using only one CPU core. Can this be used to measure each individual Bulldozer core and determine which ones run better than the others? If you can figure out which cores run best on their own, then you could perhaps disable the cores that are not optimal.

Just a thought.

LINK:
http://www.softpedia.com/get/Tweak/CPU-Tweak/Single-CPU-loader.shtml

Also check this out:

Maybe it needs a patch for core priority ordering.

Now : Core 0 --> Core 1 --> Core 2 --> Core 3 --> Core 4 --> Core 5 --> Core 6 --> Core 7
Right Priority : Core 0 --> Core 2 --> Core 4 --> Core 6 --> Core 1 --> Core 3 --> Core 5 --> Core 7
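The reordering above can be sketched as a tiny affinity helper. This is a hypothetical illustration only, not AMD's or Microsoft's actual scheduler fix: it lists the first core of each two-core module (the even indices, assuming AMD's even/odd core-to-module numbering) before the second cores, and packs a chosen set of cores into the bitmask format used by `SetProcessAffinityMask` on Windows or `sched_setaffinity` on Linux.

```python
def module_aware_order(n_cores=8):
    """Return core indices in module-aware priority order:
    the first core of each module (even indices) before the
    second core of each module (odd indices)."""
    evens = [c for c in range(n_cores) if c % 2 == 0]
    odds = [c for c in range(n_cores) if c % 2 == 1]
    return evens + odds

def affinity_mask(cores):
    """Pack a list of core indices into the bitmask form that
    OS affinity calls expect (bit N set = core N allowed)."""
    mask = 0
    for c in cores:
        mask |= 1 << c
    return mask

# For an 8-core Bulldozer this yields the "right priority"
# order quoted above: 0, 2, 4, 6, 1, 3, 5, 7.
print(module_aware_order())
# Restricting a process to one core per module:
print(hex(affinity_mask(module_aware_order()[:4])))
```

A thread pinned this way gets a module's shared front-end and FPU to itself, which is the effect the one-core-per-module benchmarks in this thread were trying to measure.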


----------



## nt300 (Oct 27, 2011)

You should pass that one to the FX overclocking thread so somebody can benchmark with it.


----------



## LAN_deRf_HA (Oct 27, 2011)

Herp derp. http://techreport.com/articles.x/21865


----------



## Super XP (Oct 28, 2011)

What do you expect from Bulldozer's performance? AMD's top dogs either fired or laid off the wrong people; they got rid of a lot of smart engineers who were responsible for the Athlon and the AMD64.

Bulldozer at launch should have had 20% better performance than it does today. But it doesn't, and now AMD is working on damage control, all thanks to the new CEO. Hopefully we will see better performance with the B3.


----------



## erocker (Oct 28, 2011)

Super XP said:


> Here's a small tool that can run an application using only one CPU core. Can this be used to measure each individual Bulldozer core and determine which ones run better than the others? If you can figure out which cores run best on their own, then you could perhaps disable the cores that are not optimal.



No, I think I'll either have AMD exchange my current FX chip for a "fixed" one or I'll send it back to the retailer for an exchange. I didn't buy an 8 core chip to use a 4 or 6 core chip.


----------



## Super XP (Oct 28, 2011)

erocker said:


> No, I think I'll either have AMD exchange my current FX chip for a "fixed" one or I'll send it back to the retailer for an exchange. I didn't buy an 8 core chip to use a 4 or 6 core chip.


Then run it with 8 cores. Though I hear your frustration. For now, put away the benchmarks, give yourself a nice healthy bump in clock speed, and game away until AMD fixes this.


----------



## Super XP (Oct 30, 2011)

*AMD FX - B3 Revision Q1 2012*

This guy is now posting on Newegg. I've already found some of his posts on Amazon and other related sites.


> ATInsider
> 10/25/2011 6:19:41 PM
> Tech Level:
> Ownership:
> ...


----------



## Super XP (Oct 30, 2011)

Here's another post by this guy.


> ATInsider
> 10/14/2011 9:02:55 AM
> Tech Level:
> Ownership:
> ...


AMD FX-8150 Zambezi 3.6GHz 8MB L2 Cache 8MB L3 Cac...


----------



## CDdude55 (Oct 31, 2011)

He just sounds like a rabid fanboy, and in his older posts it seems he has no ties to AMD at all either. Yeah, we get it: blame GF, blame Windows 7, AMD is "innovation to the max" and nothing is their fault. More bullshit.

The chip is shit and i'm not looking for more anecdotal testimonies, just fix it.


----------



## Winston_008 (Oct 31, 2011)

CDdude55 said:


> He just sounds like a rabid fanboy, and in his older posts it seems he has no ties to AMD at all either. Yeah, we get it: blame GF, blame Windows 7, AMD is "innovation to the max" and nothing is their fault. More bullshit.
> 
> .



Agreed. If AMD was going to do what he claims, AMD themselves should publicly tell everyone, but then that would mean no one buys the current FX series lol. They really put themselves in a position there.

Just my opinion, but what I think AMD should have done is squeeze whatever last IPC performance tweaks possible out of the Phenom II architecture, shrink it, and make it 8 cores, as a stopgap before releasing the true FX CPUs, which will be Piledriver.


----------



## ensabrenoir (Oct 31, 2011)

*didn't realize these threads were still alive.*



Super XP said:


> Here's another post by this guy.
> I blame the software developers along with Windows 7. *The OS is having a hard time trying to figure out what Bulldozer really is and is making a mess in how the CPU is actually suppose to work*



?????  You're not alone, Windows 7....... you're not alone.....

Geeez, BD is what it is. Want one? Buy one and just enjoy it. Nothing else matters. Spend your money as you please. BD doesn't have to be the fastest, most powerful, or most anything. Better is yet to come? Hooray, even better! But most importantly, be happy with it. It would be cool if everyone could share in your happiness, but if they don't... so what... just enjoy. No one has to prove to the rest of the world how great or fail BD is.


----------



## Frick (Oct 31, 2011)

No offense <sarcasm> but when I want updates on how a new chip will perform the first place I look is the Newegg review fields. </sarcasm>


----------

