
Why Bulldozer's spotty performance is good news.

  • Thread starter: twilyth
That's good. I'm one of those people who builds a gaming rig and keeps it forever, maybe upgrading GPUs, until stuff starts to fail. My last chip was an X2 5000+ BE. I may get a Thuban, and it should keep me happy for years to come.
 
I just bought one to carry me over until next year, when I upgrade my system again.

$189 for an 1100T at the egg was great. I can't wait to get it up and running to see what sort of clocks I can hit.
 
I don't get why AMD would go MCM after they criticized Core 2 for not being a "true quad core".

Thubans will make you smile. Even sitting next to my Sandy Bridge PCs, I still like using the AMD system better (better desktop feel).
 
Indeed, that will depend on how well Interlagos performs in practice.
On the other hand, Bulldozer does have some aggressive power-saving features to help with its power consumption.

If this site is to be trusted:
http://www.cpu-world.com/CPUs/Bulldozer/AMD-Opteron 6276.html
Interlagos will have a significantly lower TDP than Magny-Cours
(115 W TDP / 85 W ACP vs. 140 W TDP / 105 W ACP).

Also, the fact that AMD stated Interlagos will share the same platform as Magny-Cours means they should be within a similar power envelope.
Yes, they can be trusted; they were one of the first to break the Bulldozer story, along with several new Intel CPUs. It's a good site.
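The TDP numbers quoted above translate directly into rack-level power savings, which is where they matter for servers. As a rough sketch (the socket count and rack density below are illustrative assumptions, not figures from the thread):

```python
# Back-of-the-envelope: what the quoted TDP delta means at scale.
# Socket/rack counts are assumptions for illustration only.
MAGNY_COURS_TDP = 140   # W per socket, quoted above
INTERLAGOS_TDP = 115    # W per socket, quoted above
SOCKETS_PER_NODE = 4    # assumed 4P server
NODES_PER_RACK = 10     # assumed rack density

delta_per_node = (MAGNY_COURS_TDP - INTERLAGOS_TDP) * SOCKETS_PER_NODE
delta_per_rack = delta_per_node * NODES_PER_RACK
print(f"{delta_per_node} W less per node, {delta_per_rack} W less per rack")
```

Even under these modest assumptions, a 25 W-per-socket gap compounds quickly across a datacenter, before you count the matching reduction in cooling load.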
I don't get why AMD would go MCM after they criticized Core 2 for not being a "true quad core".

Thubans will make you smile. Even sitting next to my Sandy Bridge PCs, I still like using the AMD system better (better desktop feel).
Truthfully, it's called pure innovation. AMD's 8-core CPU is like a 4-core with Hyper-Threading, but AMD didn't want to copy Hyper-Threading, so they took the module approach with Bulldozer. I'd say they pushed innovation quite hard with this CPU; now all they need to do is learn from it and make it more efficient and faster.
 
Those two sentences contradict each other.
Anyway, I'm not a rabid fan of either company. Here's what I based my comments on:
http://www.anandtech.com/bench/Product/49?vs=80
Choose a Phenom II X4 965 or higher vs. a C2Q Q6600 or higher. Those are all stock speeds, which is what I was referencing. I never said anything about overclock vs. overclock, and I never said anything about clock-for-clock, since stock speeds differ between Intel and AMD.
And I never referenced game benchmarks.

Tell me how they contradict each other. What you don't see is that a Yorkfield, or even a Deneb, beyond 4 GHz (my X4 975 was at 4.3) is sufficient for most games. From there, a faster CPU does not make a world of difference; instead, you should spend the extra on a stronger GPU. And yes, you do sound like a fanboy. That's how you came across on the first page. With that aside, you're changing the subject from games. Further, a Q6600 does not compare to a Deneb; the Q9550/Q9650 etc. are all faster than their AMD equivalents, both at stock and OC'ed. See the HWC link; Phenom loses either way.

Anyway, Mustang against a Silverado... :D

http://www.youtube.com/watch?v=mJmum0N4Mxc&feature=related
 
I don't get why AMD would go MCM after they criticized Core 2 for not being a "true quad core".

Thubans will make you smile. Even sitting next to my Sandy Bridge PCs, I still like using the AMD system better (better desktop feel).

Exactly!! Even I don't know why, but Intels don't feel as fast in everyday applications.
 
Exactly!! Even I don't know why, but Intels don't feel as fast in everyday applications.

It's a placebo effect, like how lowering your memory timings makes you think it's faster. At the end of the day, the end result is the same no matter your instructions; the only difference between chips is per-MHz performance. It's still the same image on your screen. BTW, I found a Silverado a few months ago. A 350, black with red stripes. Wish I had a use for a pickup, though; they don't make those in hardtop.
 
It's a placebo effect, like how lowering your memory timings makes you think it's faster. At the end of the day, the end result is the same no matter your instructions; the only difference between chips is per-MHz performance. It's still the same image on your screen. BTW, I found a Silverado a few months ago. A 350, black with red stripes. Wish I had a use for a pickup, though; they don't make those in hardtop.

Nope, it's different all right.
 
Nope, it's different all right.

Well, IDK. Have a look at Windows benches then, like WinZip or PCMark. You can't tell exact performance without numbers; it's like trying to guess how long an obstacle course is just by looking at it. There's no way to tell which one is smoother without considering other factors.
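That point, that "feel" needs numbers behind it, is easy to act on yourself. A minimal sketch of a repeatable microbenchmark you could run on both systems and compare (the workload here is a stand-in; swap in whatever you actually do day to day):

```python
# Replace "it feels faster" with numbers: time a fixed CPU-bound task
# repeatedly on each machine and compare the medians.
import time
import statistics

def cpu_task():
    # Stand-in integer workload; substitute your own everyday task.
    total = 0
    for i in range(200_000):
        total += i * i
    return total

def benchmark(runs=20):
    """Return the median wall-clock time of `runs` executions, in seconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        cpu_task()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

if __name__ == "__main__":
    print(f"median run time: {benchmark() * 1000:.2f} ms")
```

The median is used instead of the mean so a background process spiking during one run doesn't skew the comparison; if the medians on two boxes are within a few percent, the "smoother" one really is a placebo.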
 
The only market this is good for is servers, and not much else, considering a dual-socket board with two quad-core prior-generation Opterons offers more performance at a lower price.
Are you sure? :wtf:
 
Are you sure? :wtf:

A pair of Harpertown chips beats a Nehalem in MaxxPi. On top of that, server environments are even more multi-threaded; parallelism is the key. You have thousands of I/O operations every second, so the more cores you have, the better. But the older platform pulls more power, so single-CPU racks would be easier to cool. It could make up the price difference through power and cooling; it depends on the scale of your environment.
 
Are you sure? :wtf:

A pair of Harpertown chips beats a Nehalem in MaxxPi. On top of that, server environments are even more multi-threaded; parallelism is the key. You have thousands of I/O operations every second, so the more cores you have, the better. But the older platform pulls more power, so single-CPU racks would be easier to cool. It could make up the price difference through power and cooling; it depends on the scale of your environment.

1 of ASUS KCMA-D8 ATX Server Motherboard Dual Socket C3...

2 of AMD Opteron 4122 Lisbon 2.2GHz 4 x 512KB L2 Cache ...

What do you get? Eight threads on real cores, 12 MB total of L3 cache, insane amounts of memory, a PCI-e slot to spare, and the chips will run cooler since there are two of them with more surface area.
 
I can vouch for this also

And I can vouch for the opposite. I've used several systems with Phenom IIs, and my SB setup feels a lot more responsive. The Phenom IIs ran a tad better than my Q6600, but SB definitely felt smoother. Opinion cannot be used as fact, sorry.
 
O.K., I'm revisiting this issue with regard to how well Bulldozer would scale if you throw tri-CrossFireX and/or SLI into the mix. Looking at the benchmarks, my estimation is that Bulldozer could have won at least 7 out of 10 gaming benchmarks, given that, for some reason, it was clocked about 500 MHz slower.

If it were clock for clock, Bulldozer would have won in the benchmarked games listed below, IMO:
1) Lost Planet 2 (Very High Image Preset)
2) Unigine Heaven Benchmark (v2.5/DX11/Shaders High/Tess Normal)
3) Aliens vs. Predator (Very High Image Preset)
Right now they were even on this one, but at even clocks Bulldozer would have killed this one big time.
4) Tom Clancy's H.A.W.X. 2 (Very High Image Settings)
5) Mafia 2
6) Metro 2033 (High Image Settings)
7) Dirt 3 (Very High Image Preset)
-----------------------
Total system power consumption at idle / load:
Bulldozer = 136 W / 723 W
Core i7-2600K = 186 W / 703 W

So who says Bulldozer sucks back a lot of power? I believe it all depends on your overall system, as you can see from the numbers above.
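To put those idle/load figures in perspective, here's a back-of-the-envelope sketch of what they could mean on a power bill. The wattages are the ones quoted above; the electricity rate and the hours-per-day usage split are assumptions for illustration:

```python
# Rough daily running cost from the quoted whole-system wattages.
# The $/kWh rate and usage pattern below are assumed, not from the review.
RATE = 0.12          # USD per kWh (assumed)
HOURS_IDLE = 20      # hours per day near idle (assumed)
HOURS_LOAD = 4       # hours per day under full load (assumed)

def daily_cost(idle_w, load_w):
    """Daily electricity cost in USD for the assumed usage pattern."""
    kwh = (idle_w * HOURS_IDLE + load_w * HOURS_LOAD) / 1000
    return kwh * RATE

bd = daily_cost(136, 723)   # Bulldozer system, idle / load
i7 = daily_cost(186, 703)   # i7-2600K system, idle / load
print(f"Bulldozer rig: ${bd:.2f}/day, i7-2600K rig: ${i7:.2f}/day")
print(f"difference over a year: ${abs(bd - i7) * 365:.2f}")
```

Because the Bulldozer system idles 50 W lower while loading only 20 W higher, a mostly-idle desktop actually favors it under these assumptions; a box under sustained load would tilt the other way.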

The issue I have with this review is that Bulldozer is about 500 MHz slower, and the reviewer still had the balls to call Bulldozer a disappointment :laugh:

http://www.tweaktown.com/articles/4353/amd_fx_8150_vs_intel_i7_2600k_crossfirex_hd_6970_x3_head_to_head/index3.html

Any issues Bulldozer has right now should be resolved with Piledriver (Q1 2012). I can't wait for it; it's much-needed competition.
 
You have to consider the OC'ing headroom and operating temps of a Sandy Bridge chip against BD. You can OC SB up to 5 GHz on a Zalman flower cooler. Also, from the benches I saw (most were from questionable sources, but still), there are times BD performs worse than even a Phenom. It's oriented toward multi-threading, not gaming; this has been hashed out endlessly, have a read back through the thread. New games like clock-for-clock performance, and older ones are all about single-threaded performance, and neither is what BD was designed for. I'd also much like to see AMD redesign the chip, but that's going to be hard at this rate. They can decrease leakage, though; GF is behind TSMC.
 
And yes, you do sound like a fanboy.

Good lord :laugh:
The system I have is the first time I've ever bought AMD :roll:

You know, I sat there thinking a few months ago; I was all ready to buy a 2500K rig with a 560 Ti or 570. All components, since I hadn't had a desktop in years. I buy good quality for the price, so it would have set me back $1700-1800, and I was all pumped about it too. Then I thought to myself, "Am I really going to use it for a lot of gaming? It's really tempting just to get it for the hardware, but can I justify it?" So I started looking around: what did AMD have, and what was on the horizon? The first thing I noticed was the price difference: cheaper, which suited me fine. The second thing I noticed was that CPU performance was lower than the high-end Intel CPUs. Then I thought about what I was going to do with the system...
How much power does it consume?
How much power does it consume when overclocked? (I was going to tinker, after all)
What is the optimal balance between system performance and power consumption, for me?
Taking that into account, what is the most cost effective choice, regardless of performance?
Is there any novelty in it? (no fun going down a well trodden path, if you ask me)
Is there any upgrade path if I take up the choice of novelty?

:roll:

I chose AMD, for the first time ever.
I knew I would not see 'ultimate' performance, but I would have an interesting experience trying to make it go faster.
I had no real knowledge of what BD performance would be, but I knew that it was likely to be in the 2500k range for the FX-8xxx, perhaps not at launch but with a revision or two.

It's likely my next rig will be Intel based, just for the contrast.

:laugh:
 
That seems like an uninformed fanboy post, TBH. If a company were to use multiple cores as a single, more powerful core, it'd be Intel; Intel has more resources than AMD, and always has. They're a bigger company to begin with. Add that games today barely scale past 4 cores. So, as said many times before, AMD's target is server environments. Intel, on the other hand, rebadges its quad/hex-core chips for servers, but it has the per-clock performance lead, so things are easy on the desktop side. I'm sure my older Westmere would trump any BD at 4.2, and it's the slowest of its line and a server chip (dual QPI). And while doing this, it still pulls less power.
 
Well, the observations concerning Intel's and AMD's resources are common knowledge.
AMD's target being server environments... hmm, they might want to increase their minuscule market share there, but I can't see it being their target.
Intel rejigs its core architecture for Xeons, then in the next generation takes the Xeon socket and rejigs the current-gen core architecture for it to create its high-end enthusiast platform, a market segment in which it has no competition. This is also common knowledge.

I have to say, I'm perplexed by your need to inflame argument where there is none.
 
Why? You quoted some random guy with no idea as a reliable source... honestly, what was up with those smilies? You certainly come off like an AMD fanboy; no offense, but it's obvious. And yes, AMD's target is servers: they had the lead with their quad-socket platforms, Istanbul chips were the first 6-cores, and now this. BD was designed with servers in mind. Hell, even Mr. Fruehe (AMD's server guy, JF-AMD) was prancing about this platform last year, saying how it would have solid single-thread performance. What happened? He lied, then it blew up.
 
In the worst-case scenario, AMD's FX launch is disappointing in that the chip doesn't trounce competing Intel devices in performance. Regardless, as the two chip giants battle, they keep trying to outdo each other, which benefits consumers.

best point in the article
 
All in all, it seems to me that AMD remains the cheaper, somewhat slower, yet still quite fast option. And hell, you can still overclock any AMD chip... I expect to see lots of us here at TPU buying lower-end BD chips and making them fly faster than the fastest stock BD chip, and the fastest stock Intel chip. That's still crazy performance.
 