# HD 7970: Bulldozer vs. Sandy Bridge vs. Nehalem



## W1zzard (Dec 29, 2011)

In our CPU scaling article we investigate gaming performance of AMD's latest Radeon HD 7970 flagship. The processors in our test group have been selected to match popular CPUs used by gamers and enthusiasts: AMD FX-8150 Bulldozer, Intel Core i5-2500K Sandy Bridge and Intel Core i7-920 Nehalem.



----------



## qubit (Dec 29, 2011)

Excellent review. 

Unfortunately, Bulldozer is what it is and it doesn't matter what system you put it in, you get about the same results.

I look forward to an updated version of this review once the Windows threading patch comes out.


----------



## Wrigleyvillain (Dec 29, 2011)

Thanks


----------



## Andrei23 (Dec 29, 2011)

Nice review, but Bulldozer is still a fail for gaming.


----------



## dj-electric (Dec 29, 2011)

2500K: Chuck Norris uses this CPU.
Thanks for the megabench, W1zz. I know how long and frustrating all those tests are.


----------



## Frick (Dec 29, 2011)

Andrei23 said:


> Nice review. but bulldozer is still a fail for gaming.



Naah. It's slower than SB but holds up well against... that older Intel chip. But yeah, if you're a hardcore gamer, Intel is the way to go.

BTW, it would be awesome if Battlefield 3 was to be included there.


----------



## TotalChaos (Dec 29, 2011)

Nice comparison. Looks like AMD still has work to do!


----------



## W1zzard (Dec 29, 2011)

Frick said:


> BTW, it would be awesome if Battlefield 3 was to be included there.



http://www.techpowerup.com/reviews/AMD/HD_7970_CPU_Scaling/5.html

how could you not find it in the alphabetically sorted list of games?


----------



## Frick (Dec 30, 2011)

W1zzard said:


> http://www.techpowerup.com/reviews/AMD/HD_7970_CPU_Scaling/5.html
> 
> how could you not find it in the alphabetically sorted list of games ?



Beats me. I don't really know what I was thinking there.


----------



## mastrdrver (Dec 30, 2011)

Very nice, thanks!

A request? How about the same but with CrossFire, comparing FX and Thuban?


----------



## Damn_Smooth (Dec 30, 2011)

mastrdrver said:


> Very nice, thanks!
> 
> A request? How about the same with with crossfire comparing FX and Thuban?



The results would be a minimal difference. It sounds like a waste of time.


----------



## JustaTinkerer (Dec 30, 2011)

While we are on the subject of requests, any chance of doctoring the results so Bulldozer wins something... anything? Put in a category like "comes in a nice looking box" and Bulldozer might win something... make me feel better about my choice, and I only have the 8120!
Can't believe the 920 beat it more times than not; the only thing I want to play now is 3DMark 11, and that's not even a game...
Could have saved myself some cash and just picked up a 920 for £80-90.
This patch better be some kind of magic. Merlin himself better be writing it.


----------



## newtekie1 (Dec 30, 2011)

I'd like to see this again, but this time with "modest" overclocks on the CPUs. Say the Sandy Bridge at 4.0 GHz, the Nehalem at 3.6-3.8 GHz, and the Bulldozer at 4.0 GHz+.

Nonetheless, a very nice review, and it really goes to show that upgrading CPUs isn't a priority right now.


----------



## mediasorcerer (Dec 30, 2011)

Good bit of info, thanx.


----------



## FreedomEclipse (Dec 30, 2011)

newtekie1 said:


> I'd like to see this again, but this time with "modest" overclocks on the CPUs.  Say the SandyBridge at 4.0GHz, the Nehalem at 3.6-3.8GHz, and the Bulldozer at 4.0GHz+.
> 
> None the less, a very nice review, and really goes to show that upgrading CPUs isn't really a priority right now.



I think there was a review of this not long before or after BD's release. It showed that, clock for clock, SB still beat the hell out of BD.

You can overclock it, but the performance gains wouldn't be anywhere near as big as Intel's.


----------



## fullinfusion (Dec 30, 2011)

newtekie1 said:


> I'd like to see this again, but this time with "modest" overclocks on the CPUs.  Say the SandyBridge at 4.0GHz, the Nehalem at 3.6-3.8GHz, and the Bulldozer at 4.0GHz+.
> 
> None the less, a very nice review, and really goes to show that upgrading CPUs isn't really a priority right now.


BD wouldn't show anything different than what the review shows... They're junk till about 4.8 GHz, and by that time you're looking for a way to cool the bitch.
And it shows not to bother upgrading CPUs now... only if you own a Bulldozer is it time to upgrade 


qubit said:


> Excellent review.
> 
> Unfortunately, Bulldozer is what it is and it doesn't matter what system you put it in, you get about the same results.
> 
> I look forward to an updated version of this review once the Windows threading patch comes out.


I wouldn't waste the man's time... It's going to take more than a patch to make this CPU shine. Toss it on the train tracks, and after it's rolled over and over is about the best it'll shine. 


Damn_Smooth said:


> The results would be a minimal difference. It sounds like a waste of time.


I 2nd that


----------



## DrunkenMafia (Dec 30, 2011)

Andrei23 said:


> Nice review. but bulldozer is still a fail for gaming.



:shadedshu

I have an FX for gaming and it's the best gaming machine I have ever owned.


----------



## fullinfusion (Dec 30, 2011)

^ sorry to hear that JJ 



FreedomEclipse said:


> I think there was a review of this not long before or after BDs release. It showed that clock for clock. SB still beat the hell out of BD.
> 
> you can overclock it but the performance gains wouldnt be anywhere near as big as Intels.


You got that right...
Look at the same clocks vs. clocks:
Bulldozer FX-8150 @ 5 GHz [screenshot removed]

Same clocks but on an SB 2700K [screenshot removed]
Both systems had the same memory speeds and timings, and the GPU was at 1000/1500 MHz for both runs... SB is a huge winner.


----------



## qubit (Dec 30, 2011)

Frick said:


> Beats me. I don't really know what I was thinking there.



You muppet! 

Don't worry, you're not alone in this...



fullinfusion said:


> I wouldnt waste the man's time... It's more then a patch thats going to make this cpu shine. Toss it on the train tracks and after its rolled over and over is about the best it'll shine.



Sure, it won't change dramatically, but I think it would be nice to have a quick version of it with just a couple of games, out of interest.

Toss it on the tracks, oh so cruel... lol


----------



## fullinfusion (Dec 30, 2011)

qubit said:


> Sure, it won't change dramatically, but I think it would be nice to perhaps have a quick version of it with just a couple of games out of interest, though.
> 
> Toss it on the tracks, oh so cruel... lol


lol but its true hahahahahaha


----------



## karnak (Dec 30, 2011)

Great article, but I would like to see a redux with Eyefinity on all games. That would be interesting.


----------



## Jstn7477 (Dec 30, 2011)

It's interesting that in some games Bulldozer only seems to do well at 2560x1600 while being slowest at the other resolutions. My only guess is that the card is taxed at that resolution, negating any CPU differences.

Glad I went 2600K + Z68, as it looks like I shouldn't need an upgrade for a while. AMD and I had some good years, but the performance has been "meh" for quite some time. I know my 4GHz 955BE wouldn't be able to hold 120FPS in Team Fortress 2 on my ASUS VG236H, but my Intel does 98% of the time (if a poorly optimized map comes up or my system has been on for too long it does drop). I'd be interested to see how BD does in that game, as it is still very CPU intensive and doesn't like AMD CPUs too much it seems.

Thanks for the review, W1zzard! Good to know I made a good investment.


----------



## GSquadron (Dec 30, 2011)

> So *if if* you still have a first-generation Core i7, there's no need for you to upgrade



Remove an "if", please. 
Nice idea, making a review like this one.
I don't understand why the Sandy Bridge Core i5 is so powerful when it has lower specs in the table.


----------



## eidairaman1 (Dec 30, 2011)

qubit said:


> Excellent review.
> 
> Unfortunately, Bulldozer is what it is and it doesn't matter what system you put it in, you get about the same results.
> 
> I look forward to an updated version of this review once the Windows threading patch comes out.



That, and the revised CPU arch (new stepping, or the Piledriver arch).

Bulldozer is far from a fail or horrible; many people just blow it out of proportion (all fanboys, which are blind anyway). (I build AMD or Intel machines for clients; it all depends on their budget.)

To tell you the truth, I really want to see Skulltrail 2 or dual-FX machines again.


----------



## 1Kurgan1 (Dec 30, 2011)

Nice review, though it would have been interesting to see an 1100T in there, just for comparison's sake against the 8150, as I hear everyone saying that the PIIs are faster.


----------



## INSTG8R (Dec 30, 2011)

Yep, rather interesting that the only time BD pulls ahead (by a VERY small margin) is at super high res. I am glad I went with my 2600K. I mean, look at the pic fullinfusion posted. C'mon, 1.54 V to hit 5 GHz?? That is just pure silliness. It is already a power soak at stock as it is...

It takes a brave man to pick up a BD and have aspirations of big overclocks. I really hope AMD fixes this with Piledriver. I said it when BD came out: they should have just scrapped it and worked on Piledriver, as that seems to be this CPU done right (if we are to believe the info we have).
Granted, I haven't been with AMD since my Opty 170 rig, and it was great to the point that it still felt "snappier" than the C2D E6600 I replaced it with. After seeing this I KNOW I chose wisely with SB.


----------



## Steevo (Dec 30, 2011)

Awesome review and enlightening. 

Makes me glad I didn't waste my money upgrading to a BD. With 3.7 GHz and a 4.1 GHz turbo I have the same clocks, and it still works well for what I need.


----------



## PopcornMachine (Dec 30, 2011)

Very interesting comparison.  Thanks for the review.

Makes me even happier about just buying a 2500K.


----------



## NC37 (Dec 30, 2011)

Ya know, I'll just say it: it is good that BD is at least delivering playable FPS in those benches, even if it doesn't win. I'll give it that much. 

Glad to see Hard Reset made the benching cut. 'Bout time. Great game and indie bunch. I knew from the demo that it would be a title I'd want to see benched.

What I'd really like to see is a comprehensive upgrade guide: take boards and setups from the last 5 years, run them on a stock set of benchmarks, and list which should be upgraded. 

I know others do per-year benching, but each year they change the benches. Really, if I want to know how a new system is going to run on, say, L4D or whatever else... I'll want to see it. But you see it one year, then the next, gone. Yeah, you know it will be better, but it's still nice to see it. Metro has been one of the only titles I've been able to count on being in the lists. Glad it still is.


----------



## Nirutbs (Dec 30, 2011)

Wow, Bull... nice. Thanks for the review, but Sandy is better.


----------



## Mussels (Dec 30, 2011)

Question for ya, W1zz: could you compile the results differently, so we have a DX9/DX11 comparison between the CPUs?


Basically it's to see if DX11's multithreading changes the results, since we have so many DX9/poorly threaded games which could skew the overall result towards the Intel CPUs, with their better performance per thread, vs. AMD's more threads.


Edit: to make more sense, one of your 'total performance averaged out' charts, but one for DX9 and one for DX11 titles.
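What such a split chart amounts to can be sketched in code. This is only an illustration of the idea: the game list and FPS numbers below are invented, not the review's actual results, and TPU's real averaging method may differ.

```python
# Sketch: split an overall "relative performance" chart by API.
# Data is hypothetical; each game maps to (API, {CPU: average FPS}).
from statistics import geometric_mean

results = {
    "Battlefield 3": ("DX11", {"FX-8150": 62.0, "i5-2500K": 71.0, "i7-920": 66.0}),
    "Hard Reset":    ("DX9",  {"FX-8150": 88.0, "i5-2500K": 112.0, "i7-920": 101.0}),
    "StarCraft II":  ("DX9",  {"FX-8150": 55.0, "i5-2500K": 80.0, "i7-920": 70.0}),
}

def per_api_relative(results, baseline="i5-2500K"):
    """Geometric mean of each CPU's FPS relative to a baseline CPU,
    computed separately for each API, so DX9 titles can't skew the
    DX11 average and vice versa."""
    out = {}
    for api in {api for api, _ in results.values()}:
        games = [fps for a, fps in results.values() if a == api]
        out[api] = {
            cpu: geometric_mean(g[cpu] / g[baseline] for g in games)
            for cpu in games[0]
        }
    return out

for api, scores in sorted(per_api_relative(results).items()):
    print(api, {cpu: round(s, 3) for cpu, s in scores.items()})
```

With real per-game numbers from the review, this would print one "averaged out" score set for DX9 titles and one for DX11 titles, which is exactly the two-chart split being asked for.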


----------



## Super XP (Dec 30, 2011)

Very nice review, thanks.
Bulldozer does alright at higher resolutions. Not bad at all; hopefully AMD can enhance, fix, and tweak Piledriver so it can better compete with Intel.
For me, going with an FX-8120 was a big performance improvement over my late Phenom II, and it did not cost me an arm and a leg for the best Socket AM3+ mobo and RAM.


----------



## OOZMAN (Dec 30, 2011)

DrunkenMafia said:


> :shadedshu
> 
> I have a FX for gaming and its the best gaming machine I have ever owned.



Does that make his statement invalid? No. My rig is the fastest I've ever used; does that make it the fastest rig _period_?


----------



## DOM (Dec 30, 2011)

Why is the RAM underclocked?


----------



## mastrdrver (Dec 30, 2011)

Damn_Smooth said:


> The results would be a minimal difference. It sounds like a waste of time.



With the great memory bandwidth of BD, I think you'd be surprised.


----------



## Damn_Smooth (Dec 30, 2011)

mastrdrver said:


> With the great memory bandwidth of BD, I think you'd be surprised.



Nowhere near as surprised as I was at the difference between SB and BD.


----------



## Delta6326 (Dec 30, 2011)

Great review! Thank you for taking time out of your personal holidays to make my day. 

Lol, I'm still going to wait till either Ivy Bridge or later to upgrade my Q6600.


----------



## magibeg (Dec 30, 2011)

I'm especially impressed with the results of the StarCraft II tests. The CPU must really be a limiting factor in that game.


----------



## ankan (Dec 30, 2011)

*Core 2 CPU comparisons*

How about comparing it with a Core 2 Quad CPU (e.g. Q6600 or QX9650)? Do I really need to upgrade the CPU if I play at 1920x1080 with all settings maxed?


----------



## theubersmurf (Dec 30, 2011)

Thank you for doing this review. It's nice to have reaffirmed with data what I suspected (and hoped, really): that I had no need to replace my i7 920. I knew, but looking at actual numbers was reassuring.


----------



## DanishDevil (Dec 30, 2011)

It's reviews like this that come out of nowhere that really impress me about TPU. Great work, W1zzard!


----------



## H82LUZ73 (Dec 30, 2011)

Dj-ElectriC said:


> 2500K, Chuck norris use this CPU.
> thanks for the megabench w1zz, i know how long and frustrating are all those tests



Chuck who? I think you need to read up on this guy; he kicked his ass more than once. Long live 'They Call Me Bruce'!!! Bruce would = the 2500K in performance and stamina.

Great fun you had, W1zz, over the Christmas holiday. We gonna have a CrossFire one too, master reviewer?


----------



## Lionheart (Dec 30, 2011)

I was expecting BD to be shit, but it performed better than I thought. The CPU is not as bad as everyone makes it out to be, but Sandy Bridge is still obviously the better choice ^_^


----------



## eidairaman1 (Dec 30, 2011)

If you think about it, that is still less voltage than what an Athlon XP required, even overclocked.



INSTG8R said:


> Yep, rather interesting the only time BD pulls ahead(by a VERY small margin) is at super high Res. I am glad I went with my 2600K I mean look at the pic fullinfusion posted. I mean c'mon 1.54V to hit 5Ghz?? That is just pure silliness. It is already a power soak at stock as it is...
> 
> Takes a brave man to pick up a BD and have aspirations of big overclocks. I really hope AMD fix this with Piledriver, I mean I said it when BD came out they should have just scrapped it and worked on Piledriver as that seems to be this CPU done right(if we are to believe the info we have)
> Granted I haven't been with AMD since my Opty 170 rig and it was great to the point that it still felt "snappier" than the C2D E6600 I replaced it with. After seeing this I KNOW I chose wisely with SB.


----------



## cdawall (Dec 30, 2011)

Lionheart said:


> I was expecting BD to be shit, but it performed better then I thought, the cpu is not as bad as everyone makes it out to be ,candy bridge is still obviously the better choice ^_^



I wonder how many people can honestly see that few percent difference in the chips...


----------



## AphexDreamer (Dec 30, 2011)

My BD has been doing very well for me and I have no regrets at all. I'd be interested in seeing a review when the Windows 7 BD fix patch comes out, and I like Mussels' idea of an average split between DX9 games and DX11 games, due to DX11's multithreading optimization.


----------



## INSTG8R (Dec 30, 2011)

eidairaman1 said:


> If You think about it that is still less voltage than what a Athlon XP required even overclocked.



Well, that's just it, isn't it? Generations later, and you'd think lower wattage would come with progress.


----------



## theJesus (Dec 30, 2011)

Frick said:


> Naah. It's slower than SB but holds up good against .. that older Intel chip. But yeah, if you're a hardcore gamer intel is the way to go.


You're forgetting that the 2500K is cheaper.  Seriously, I got mine _and_ a Z68 board for $300 flat, brand new.


----------



## eidairaman1 (Dec 30, 2011)

INSTG8R said:


> Well that's just it isn't? Generations later and you'd think lower wattage would come with progress.



so u want wine with that cheese lol


----------



## W1zzard (Dec 30, 2011)

Mussels said:


> question for ya w1zz: could you compile the results differently, so we have a DX9/11 comparison between the CPU's?



all benchmarks are DX11, except for Hard Reset and StarCraft II


----------



## qubit (Dec 30, 2011)

W1zzard said:


> all benchmarks are dx11. except for hard reset and star craft II



Then it will _definitely_ be interesting to see those benchies after the scheduling/threading patch comes out.


----------



## FreedomEclipse (Dec 30, 2011)

qubit said:


> Then it will _definitely_ be interesting to see those benchies after the scheduling/threading patch comes out.



I won't hold my breath; as many have said, you can't polish a turd.

It's like back in the day when game devs used to release dual-core optimisation patches for their games; I can't remember any of those patches really giving a huge boost in frame rate.

Programs are written for multiple cores these days. The multicore processors didn't need a patch, aside from AMD's dual-core optimiser.

What we have here is a multicore patch being written for a multicore program, because the CPU ain't multicore enough and handles like driving a car through pig shit up to your knees. Intel didn't need this patch, and I'm sure the multicore AMD CPUs before this one didn't need it either. 

In a small sense, I see this as cheating: their horse lost the race, so they bring it back to the stalls and pump it full of steroids.

The patch won't bring the results that people are hoping for, and any good results will be heavily skewed, making them either hard to believe or totally untrustworthy.


----------



## eidairaman1 (Dec 30, 2011)

zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz, broken record anyone?


----------



## qubit (Dec 30, 2011)

FreedomEclipse said:


> I wont hold my breath, as many have said - you cant polish a turd.
> 
> Its like back in the day when game devs used to release dualcore optimisation patches for their games, I cant remember any of them patches really giving a huge boost in frame rate
> 
> ...



Yes indeed, I agree that BD shouldn't need this, and with the other points you make, and I'm not holding my breath at all. It's just interesting to see the results, whatever they show - and you can bet the threads on this will be long and strongly worded.


----------



## Mussels (Dec 30, 2011)

W1zzard said:


> all benchmarks are dx11. except for hard reset and star craft II



I s'pose a clear majority is good enough, and that's probably why BD performed better than in some other reviews where people hated on it.


----------



## EarthDog (Dec 30, 2011)

I haven't read the thread, so my apologies if this was asked/answered before.

First, excellent review!

I would have loved to have seen a 2600K and an 1100T.
Was the BD patch for W7 used? I can't imagine it making much of a difference, but it's available, so...
The RAM speed looks underclocked. Not that it makes a difference in most cases, but... 1333 MHz on AMD BD and especially 1066 on SB are awfully low.

Again, sorry if I missed the info if it was listed anywhere else.


----------



## OOZMAN (Dec 30, 2011)

EarthDog said:


> I havent read the thread so my apologies is this was asked/answered before.
> 
> First, Excellent review!
> 
> ...



Meh... the 2600K performs the same as the 2500K in games... the 1100T, on the other hand, would've been interesting.


----------



## Lost Hatter (Dec 30, 2011)

Dj-ElectriC said:


> 2500K, Chuck norris use this CPU.
> thanks for the megabench w1zz, i know how long and frustrating are all those tests



Chuck Norris runs a 486 DX2-66, but gets the speed of an i7-3960X.


----------



## ensabrenoir (Dec 30, 2011)

*This will never end*

First it was wait till Bulldozer, now it's wait till the patch, then it'll be wait till Piledriver... and on and on. Bulldozer only fails when it's trying to beat Intel. It is what it is. If u have one... it's yours. Just enjoy the darn thing. Careful what u ask for: someone is gonna make a patch that pushes BD to the ultimate, and you're gonna have a gigabyte experience.


----------



## Lost Hatter (Dec 30, 2011)

Actually, Chuck doesn't need a CPU at all. His computer runs without one, out of fear for its life.


----------



## Yellow&Nerdy? (Dec 30, 2011)

Bulldozer has the highest default clock but still gets beat. This really shows that the 2500K is the gamer's choice of CPU: it's faster, cheaper, and consumes less power than Bulldozer.


----------



## nINJAkECIL (Dec 30, 2011)

Looking at the Skyrim results (all CPUs across all resolutions), I do wonder whether this 7970 is overkill even for the 2500K. From the lowest resolution up to the max, there's barely any difference. Is this a sign that this game eats CPUs for breakfast?


----------



## ensabrenoir (Dec 30, 2011)

Yellow&Nerdy? said:


> Bulldozer has highest default clock, but still gets beat. This really shows, that the 2500K is the gamer's choice for CPU: it's faster, cheaper and consumes less power than Bulldozer.



Careful, partner... the truth ain't welcome in these here parts... AMD has a patch for that kinda thinking.


----------



## Wyverex (Dec 30, 2011)

qubit said:


> Excellent review.
> 
> Unfortunately, Bulldozer is what it is and it doesn't matter what system you put it in, you get about the same results.
> 
> I look forward to an updated version of this review once the Windows threading patch comes out.


Ditto!


----------



## bear jesus (Dec 30, 2011)

I find it kind of funny that, to me (gaming at over 2560x1600), the difference between Sandy Bridge and Bulldozer is about 2 FPS down to 0.1 FPS. Who here can see a 0.1 FPS difference, or even a 2 FPS difference?

I admit that for low-res usage Sandy Bridge of course pulls way ahead, but who would be using Sandy Bridge and a *7970* at 1280x1024?

It seems that, for the most part, playing games at high settings and resolutions makes them so GPU-limited that it does not really matter what CPU is being used, so in that case the people who bought Bulldozer are getting the same gaming performance as the people who bought Sandy Bridge.


----------



## W1zzard (Dec 30, 2011)

bear jesus said:


> so GPU limited it does not really matter what CPU is being used



Well, the CPU just needs to be fast enough to make the rendering GPU-limited. Those are basically the two options: either CPU-limited or GPU-limited.
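That two-option model can be sketched as a toy calculation. All the numbers below are invented for illustration, not measured: the point is only that the frame rate you see is roughly the slower of what the CPU can prepare and what the GPU can render, so once the CPU is fast enough, the GPU sets the ceiling.

```python
# Toy bottleneck model of CPU-limited vs. GPU-limited rendering.
# All FPS numbers are invented for illustration only.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Each frame must be both prepared by the CPU and rendered by
    the GPU; the slower of the two stages sets the frame rate."""
    return min(cpu_fps, gpu_fps)

# GPU cost grows with resolution; CPU cost mostly doesn't.
gpu_fps_by_res = {"1280x1024": 220.0, "1920x1200": 130.0, "2560x1600": 75.0}

for cpu_name, cpu_fps in {"slower CPU": 90.0, "faster CPU": 140.0}.items():
    for res, gpu_fps in gpu_fps_by_res.items():
        print(cpu_name, res, effective_fps(cpu_fps, gpu_fps))
```

In this sketch both hypothetical CPUs land on the same number at 2560x1600 (the GPU is the bottleneck there), which matches the thread's observation that the CPUs converge at high resolutions while the gap opens up at low ones.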


----------



## bear jesus (Dec 30, 2011)

W1zzard said:


> well, the cpu just needs to be fast enough to make the rendering gpu limited. those are basically the two options. either cpu limited or gpu limited.



It seems that most modern CPUs are so powerful that, when it comes to gaming, the odds of being CPU-limited are pretty low, and using a higher resolution just pushes the GPU more. So what I am getting from this is that the testing might have shown more improvement for my situation if it had been done with an 8970 or 9970. 

Really, what this shows me is that all modern high-end CPUs are damn powerful, and that as far as high-resolution gaming goes I should concentrate way more on the GPU than the CPU.


----------



## Hayder_Master (Dec 30, 2011)

Great idea, awesome review, W1z.

Lol, that means if you try an SB i7, it puts every other CPU away in all the tests.

Congrats, Intel.


----------



## mlee49 (Dec 30, 2011)

I'd love to see more reviews like this. An awesome change from the standard product reviews.


----------



## TheoneandonlyMrK (Dec 30, 2011)

Firstly, good review, W1zzard. Must have taken some time, that one, but worth it in the end IMHO. Nice to see the BD bashers aren't bored yet too. Come on, peeps: BD ain't all that, for sure, but at higher gaming resolutions it isn't as bad as some of you are portraying. 10% down on a 2500K isn't that bad to me, and that's an average; as more multithreaded games come out, the cores will start to count more.

And to Freedom: Intel might not need a patch per se, but the recent BF3 hyperthreading jitters and the crappy chipset mobo recall inform me that the Intel world's not perfect either, and only a crack smuggler or meph fiend would think the sun shined out of either company's rear end anyway. Sort your bias, peeps; I vote you jump on the consumer's side as opposed to either co's. That way you can slate them both from time to time, like me.


----------



## cdawall (Dec 30, 2011)

FreedomEclipse said:


> I wont hold my breath, as many have said - you cant polish a turd.
> 
> Its like back in the day when game devs used to release dualcore optimisation patches for their games, I cant remember any of them patches really giving a huge boost in frame rate
> 
> ...



So if the patch works, it's cheating? But the dual-core optimiser was OK? Wtf?


----------



## trt740 (Dec 30, 2011)

Andrei23 said:


> Nice review. but bulldozer is still a fail for gaming.



Hardly. It appears to me that the CPU makes very little difference at most gaming resolutions, and it is more about the GPU than the CPU. 6-10 frames is not that big a deal, and anything over 40 FPS sustained makes zero difference. All these CPUs are fast as heck; buy the cheapest CPU and the most expensive GPU you can afford.


----------



## PopcornMachine (Dec 30, 2011)

trt740 said:


> Hardly, it appears to me that the cpu makes very little difference at most gaming resolutions, and it is more about GPU than cpu.  6-10 frames is not that big a deal and anything over 40 Fps sustained make zero difference. All these cpus are fast as heck buy the cheapest cpu and the most expensive gpu you can afford.



Yes, many of the tests were within the margin of error, or ties.

But when there was separation, it was the 2500K doing best.

And while these may not all have been noticeable differences, I'm happy I have the more powerful, cheaper, and more power-efficient CPU.

The main thing to realize is that not only does Bulldozer not beat the i5-2500K, it is pretty much matched by the i7-920. And it would have been nice to see a 6-core Phenom II in here; bet it would have given it all it could handle too.

Not what AMD was hoping for from its new flagship CPU.


----------



## qubit (Dec 30, 2011)

bear jesus said:


> I find it kind of funny that to me (gaming at over 2560x1600) the difference...



You game at a resolution I can only dream of.

I hate you.

End.

j/k 

Oh and 1280x1024? I play at 640x480 sometimes just for that 'pixellated experience'.


----------



## FreedomEclipse (Dec 30, 2011)

cdawall said:


> So if the patch works its cheating? But the dual core optimisor was ok? Wtf?



Of course... the dual-core optimiser was more of a patch that helped 'sync' some programs that had timing issues on dual cores.

A good example of this would be CoD:UO - the first map in the SP stage, where you're in the forest: sometimes things won't spawn, or scripts that are meant to happen at a certain time don't happen at all.

It didn't really boost the performance of Windows per se, but it helped programs that needed to bypass the Windows API for timing, thus fixing some of the problems some games were having.

Plus, even WITHOUT the dual-core optimiser patch installed, it hardly made a difference to benchmarks at all in my experience. You could live without it and never know it existed.

People are looking at this patch from M$ like it's some holy grail of patches and it's gonna bring balance to the force, when it's not going to do shit all. If M$ were truly capable of making a CPU shine just by patching it, then why the hell would we need such powerful processors??? We could buy any cheap-ass low-end multicore CPU and just have M$ write a patch that optimises the hell out of it.

With that being said... you could put Michael Schumacher in a banged-up Ford Cortina and he would drive like a boss and put on a world-class performance, but at the same time he wouldn't win any races, because a driver is only as good as his car, and the car totally sucks at this juncture of all junctures.


----------



## Damn_Smooth (Dec 30, 2011)

FreedomEclipse said:


> Of course.... Dual core optimiser was more of a patch that helped 'sync' some programs that had some timing issues when it came to dual core.
> 
> A good example of this would be CoD:UO - the first map in the SP stage where your in the forrest. sometimes things wont spawn or scripts that are ment to happen at a certain time dont happen at all.
> 
> ...



What a nicely written post just to say that BD sucks and it always will suck. I already knew that though.


----------



## FreedomEclipse (Dec 30, 2011)

the worst thing is im not trying to troll lol


----------



## theJesus (Dec 30, 2011)

nINJAkECIL said:


> Looking at the Skyrim result (all cpu across all resolution), I do wonder whether this 7970 is too much overkill even for 2500K. At the lowest resolution up until max resolution, there's barely any difference. Is this a sign that this game eats cpu for breakfast?


Skyrim _does_ eat CPU for breakfast.


----------



## Damn_Smooth (Dec 30, 2011)

FreedomEclipse said:


> the worst thing is im not trying to troll lol



I don't think it is possible to troll when talking about BD anymore. I was one of the biggest AMD fanboys on here until reality kicked me in the ass, and that reality is that BD isn't worth the price of admission for anyone who considers gaming their primary use. 

I wasn't trying to be sarcastic in my other post or anything; I did find your post very nicely written.


----------



## qubit (Dec 31, 2011)

FreedomEclipse said:


> the worst thing is im not trying to troll lol



I know you're not, as well, and you gotta love the rhyme in "troll lol".


----------



## Mussels (Dec 31, 2011)

qubit said:


> I know you're not, as well  and you gotta love the rhyme in "troll lol".



trollololololol.


----------



## qubit (Dec 31, 2011)

Mussels said:


> trollololololol.



Well, what can I say, but 'lol'?


----------



## TheoneandonlyMrK (Dec 31, 2011)

FreedomEclipse said:


> the worst thing is im not trying to troll lol



Well, you and Damn_Smooth are doing a terrible job of Not trolling.


----------



## Mussels (Dec 31, 2011)

Considering most people are all 'hey, BD isn't as bad as some people claim' and you two are all OMG ITS STILL LAAAAAAME...


It's kinda funny, actually. Sure, it's not the best bang for your buck, but not every CPU can be; there's always gotta be a winner, and more often than not it's not the most recently launched one.


----------



## FreedomEclipse (Dec 31, 2011)

Mussels said:


> and you two are all OMG ITS STILL LAAAAAAME



Quite the opposite, in fact. I didn't say it was totally lame; at best it's a passable attempt by AMD. I know the CPU was primarily designed with servers in mind, so it might excel at doing the things a server normally would do...

WITH THAT BEING SAID... even Intel's Xeon chips for servers are just totally awesome, and it doesn't matter if you use them in a server or a desktop: they will still perform really, really well and not require a patch to boost performance.


----------



## TheoneandonlyMrK (Dec 31, 2011)

that said amd amd amd amd amd haha not intel intel not  simple 

ps intel intel intel nvidia the end 

ford or GM what ya sayin?


----------



## Damn_Smooth (Dec 31, 2011)

Mussels said:


> considering most people are all 'hey, BD isnt as bad as some people claim' and you two are all OMG ITS STILL LAAAAAAME
> 
> 
> its kinda funny, actually. sure its not the best bang for your buck, but not every CPU can be? theres always gotta be a winner, and more often than not, its not the most recently launched one.



I'm the one saying it's lame, and it is. I'm not going to sugarcoat it, AMD dropped the ball on this for gamers and that's who they marketed it to. I didn't even care about the price performance, I wanted a decent upgrade for the Phenom II X6s and every review of the CPU that is out there shows that it was a sidegrade at best.

As you can see by the board I am presently using, I fell hard for their decent marketing. I take full accountability for that because I know it was my own stupidity, but I am going to call a turd a turd when talking about the CPU.


----------



## ensabrenoir (Dec 31, 2011)

Damn_Smooth said:


> I'm the one saying it's lame, and it is. I'm not going to sugarcoat it, AMD dropped the ball on this for gamers and that's who they marketed it to. I didn't even care about the price performance, I wanted a decent upgrade for the Phenom II X6s and every review of the CPU that is out there shows that it was a sidegrade at best.
> 
> As you can see by the board I am presently using, I fell hard for their decent marketing. I take full accountability for that because I know it was my own stupidity, but I am going to call a turd a turd when talking about the CPU.



Actually, BD's comic value is awesome. I would have been sure we'd have run out of jokes by now, or gotten tired of making them, but man... there just doesn't seem to be an end to it... what great headroom.


----------



## cdawall (Dec 31, 2011)

FreedomEclipse said:


> Of course.... Dual core optimiser was more of a patch that helped 'sync' some programs that had some timing issues when it came to dual core.
> 
> A good example of this would be CoD:UO - the first map in the SP stage where your in the forrest. sometimes things wont spawn or scripts that are ment to happen at a certain time dont happen at all.
> 
> ...



Because there is a huge difference between a cheap Q6600 and an FX. Thread scheduling should be completely different, as the way they are set up is completely different. I am not saying the patch will be the holy grail; I am saying it's not a "cheat", it's a fix, because the current task scheduling is not designed for a Bulldozer. This is no different from the task scheduling changes that have been built into the various versions of Windows. I would put money that a quad on Windows 95 would behave completely differently from Windows 98, 2K, NT, XP, Vista and 7.
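
The scheduling point can be sketched concretely. This is a purely illustrative model, not anything from the actual patch: on an FX-8150, the eight logical cores pair up into four modules, and a module-aware scheduler would spread light loads across modules before packing two threads onto one. The core numbering and the Linux affinity call below are assumptions for illustration only.

```python
# Illustrative sketch only: assumes an FX-8150-style layout where logical
# CPUs (0,1), (2,3), (4,5), (6,7) each share one module.
MODULE_SIZE = 2
logical_cpus = list(range(8))

# A module-aware placement for up to four threads: one core per module,
# so no two threads contend for the same module's shared front end.
spread_first = logical_cpus[::MODULE_SIZE]
print(spread_first)  # -> [0, 2, 4, 6]

# On Linux this placement could be applied with os.sched_setaffinity;
# left commented out since it only works on a machine with these CPUs:
# import os
# os.sched_setaffinity(0, set(spread_first))
```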


----------



## theJesus (Dec 31, 2011)

cdawall said:


> I would put money a quad on windows 95 would behave completely different from windows 98, 2K, NT, XP, Vista and 7.


Now I want to try installing 95 on my machine just for the lulz.


----------



## eidairaman1 (Dec 31, 2011)

theJesus said:


> Now I want to try installing 95 on my machine just for the lulz.



you will have to pull some of that RAM out; 98 supports 1.5 gigs at most, if I'm not mistaken


----------



## qubit (Dec 31, 2011)

eidairaman1 said:


> you will have to pull some of that ram out, 98 supports 1.5Gigs at most if im not mistaken



Oh crap, I've got 16 gigs of the stuff in my new Sandy build!  and 4 of it is in a single module. Also, Win95 had a bug in it, where it would crash at initial bootup if the processor was too fast - and we're talking about ancient Pentium 400MHz CPUs here, let alone the monsters we have today. But Microsoft had a fix for it - yay! Download the exe from their website, boot into your new Windows 95, and run it to automatically unpack the installer and apply the patch in one step... can you see a problem with this? I kid you not. :shadedshu


----------



## theJesus (Dec 31, 2011)

lol, there is a very good reason I said "try".  I'm not going to anyways; I know it would be hell just getting it to recognize even the simplest of things.


----------



## qubit (Dec 31, 2011)

theJesus said:


> lol, there is a very good reason I said "try".  I'm not going to anyways; I know it would be hell just getting it to recognize even the simplest of things.



Oh yeah, hell indeed. 

There is actually a way round this Win95 glitch: slow down the processor, or boot it up on a very slow processor, e.g. a 200MHz Pentium or a 486. Let Windows sort itself out with the drivers, then run the patch, then run on the fast system again and it should work. I know this works, because I remember going through this hassle a decade ago, lol. It's almost like Microsoft deliberately made the patch into a catch-22 situation to discourage Win95 use. Bad Microsoft! 

You remember what the problem was? A timing loop overflowed. Apparently, some little counter in the bowels of Windows would wrap around through zero if the CPU was too quick and cause a blue screen.
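
For the curious, that failure mode can be modelled in a few lines. This is a toy sketch, not Microsoft's actual code: a delay-calibration routine counts loop iterations per timer tick, and on a fast enough CPU the measured tick count comes out as zero, so the derived rate blows up.

```python
# Toy model of a delay-calibration bug (hypothetical, not the real Win95
# code): run a busy loop, measure elapsed timer ticks, then derive
# "iterations per tick" for later use in delay loops.
def calibrate(iterations_run, ticks_elapsed):
    # On a fast CPU the whole loop finishes before a single tick elapses,
    # so ticks_elapsed is 0 and this division crashes.
    return iterations_run // ticks_elapsed

print(calibrate(100_000, 5))    # slow CPU: 20000 iterations per tick
try:
    calibrate(100_000, 0)       # fast CPU: the "too fast to boot" case
except ZeroDivisionError:
    print("crash on a fast CPU")
```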


----------



## pantherx12 (Dec 31, 2011)

FreedomEclipse said:


> the worst thing is im not trying to troll lol



You realise a design like Bulldozer's has never been done before, right?

Whilst I'm not expecting miracles, a software patch could easily bump up results 10%.

Whilst it still won't be beating Intel chips, that's a nice boost, and it makes the FX-8120 very competitively priced.

Considering even the unfinished leaked patch (only half of it) got me a 7% higher Cinebench single core score, maybe even 10-15% wouldn't be crazy.


----------



## Hamlet (Dec 31, 2011)

The conclusion I take away from this well-done review is:
the CPU doesn't matter once you go to 1920x1200 (which should be the MINIMUM resolution for ANYONE who buys an HD7970...
I mean, come on?)

Once you go up there, the CPU becomes negligible.
I've read some previews which show that Windows 8 will be much better suited to taking advantage of the Bulldozer architecture... so I guess the results will get even closer...


Suggestion:
In the summary, I'd have loved to see a comment on the fact that some games seem to be so tight that there is barely a difference, while others show much larger gaps between the 3 CPUs.
Are those the (shoddily programmed) games that only use 1 core (so that Intel wins by IPC)?


As I'm not in the market for 500€ video cards, I'll pass on the HD7970 anyway.
Looking forward to seeing the new 7870/7850!


----------



## bear jesus (Dec 31, 2011)

qubit said:


> You game at a resolution I can only dream of.
> 
> I hate you.
> 
> ...



Ah, 640x480, reminds me of the 90's 

I only have 1,196,000 more pixels than 2560x1600  but it cost less than £600 for the 6970 and 3 22" monitors, and many spend that on a GPU or monitor alone. I admit the screens use very cheap panels, so they don't compare to a £600 monitor, but the 5,292,000 colourful flashing pixels filling my vision more than make up for it to me /end gloat  
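
For anyone checking the maths, those pixel counts do work out, assuming three 1680x1050 panels (the panel resolution is my assumption; it's what makes the numbers add up):

```python
# Verifying the quoted pixel counts (1680x1050 per panel is assumed)
triple = 3 * 1680 * 1050   # three 22" screens side by side
single = 2560 * 1600       # one 30" 2560x1600 screen
print(triple)              # -> 5292000 pixels in total
print(triple - single)     # -> 1196000 more than the single screen
```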

But I have dug myself an expensive hole. As the results show, the higher the res, the less the CPU plays a part and the more GPU power is needed, and in the case of DX11 games a 6970 just can't cut it with everything including AA maxed out. Even the 7970 seems like just about enough: with a little overclock, at my silly res, it catches up with the 6990 and 590, yet all three are only just enough to max out some DX11 games above 2560x1600, and in some cases they are still not enough. So it will take at least a water-cooled 1.1GHz+ 7970 to dig me out of this. 

OK I'm a very lucky bear


----------



## pantherx12 (Dec 31, 2011)

bear jesus said:


> Ah 640x480, reminds me of the 90's
> 
> I only have 1,196,000 more pixels than 2560x1600  but it cost less than £600 for the 6970 and 3 22" monitors, many spend that on a GPU or monitor alone, i admit the screens use very cheap panels so don't compare to a £600 monitor but the 5,292,000 colourful flashing pixels filling my vision more than make up for it to me /end gloat
> 
> ...




You've just seen the news post about the Sapphire cards, right?

Try 1.3GHz+ on your 7970 


----------



## bear jesus (Dec 31, 2011)

pantherx12 said:


> You just seen the news post about the saphire cards.
> 
> Try 1.3ghz + on your 7970



I just noticed it after posting


----------



## pantherx12 (Dec 31, 2011)

Someone needs to get me one of these cards, or give me a job so I can get one myself!

This + a new WC setup would last me a pretty long time, I reckon 

5GHz+ Bulldozer and 1.45GHz* 7970 ploz.

*Or as close to that as possible.

These days unless I get a 50% overclock I'm not happy


----------



## Hamlet (Dec 31, 2011)

Hey bear!
Computerbase had just the test you need: http://www.computerbase.de/artikel/grafikkarten/2011/test-amd-radeon-hd-7970/18/#abschnitt_eyefinity


----------



## cdawall (Dec 31, 2011)

W1z is there any chance we could see a crossfire scaling benchmark between the 3 processors you chose? Maybe toss in a Thuban for shits and giggles?


----------



## virtue (Jan 1, 2012)

meh, I would have liked to see the Intel CPUs clocked at the same frequency (let's say 4GHz) to show the FPS difference.
everyone knows the 2500K is superior to the old gen already; comparing it with the older gen at higher frequencies was just pointless. basically skimmed the entire article because of that


----------



## cdawall (Jan 1, 2012)

virtue said:


> meh, would have liked to see the intel CPUs clocked at the same frequency (let's say 4ghz) to show the FPS difference
> everyone knows 2500k is superior to the old gen already, comparing it with older gen at higher frequencies was just pointless. basically skimmed the entire article because of that



He compared stock clocks vs stock clocks. It's a bit unfair to compare them all at 4GHz; if you are going to do that, go for normal OCs on all three: 3.6-3.8GHz on the 920, 4-4.4GHz on the i5 and 4.4-4.8GHz on the AMD.


----------



## JustaTinkerer (Jan 1, 2012)

cdawall said:


> He compared stock clocks vs stock clocks. Its a bit unfair to compare them all at 4ghz if you are going to do that go for normal OC's on them all 3.6-3.8ghz on the 920, 4-4.4ghz on the i5 and 4.4-4.8ghz on the AMD.



very well said.


----------



## Dent1 (Jan 1, 2012)

Bulldozer's gaming performance is on par with the i7 Nehalem. That is pretty respectable.


----------



## FreedomEclipse (Jan 1, 2012)

Dent1 said:


> The Bulldozer's gaming performances is on par with the i7 Nehalem. That is pretty respectable.



depends how you see it.... BD's architecture has been in development for around 4 years or longer. Nehalem was released back in November 2008, but there was an early preview at IDF sometime in 2007.

It's taken them that long to make a processor that competes with something that was probably finished & ready to ship back in 2007/08?? Hardly the game changer that everyone predicted in the early days.

I think Thuban fills that performance gap pretty well.


----------



## Makaveli (Jan 1, 2012)

FreedomEclipse said:


> depends how you see it.... BD's architecture has been in development for around 4years or longer. Nehalem was released back in November 2008 but they had an early preview in IDF sometime in 2007.
> 
> Its taken them that long to make a processor that competes with something that was probably finished & ready to ship back in 2007/08?? Hardly the game changer that everyone predicted in the early days.
> 
> I think Thuban fills that performance gap pretty well.



I agree, BD's numbers are not respectable at all. 

It only closes those gaps when it's GPU limited.

And the 920 is a 2008 processor; it's downright embarrassing if you ask me.


----------



## Makaveli (Jan 1, 2012)

theubersmurf said:


> Thank you for doing this review. It's nice to have reaffirmed with data what I suspected (and hoped really) that I had no need to replace my i7 920. I knew, but looking at actual numbers was reassuring.



Agreed!

I'm currently at 3.8 on my 920, and I'm skipping all of SB and may even skip IB and go straight to Haswell.


----------



## Dent1 (Jan 1, 2012)

FreedomEclipse said:


> depends how you see it.... BD's architecture has been in development for around 4years or longer. Nehalem was released back in November 2008 but they had an early preview in IDF sometime in 2007.
> 
> Its taken them that long to make a processor that competes with something that was probably finished & ready to ship back in 2007/08?? Hardly the game changer that everyone predicted in the early days.
> 
> I think Thuban fills that performance gap pretty well.



The way I see it is:

I'm not too fussed about 2007/2008, only right now. And if I can buy a CPU with gaming performance on par with an i7 Nehalem for cheaper, I'm all for it. 

OK, Bulldozer had delays, but looking back at this thread people are calling it "junk" and whatnot. To say Bulldozer's gaming performance is weak is to say the i7 Nehalem's gaming performance is weak too, which is ridiculous.

It's OK to be disappointed, we are all disappointed, but it doesn't mean that disappointment automatically translates into a bad overall product. It has good (not fantastic) single threaded gaming performance, and fantastic multi threaded performance, on a platform with an upgrade path. This is a good thing.


----------



## Makaveli (Jan 1, 2012)

virtue said:


> meh, would have liked to see the intel CPUs clocked at the same frequency (let's say 4ghz) to show the FPS difference
> everyone knows 2500k is superior to the old gen already, comparing it with older gen at higher frequencies was just pointless. basically skimmed the entire article because of that





cdawall said:


> He compared stock clocks vs stock clocks. Its a bit unfair to compare them all at 4ghz if you are going to do that go for normal OC's on them all 3.6-3.8ghz on the 920, 4-4.4ghz on the i5 and 4.4-4.8ghz on the AMD.



Here you go

http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/6

The 920 at 4GHz is still beating the BD chip. The only time BD wins is in Cinebench and some rendering, so it doesn't really change much.


----------



## Steevo (Jan 1, 2012)

W1zz.....



Bench ALL the options.......lol




You should be able to ascertain the same poor performance from BD at anything but GPU-limited resolutions, and stacking up filter options to get the same GPU-limited results at lower resolutions won't make BD any better of a processor.


----------



## Makaveli (Jan 1, 2012)

Dent1 said:


> The way i see it is:
> 
> It has good (not fantastic) single threaded gaming performance, and fantastic multi threaded performance and on a platform with an upgrade path, this is a good thing.



How is having single threaded performance that is lower than the previous chip it is suppose to replace "good"?


----------



## TheGuruStud (Jan 1, 2012)

Makaveli said:


> Here you go
> 
> http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/6
> 
> The 920 at 4ghz is still beating the BD chip. The only time it wins is with cinebench and some rendering so it doesn't really change much.



I see, so when it's doing the work it's designed to do, with the new instructions disabled (b/c of ICC), in a super duper Intel-optimized benchmark (Cinebench), and it still beats Intel, then it doesn't matter. 

I think you and Tom over at Tom's Hardware would get along nicely. Maybe you can both play in traffic.

I know, AMD can never use SSE3/4, AVX or FMA b/c that just wouldn't be fair.


----------



## Steevo (Jan 1, 2012)

TheGuruStud said:


> I see, so when it's doing work it's designed to do with new instructions disabled (b/c of ICC) and super duper intel optimized benchmark (cinebench) and it still beats intel, then it doesn't matter.
> 
> I think you and Tom over at Tom's Hardware would get along nicely. Maybe you can both play in traffic.
> 
> I know, AMD can never use SSE3/4, AVX or FMA b/c that just wouldn't be fair.



If no real life software uses it then does it matter?


This is like saying a piano goes 0-60 as fast as a sports car, when you drop it from a helicopter. 

A great fact, but worthless in real life.


----------



## TheGuruStud (Jan 1, 2012)

Steevo said:


> If no real life software uses it then does it matter?
> 
> 
> This is like saying a piano goes 0-60 as fast as a sports car, when you drop it from a helicopter.
> ...



Hmm, real world apps on an OS that doesn't suck use it, but then again, that OS isn't one of the evil monopolies, so I guess that doesn't count either (Linux).

Did you ever stop to wonder why this is? Is it perhaps that Intel has been paying off everyone for the last 20 years? During the K7 generation AMD had about 27% market share. It was a very strong CPU, but the Athlon 64 was even better, and what happened? They LOST market share, all the way down to about 15%. Intel doesn't play games. They lie, cheat, bribe, steal, threaten and strong-arm anyone in their way. That lawsuit payout was a joke, and so are the results. Almost nothing has changed.

It's true, AMD doesn't help themselves by their position with software devs, but Intel actively makes their compiler cripple AMD chips and then pays devs where appropriate to make their CPUs look massively better, regardless of actual performance (tons of synthetic benchmarks that are used by the masses). Anywhere you see their logo plastered has effectively been bought.

Oh, and for all you apologists, since when are SSE3/4 and AVX not used in real world apps? FMA is the only new kid on the block (and Intel lied about using FMA4, then switched to FMA3 to screw AMD).


----------



## Makaveli (Jan 1, 2012)

Steevo said:


> If no real life software uses it then does it matter?
> 
> 
> This is like saying a piano goes 0-60 as fast as a sports car, when you drop it from a helicopter.
> ...



thank you for pointing out how stupid that point was.

I lol'd at his personal attack too, very childish.


----------



## Makaveli (Jan 1, 2012)

TheGuruStud said:


> Hmmm, real world apps on an OS that doesn't suck use it, but then again, that OS isn't one of the evil monopolies, so I guess that doesn't count either (linux).
> 
> Did you ever stop to wonder why this is? Is it perhaps that intel has been paying off everyone for the last 20 years? During the K7 generation AMD had about 27% market share. It was a very strong CPU, but the athlon 64 was even better and what happened? They LOST market share all the way down to about 15%. Intel doesn't play games. They lie, cheat, bribe, steal, threaten and strong arm anyone in there way. That lawsuit pay out was a joke and so are the results. Almost nothing has changed.
> 
> It's true, AMD doesn't help themselves by their position with software devs, but intel actively makes their compiler cripple AMD chips and then pays devs where appropriate to make their CPU look massively better, regardless of its actual perfromance (tons of synthetic benchmarks that are used my the masses). Anywhere you see their logo plastered has effectively been bought.



I don't know what bridge you've been living under, but life isn't fair.


----------



## TheGuruStud (Jan 1, 2012)

Makaveli said:


> I don't know what bridge you been living under but life isn't fair.



What's your address? I want some of your belongings. I'm going to stroll in, take what I want, then leave. Don't bother calling the cops, I will just pay them off.

I love your attitude. Keep supporting crooks. We see how well that's working for the world.


----------



## Makaveli (Jan 1, 2012)

I didn't create the situation we are in; this is the way of the world. And it will continue to be this way well after we are both gone. 

I, however, work within the system we have instead of crying over something I cannot control.


----------



## cadaveca (Jan 1, 2012)

TheGuruStud said:


> What's your address? I want some of your belongings. I'm going to stroll in, take what I want, then leave. Don't bother calling the cops, I will just pay them off.
> 
> I love your attitude. Keep supporting crooks. We see how well that's working for the world.



Intel OWNS x86.

AMD OWNS x64.

They both pay each other for every processor sold. Every Intel CPU sold gets AMD cash, and vice versa.

While Intel and AMD may seem to compete for our wallets, really they are not in direct competition. AMD has a limited number of chips that they can produce, as does Intel. Nearly every single chip AMD produces is sold, and likewise for Intel. AMD originally simply bought a segment of Intel's market, via licensing, that Intel could not produce chips for.

AMD could have a chip that was 10x as fast as SB, and everyone would be just as pissed as they are at how "slow" BD is, because they'd be so rare that they'd sell for thousands each. While you may not like the business practices, AMD and Intel are actually PARTNERS in producing computing solutions for the masses. They share so much tech that the only real differences between them are their core designs, and the silicon they're built on.

Anyway, nice to see the older Intel chips can still hang out with the new big boys. No need for X58 users to upgrade, if gaming, that I can see. Sure, SB is faster, but X58 is no slouch, either.


----------



## ensabrenoir (Jan 1, 2012)

^ + 1 
The only major difference I found is that it is easier to poke fun at AMD... it gets a bigger rise out of their fan base; Intellers just laugh along with the jokes. But seriously, AMD fails badly at being Intel. Let AMD be AMD... they're excellent at that, and let Intel be Intel.


----------



## Dent1 (Jan 1, 2012)

Makaveli said:


> How is having single threaded performance that is lower than the previous chip it is suppose to replace "good"?



Just because Bulldozer isn't consistently faster than the chip it replaces in singlethreaded activities doesn't mean it can't still be good; it means it isn't "as good", but it's still good. And like I said earlier, overall it's a good processor when taking into consideration its fantastic multi threaded performance and upgrade path.

It's like comparing the 5850 with a 6850. Just because the 5850 is faster, it doesn't mean the 6850 isn't "good". They are both good.


----------



## Makaveli (Jan 1, 2012)

Dent1 said:


> Just because the Bulldozer  isnt consistantly faster than it's replacement in singlethreaded activities doesnt mean it can't still be good, it means it isnt "as good" but its still good. And like I said earlier overall it's a good processor when taking into consideration of its fantastic multi threaded performance and upgrade path.
> 
> It's like comparing the 5850 with a 6850. Just because the 5850 is faster it doesnt mean the 6850 still isnt "good". They are both good.



The 6850 is not a replacement for the 5850 in AMD's lineup; that would be the 6950, which is faster than the 5850.

If the 6970 were slower than the 5870, everyone would say it was a failure, so why does Bulldozer get a pass?


----------



## Dent1 (Jan 1, 2012)

Makaveli said:


> The 6850 is not a replacement for the 5850 in AMD's lineup that would be the 6950 which is faster than the 5850.
> 
> if the 6970 was slower than the 5870 everyone would say it was a failure so why does bulldozer get a pass?



My argument was based on W1zzard's review. The Bulldozer performs on par with the i7 Nehalem. If Bulldozer's single threaded performance is so subpar, why does Nehalem get a pass? Sure it's older, but I bet 7/10 of TPU members still own an i7 920 or had one, and they wouldn't classify it as a shitty performer at all. 

It's not that I'm asking for a pass, I'm just pointing out that people buy what's suitable for them. Some people don't mind average gaming performance if they can encode their videos super quick; it doesn't mean the CPU is rubbish, just that it's not for you.


----------



## Makaveli (Jan 1, 2012)

Dent1 said:


> My argument was according to Wizzards review. The Bulldozer performs on par with the i7 Nehalem. If Bulldozers single threaded performance is so subpar, why does the Nehalem get a pass? Sure its older but I bet 7/10 of TPU members still own a i7 920 or had one, and they wouldn't classify it as a shitty  performer at all.
> 
> It's not that I'm asking for a pass, I'm just pointing out that people buy whats suitable for them. Some people dont mind average gaming performance if they can encode their videos super quick, doesnt mean the CPU is rubbish, just means its not for you.



In this review it only performs on par due to GPU limitations at the higher res. If you look at the lower res it loses. Nehalem doesn't need a pass because it offers superior IPC to Bulldozer. When you start looking at benchmarks outside of games you will see this. I am a TPU user with a 920 and I classify it as a better performer than BD, even more so at my overclocked speed.

And I agree people should buy what suits their needs and purchasing level. I didn't say it's total rubbish, but I think Bulldozer is still a flop because it doesn't offer performance much greater than Phenom II or Intel's 3-year-old Nehalem CPUs.

When you spend 4+ years designing a CPU and it doesn't beat your previous model, that is bad no matter how you spin it. And yes, it offers you an upgrade path but not a performance improvement, so why bother upgrading?


----------



## Hamlet (Jan 2, 2012)

In an equation such as performance/€ there are 2 cogs you can tweak, not just 1.
One way is obvious: double the performance.
But there is also another way: halve the price!

This is what happened with the iteration from HD5850 --> HD6850/HD6870.
The HD6870 was only slightly faster than the HD5850, but the price was smashed in half.
In my book that's just as good, and it led me to buy one.

Of course there are also other factors playing into that equation (e.g. power consumption), where the HD6870 also did better than its predecessor.


----------



## Dent1 (Jan 2, 2012)

Makaveli said:


> When you spend 4+ years designing a cpu and it doesn't beat your previous model this is bad no matter how you spin it. And yes it offers you an upgrade path but not a performance improvement so why bother upgrading?



Fair enough, the development of Bulldozer took far too long.

But to say its upgrade path is "not a performance improvement" is a misconception. Look at the FX-8150 reviews in encoding, 3D rendering, compression and general productivity, and you'll see that for people coming off an Athlon II or Phenom II X4/X6, the upgrade path does yield a big improvement.

Not all upgrades are based on gaming. I can go from a low end Athlon II X4 to an FX-8150 spanking an i5 2500K in rendering, just by dropping in a new CPU. The perfect upgrade path for a 3D animator.


----------



## Makaveli (Jan 2, 2012)

Dent1 said:


> Fair enough the development for bulldozer took far too long.
> 
> To say it's upgrade path is "not a performance improvement" is a misconception. Look at the FX-8150 reviews in encoding, 3D rendering, compression and general productivity and you'll see that for people coming off an Athlon II, Phenom II X4/X6 and you'll see the upgrade path does yeild an big improvement.
> 
> Not all upgrades are based on gaming. I can go from a low end Athlon II X4 to FX-8150 spanking an i5 2500k in rendering, just by dropping in a new CPU. The perfect upgrade path for a 3D animator.



yes, in those areas it will show an improvement over Phenom.

The FX-8150 will indeed beat a 2500K in rendering, but it will not beat a 2600K, which is in the same price bracket.

with an AM3 board you don't really have much of a choice when upgrading. And that would be the only reason I would buy one, because there is no other option.


----------



## Dent1 (Jan 2, 2012)

Makaveli said:


> yes in those area's it will show an improvement over phenom.



Gd gd


Makaveli said:


> The 8150k will indeed beat a 2500k in rendering but it will not beat a 2600k which is in the same price bracket.



In the UK the FX-8150's price falls in between both.

Ebuyer.com:
2600K - £240
FX-8150 - £210
2500K - £170

Anyways, if I render all day as a job, I'm running socket AM3 and I have limited money, and it's a choice between dropping in an FX-8150 or changing platforms specifically for a 2600K, most people would select the FX-8150 all day.

And to be fair, there are some situations where the FX-8150 does outperform the 2600K; transcoding comes to mind. Now if you transcode all day, Intel doesn't have a commercial solution.




Makaveli said:


> with a am3 board you don't really have much of a choice when upgrading..



Huh??? Socket AM3 has more CPUs on its support list than any current Intel socket.



Makaveli said:


> And that would be the only reason I would buy one because there is no other option.



Granted if coming from a Phenom II X6, yes you'll have little option.


----------



## trt740 (Jan 2, 2012)

Makaveli said:


> In this review it only performs on par do to GPU limitations at the higher res. If you look at the lower res it looses. Nehalem doesn't need a pass because it offers superior IPC to Bulldozer. When you start looking at benchmarks outside of games you will see this. I am a TPU user with a 920 and I classify it as a better performer than BD even more so at my overclocked speed.
> 
> And I agree people should buy what suits their needs and purchasing level. I didn't say its total rubbish but I think Bulldozer is still a flop because it doesn't offer performance much greater than phenom II or intel 3 year old Nehalem cpu's.
> 
> When you spend 4+ years designing a cpu and it doesn't beat your previous model this is bad no matter how you spin it. And yes it offers you an upgrade path but not a performance improvement so why bother upgrading?



The i7 is a hell of a chip, as are all the other current chips. And again, this is dumb: buy the cheapest CPU, OC the hell out of it, and use the extra money to buy the best performing GPU your system supports, whether it is AMD or Intel.


----------



## Makaveli (Jan 2, 2012)

Dent1 said:


> Gd gd
> 
> 
> In the UK the FX8150 price falls inbetween both.
> ...



I agree with all you have said, and sorry, when I said AM3 I was talking about just moving from a Phenom-level chip. You are correct, there are many more choices on the AMD side, as this is one of the areas where AMD is better than Intel for socket compatibility.

As for the transcoding part, I believe Bulldozer cannot touch any of Intel's 6 core chips, so Gulftown and SB-E. And yes, if you mention price they won't be comparable, but then again, if that's my job and a business expense, cost is not that big a factor.


----------



## cdawall (Jan 2, 2012)

Makaveli said:


> I agree with all you have said, and sorry when I said am3 I was talking about just moving from a phenom level chip you are correct there are many more choices on the AMD side as this has been one of the areas AMD is better than Intel for socket compatibility.
> 
> as for the transcoding part all of Intel 6 core chips I believe bulldozer cannot touch. So gulftown and sb-e and yes if you mention price they won't be comparable but then again if that's my job and a business expense cost is not that big a factor.



Depends; as a small business I would choose two towers with AMD processors vs one tower with a high-end Intel. Business write-off or not, you still have to have the up-front capital. 

As for the GPU comparison, an HD2900XT smoked a 3870, yet which was the better seller?

As for GPU-limited benching, I'm just saying: at $280 for a CPU, another $150-200 on a mobo and $500 on a GPU, I'm buying at least a pair of 1920x1080 monitors and running Eyefinity. At that point it makes no sense to choose any one CPU for its vast gaming improvements. When GPU limited, i7, i5, AMD, it doesn't matter; they will all push the same framerate.


----------



## Makaveli (Jan 2, 2012)

cdawall said:


> Depends. As a small business I would choose two towers with an AMD processor vs. one tower with a high-end Intel. Business write-off or not, you still have to have the up-front capital.
> 
> As for the GPU comparison, an HD 2900 XT smoked a 3870, yet which was the better seller?
> 
> As for GPU-limited benching, I'm just saying that if I'm spending $280 on a CPU, another $150-200 on a mobo, and $500 on a GPU, I'm buying at least a pair of 1920x1080 monitors and running Eyefinity. At that point it makes no sense to choose any one CPU for its vast gaming improvements. That would be GPU limited: i7, i5, AMD, it doesn't matter, they will all push the same framerate.



Yes it does depend.

If my business was rendering work for clients, time is money. So yes, the cost of Intel rigs would be higher, but the investment will also pay for itself sooner.

I don't remember the difference being as large as you state for it to be considered "smoking" it.
It's been a while since I looked at benchmarks, but I do own an Asus 3870 512 MB in my closet somewhere.

I'm someone that prefers high-end single GPUs to CrossFire/SLI due to the stuttering issues and drivers that come with that kind of setup.

But good point if going that route.


----------



## Mussels (Jan 2, 2012)

cdawall said:


> As for GPU-limited benching, I'm just saying that if I'm spending $280 on a CPU, another $150-200 on a mobo, and $500 on a GPU, I'm buying at least a pair of 1920x1080 monitors and running Eyefinity. At that point it makes no sense to choose any one CPU for its vast gaming improvements. That would be GPU limited: i7, i5, AMD, it doesn't matter, they will all push the same framerate.



that argument has never made sense.

yes, if you slap in more and more video cards, eventually you end up CPU limited... but that's it.

people always draw the 'logical' extension that 'so the CPU doesn't matter!', and that is wrong.

your CPU requirements matter just as much, or even more. and who's to say you can't lower one or two CPU settings to get higher GPU usage/FPS? every percent matters when you want to get the most out of your video cards.


----------



## cdawall (Jan 2, 2012)

Makaveli said:


> Yes it does depend.
> 
> If my business was rendering work for clients, time is money. So yes, the cost of Intel rigs would be higher, but the investment will also pay for itself sooner.



Rendering performance isn't 2x on a six-core SB-E, but it is double the cost. You can shave some money off of both with lesser boards, but most companies will pick an Intel mobo, so to be fair I picked a midrange AMD 990FX. With the entire PC price factored in, it's still only a $400ish difference. So for every two SB-Es you could pretty much build three BD rigs... I would get better turnaround running three "slower" rigs than two "faster" ones.

SB-e

I7 3930K $599
Intel X79 $279

*$878*


BD

FX8150 $269
Gigabyte 990FX $154
*$423*



Spoiler



rest of rendering rig

Geil 4x8GB $279
Powercolour 6770 $106
Seagate 1TB $104
Coolermaster Case+PSU $160

*$649*





Makaveli said:


> I don't remember the difference being as large as you state for it to be considered "smoking" it.
> It's been a while since I looked at benchmarks, but I do own an Asus 3870 512 MB in my closet somewhere.



It was only a couple of percent in all honesty; the stock-overclocked cards were better performers overall.


Makaveli said:


> I'm someone that prefers high-end single GPUs to CrossFire/SLI due to the stuttering issues and drivers that come with that kind of setup.
> 
> But good point if going that route.



I like whatever gives the best performance. The last setup I had was dual 4870 X2s, dual 4850 X2s before that, dual 3870 X2s before that, and quad 3850s before that, with NVIDIA stuff thrown about in the mix.



Mussels said:


> that argument has never made sense.
> 
> 
> 
> ...




I am saying: with currently available GPUs and high resolution, what difference does the CPU make? Less than 2 FPS? When new GPUs drop, new CPUs will be out, and as you can see with Intel's three-year-old i7 920, there is again no huge performance gain. As it sits, at a resolution I would play, there is next to no difference between a three-year-old i7, a three-year-old Phenom X4, or a brand-spanking-new SB. Current and last-generation high-end CPUs have more than enough performance for all current games; GPUs are still the limiting factor. Anything below 1080p is pretty much a useless resolution on a $2000 PC.


----------



## antuk15 (Jan 2, 2012)

Review is fail... Average frame rates only tell half the tale when it comes to CPU performance.

TPU should have included the minimum frame rates, as that would have shown a much wider gap between the three architectures used in the testing.


----------



## THE_EGG (Jan 3, 2012)

I am still surprised how well the Nehalem processors are holding up, especially given how old they are. I upgraded from my 920 at the start of last year to a 2600K, only because it seemed more fun and it ran much cooler.


----------



## Steevo (Jan 3, 2012)

antuk15 said:


> Review is fail... Average frame rates only tell half the tale when it comes to CPU performance.
> 
> TPU should have included the minimum frame rates, as that would have shown a much wider gap between the three architectures used in the testing.



Yeah, cause things like updates, slower hard drives, network congestion, and many other factors are always the same.

Wait, they aren't.

Minimum frame rates on a system are not indicative of overall performance. They are useful, but not as much as common sense. Common sense tells me that if it is less than 40 FPS, it's time to drop the settings, drop the resolution, or upgrade.


----------



## antuk15 (Jan 3, 2012)

Steevo said:


> Yeah, cause things like updates, slower hard drives, network congestion, and many other factors are always the same.
> 
> 
> 
> ...



Very, very rarely does anything you listed above cause low minimums.

And common sense should tell you that.


----------



## Mussels (Jan 3, 2012)

antuk15 said:


> Very, very rarely does anything you listed above cause low minimums.
> 
> And common sense should tell you that.



it takes just one stutter from a hard drive or SATA controller or background windows task to cause a min FPS dip. reviews that use minimums always have to manually look for such dips and remove them from their results, so you're still getting an average anyway: say, the bottom 10% of frames averaged, instead of the total.
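The "bottom 10% averaged" idea can be sketched in a few lines. This is an illustrative sketch with made-up frametimes, not the method any particular review site actually uses:

```python
def fps_metrics(frametimes_ms):
    """Summarize a FRAPS-style frametime log (milliseconds per frame)."""
    fps_per_frame = [1000.0 / t for t in frametimes_ms]
    avg_fps = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    raw_min_fps = min(fps_per_frame)  # a single stutter dominates this number
    # more robust: average the slowest 10% of frames instead of taking the single worst
    n = max(1, len(fps_per_frame) // 10)
    slowest = sorted(fps_per_frame)[:n]
    low10_fps = sum(slowest) / n
    return avg_fps, raw_min_fps, low10_fps

# an otherwise steady 60 fps run with a single 100 ms hitch
frametimes = [16.7] * 99 + [100.0]
avg, raw_min, low10 = fps_metrics(frametimes)
```

The single 100 ms hitch drags the raw minimum down to 10 FPS even though the run felt smooth, while the bottom-10% average stays near the true experience; this is exactly why raw minimums need the manual cleanup described above.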


----------



## antuk15 (Jan 3, 2012)

Mussels said:


> it takes just one stutter from a hard drive or SATA controller or background windows task to cause a min FPS dip. reviews that use minimums always have to manually look for such dips and remove them from their results, so you're still getting an average anyway: say, the bottom 10% of frames averaged, instead of the total.



If an HDD stutter causes a frame dip, then it's either a bad engine or you need more memory.

All games should be kept in local memory, so an HDD shouldn't affect it at all.


----------



## Mussels (Jan 3, 2012)




antuk15 said:


> If an HDD stutter causes a frame dip, then it's either a bad engine or you need more memory.
> 
> All games should be kept in local memory, so an HDD shouldn't affect it at all.



sorry, but you've clearly never dealt with this stuff before. short of running everything from a ramdrive, you're never going to prevent these kinds of problems.


----------



## cdawall (Jan 3, 2012)

Mussels said:


> sorry, but you've clearly never dealt with this stuff before. short of running everything from a ramdrive, you're never going to prevent these kinds of problems.



And even then memory controllers are not perfect. There will still be dips running a ram drive.


----------



## antuk15 (Jan 3, 2012)

Mussels said:


> sorry, but you've clearly never dealt with this stuff before. short of running everything from a ramdrive, you're never going to prevent these kinds of problems.



My drives are inactive when I game... and have never gone crazy and lowered my FPS.


----------



## Mussels (Jan 3, 2012)

antuk15 said:


> My drives are inactive when I game... and have never gone crazy and lowered my FPS.



you can't see the dips with the naked eye; they show up in FRAPS logs and such, hence ruining the minimum FPS benchmarks. if your drives are inactive, you must be playing small console ports. every game has a load screen, every OS has background tasks. this isn't worth arguing over, what i've stated is fact: minimum FPS is tricky to measure.


----------



## antuk15 (Jan 3, 2012)

Mussels said:


> you can't see the dips with the naked eye; they show up in FRAPS logs and such, hence ruining the minimum FPS benchmarks. if your drives are inactive, you must be playing small console ports. every game has a load screen, every OS has background tasks. this isn't worth arguing over, what i've stated is fact: minimum FPS is tricky to measure.



It's not tricky at all, and FRAPS is not 100% accurate and causes issues itself.

If you get to the point where HDD access is affecting your frame rate in a big enough way for you to notice it without using frame rate counters (again, not 100% accurate and with their own issues), then there is something wrong with your rig.


----------



## Mussels (Jan 3, 2012)

antuk15 said:


> It's not tricky at all, and FRAPS is not 100% accurate and causes issues itself.
> 
> If you get to the point where HDD access is affecting your frame rate in a big enough way for you to notice it without using frame rate counters (again, not 100% accurate and with their own issues), then there is something wrong with your rig.



this has nothing to do with the original discussion of why minimum FPS wasnt used in the reviews.


----------



## W1zzard (Jan 3, 2012)

antuk15 said:


> And fraps is not 100% accurate and causes issues itself



doesn't that mean "tricky" ? actually your statement confirms that.

how do you propose to reliably measure minimum fps? how to ensure decent accuracy? what resolution and accuracy for the measurement do you consider acceptable?

how is minimum fps defined? (just one frame? over one second?)
when does a frame start and end anyway? what about the time between frames?

everybody who shows minimum fps in their reviews uses fraps.
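To make the definition question concrete, here is a minimal sketch of two plausible definitions of "minimum fps". Both are illustrative assumptions on a made-up frametime log, not what FRAPS or any engine benchmark actually implements:

```python
def min_fps_single_frame(frametimes_ms):
    # definition 1: the longest single frame, expressed as an fps figure
    return 1000.0 / max(frametimes_ms)

def min_fps_worst_second(frametimes_ms):
    # definition 2: the lowest fps over any window spanning at least one second
    worst = float("inf")
    for start in range(len(frametimes_ms)):
        elapsed, frames = 0.0, 0
        for t in frametimes_ms[start:]:
            elapsed += t
            frames += 1
            if elapsed >= 1000.0:
                worst = min(worst, frames * 1000.0 / elapsed)
                break
    return worst

# a steady 60 fps run interrupted by one 100 ms frame
frametimes = [16.7] * 60 + [100.0] + [16.7] * 60
```

On this log the single-frame definition reports 10 FPS while the worst-second definition reports roughly 55 FPS, so the two "minimums" disagree by a factor of five on the exact same data.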


----------



## cdawall (Jan 3, 2012)

W1zzard said:


> doesn't that mean "tricky" ? actually your statement confirms that.
> 
> how do you propose to reliably measure minimum fps? how to ensure decent accuracy? what resolution and accuracy for the measurement do you consider acceptable?
> 
> ...



Haha, I read that post as "go play in traffic, antuk15". You know what, W1zzard probably has no idea what he is talking about; I mean, he only runs a forum with 65 thousand members and 2.4 million posts, not to mention some of the most in-depth reviews on the internet. :shadedshu


----------



## W1zzard (Jan 3, 2012)

cdawall said:


> You know what, W1zzard probably has no idea what he is talking about; I mean, he only runs a forum with 65 thousand members and 2.4 million posts, not to mention some of the most in-depth reviews on the internet.



while i appreciate your comment, i disagree. i dont run a religious outfit, so feel free to ask questions and criticize. we're all here to learn


----------



## cdawall (Jan 3, 2012)

W1zzard said:


> while i appreciate your comment, i disagree. i dont run a religious outfit, so feel free to ask questions and criticize. we're all here to learn



Very true. Just out of curiosity why do you still test at such low resolutions? Just to get a comparison with some of the lower end cards or is there a deeper reasoning?


----------



## W1zzard (Jan 3, 2012)

cdawall said:


> Very true. Just out of curiosity why do you still test at such low resolutions? Just to get a comparison with some of the lower end cards or is there a deeper reasoning?



comparison data for low end cards.

in theory i could leave out the graphs of the lower resolutions for high-end cards. i'd still have to bench them for comparison. but some readers might be interested in the low-res graphs to look at them for advanced concepts like cpu dependency, resolution scaling etc.

the majority of readers should have no issues skipping over a few graphs on each page


----------



## Tatty_One (Jan 3, 2012)

THE_EGG said:


> I am still surprised how well the Nehalem processors are holding up, especially given how old they are. I upgraded from my 920 at the start of last year to a 2600K, only because it seemed more fun and it ran much cooler.



Ditto^^^ It makes me feel kind of good, even though I have had this chip for three-odd years... and as most of us here overclock, I would guess that if all the CPUs on test were cranked up to BD's stock clocks, the results would have been even more interesting.


----------



## Makaveli (Jan 3, 2012)

W1zzard said:


> comparison data for low end cards.
> 
> in theory i could leave out the graphs of the lower resolutions for high-end cards. i'd still have to bench them for comparison. but some readers might be interested in the low-res graphs to look at them for advanced concepts like cpu dependency, resolution scaling etc.
> 
> the majority of readers should have no issues skipping over a few graphs on each page



Hey Wiz, have you seen Tech Report's new approach to graphing FPS? If so, what do you think about it?


----------



## antuk15 (Jan 3, 2012)

W1zzard said:


> doesn't that mean "tricky" ? actually your statement confirms that.
> 
> how do you propose to reliably measure minimum fps? how to ensure decent accuracy? what resolution and accuracy for the measurement do you consider acceptable?
> 
> ...



Crysis
Crysis 2
Stalker series
Far Cry 2
Cryostasis
Batman: AA
Metro 2033

There are HUNDREDS of games that have built-in benchmark programs that measure minimum FPS without the need for FRAPS or any other program.

A few other websites actually provide minimum frame rates, but if you look closely it's only for games that have built-in benchmarks that show and record them. These built-in benchmarks are much more consistent than FRAPS, and you should know that.

But yes, you're right, they all use FRAPS


----------



## OOZMAN (Jan 3, 2012)

antuk15 said:


> Crysis
> Crysis 2
> Stalker series
> Far Cry 2
> ...



Hey man, fraps is awesome.


----------



## W1zzard (Jan 3, 2012)

antuk15 said:


> Crysis
> Crysis 2
> Stalker series
> Far Cry 2
> ...



you want me to bench with those games only? almost all new titles don't have benchmarking functionality



> These built-in benchmarks are much more consistent than FRAPS, and you should know that.



how do you define consistent? and what's your reference value to compare to? part of the issue is what i mentioned further above, for which you apparently have no answers. another problem is that time measurements are quite difficult to do on windows. i'd expect fraps to do better in that department than most engine benchmarking code



Makaveli said:


> Hey Wiz, have you seen Tech Report's new approach to graphing FPS? If so, what do you think about it?


just looked at it. good to see someone trying new things.
http://techreport.com/articles.x/22192/11






personally i think fps graphs are too complicated for many readers and offer little additional insight.
not sure why tr graphs their data the way they do, but frame number on the x axis seems like a bad choice; you want to put time on the x axis. look how each of their graphs has a different number of frames for its own run.
frametimes on y is also counterintuitive to what most readers expect, especially when the values are in the 20-100 range, where people instantly think fps

the use of 99th percentile frametime makes no sense to me (yes, i know what 99th percentile is). most people will look at that graph with the big scientific name, skip it, and be impressed by it

time spent beyond 50 ms: good idea, bad naming, i thought fps again
so each of their benchmark runs lasts a different duration. then they add up how long the frametimes were 50+ ms (for a different number of frames in each run) and compare these values by putting them in a graph. so they compare a shorter maximum time with a longer time?

edit: so i found their article explaining the changes: http://techreport.com/articles.x/21516
good read. their choices make more sense now. i need to think more about it, but that alone is a problem: review readers don't want to read an instruction manual for the review


----------



## someone (Jan 4, 2012)

Hi there 

@techreport
I generally like their ideas very much. As a former user of a multi-GPU setup, I know particularly well that FPS don't tell the whole story. In fact, they don't even tell half the story sometimes. I remember playing Crysis at 70-80 FPS (vsync off) which stuttered big time and felt worse than 40 FPS without micro-stuttering.

As per the metrics, they should definitely *normalize* everything; that would eliminate the problem of comparing runs of different lengths (or different numbers of frame times). Instead of counting the number of frame times larger than a given threshold, they would then report the proportion that is larger than 50 ms. I find this transformation quite standard and straightforward, and it's a clear improvement IMO.
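The proposed normalization can be sketched as follows. The 50 ms threshold is Tech Report's; the two toy frametime logs are made up purely for illustration:

```python
def time_beyond_ms(frametimes_ms, threshold=50.0):
    # raw TechReport-style metric: total time spent inside frames longer than the threshold
    return sum(t for t in frametimes_ms if t > threshold)

def fraction_beyond(frametimes_ms, threshold=50.0):
    # normalized variant: share of the run's total duration spent in such frames,
    # which stays comparable across runs of different length
    return time_beyond_ms(frametimes_ms, threshold) / sum(frametimes_ms)

short_run = [16.7] * 100 + [80.0]         # ~1.75 s run, one 80 ms frame
long_run  = [16.7] * 200 + [80.0, 80.0]   # ~3.5 s run, two 80 ms frames
```

The raw metric doubles for the longer run (160 ms vs. 80 ms) even though both runs stutter at exactly the same rate, while the normalized fraction is identical for both, which is the point of the proposal.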



@BD vs. SB vs. NH
The results are pretty much as expected (sadly for AMD, one should note).

Here are two interesting things in the results:

1. In Skyrim, going from 1024 to 1280 keeps the FPS constant, and the same holds for 1680 to 1920.
But there's a difference between the upper resolutions and the lower ones:




(charts not shown: black arrow = no difference, red arrow = difference)


2. In Starcraft, on the other hand, there are neither "horizontal" nor "vertical" changes, or at least the upper resolutions are more similar to the lower ones than in our example before (Skyrim):






I suspect the aspect ratio makes the difference, since the upper resolutions are 4:3 and 5:4 whilst the lower ones are 16:10 and 16:9 respectively. The widescreen aspects require more rendering in the horizontal than in the vertical, compared to the 4:3 (and 5:4) ratios.


The explanation for why there is a difference in Skyrim and (almost) none in Starcraft is then that:
- there's not so much going on in the vertical in Skyrim: floor texture and sky texture, of which especially the latter is very simple to compute for the CPU. So, when you add more horizontal pixels, I would expect a much larger amount of CPU computation than when adding vertical ones (just paint some more sky and floor, to oversimplify).

- in Starcraft, since it's a top-down view, adding horizontal pixels and adding vertical ones should make (almost) no difference. It's far less asymmetric than Skyrim, since more terrain, buildings, and units will be computed regardless of the direction in which the image is expanded.
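A quick sanity check on the four test resolutions backs up the aspect-ratio pairing. This is my own arithmetic, with 1920x1080 assumed for the 16:9 step:

```python
from math import gcd

# the four resolutions discussed above (1920x1080 assumed for the widescreen step)
resolutions = [(1024, 768), (1280, 1024), (1680, 1050), (1920, 1080)]

info = []
for w, h in resolutions:
    g = gcd(w, h)  # reduce width:height to its simplest aspect ratio
    info.append((f"{w}x{h}", f"{w // g}:{h // g}", w * h))

# 1680x1050 reduces to 8:5, i.e. 16:10; 1280x1024 has ~67% more pixels than
# 1024x768, so constant fps across that step really does indicate a CPU limit.
```

Despite the 67% jump in pixel count from 1024x768 to 1280x1024, the FPS staying flat is the signature of a CPU-bound scene rather than a GPU-bound one.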


----------



## HTC (Jan 4, 2012)

W1zzard said:


> you want me to bench with those games only? almost all new titles dont have benchmarking functionality
> 
> 
> 
> ...



Interesting read; I hadn't seen that approach before.

If I may make a suggestion: instead of using seconds, why not use the next value down the line, i.e. tenths of seconds? Wouldn't that catch more of the issues described in the Tech Report article?

Of course, I'm assuming there's a tool that can measure this because, otherwise, there's no point in even trying.


----------



## MilesRdz (Jan 8, 2012)

A lot of people seem to not notice or care that Bulldozer is vastly underutilized in most of these games.
While SB is using half or more of its resources, BD is using about a quarter.

If games used more threads to feed the graphics card, we wouldn't be discussing this issue to death.

Some people could argue that if BD had better single-threaded performance, games would run better.
That is true, but it doesn't change the fact that BD is underutilized.


----------



## Mussels (Jan 8, 2012)

MilesRdz said:


> A lot of people seem to not notice or care that Bulldozer is vastly underutilized in most of these games.
> While SB is using half or more of its resources, BD is using about a quarter.
> 
> If games used more threads to feed the graphics card, we wouldn't be discussing this issue to death.
> ...



and that's why people still have hope for the two-patch solution MS is working on for Windows 7 and BD.


----------



## ensabrenoir (Jan 8, 2012)

Mussels said:


> and that's why people still have hope for the two-patch solution MS is working on for Windows 7 and BD.



Yeeeahhh but ain't

1/4  of 8 =2

1/2 of 4 =2

Sooo.....its  pretty much a fair match up ratio

Wait.....no it aint....sorry my math was off


----------



## RejZoR (Jan 9, 2012)

Good to see my "outdated" Core i7 920 is not old junk just yet...


----------



## TheGuruStud (Feb 4, 2012)

Mussels said:


> and that's why people still have hope for the two-patch solution MS is working on for Windows 7 and BD.



That doesn't fix apps only using a couple of threads, though.

And we all know that since M$ released the patch, it doesn't do anything at all. We will have to wait for Win8 for any small improvement.



MilesRdz said:


> A lot of people seem to not notice or care that Bulldozer is vastly underutilized in most of these games.
> While SB is using half or more of its resources, BD is using about a quarter.
> 
> If games used more threads to feed the graphics card, we wouldn't be discussing this issue to death.



It wouldn't be discussed at all, b/c Intel would lose in every game. That's how the game works, don't you know?

A lot of sites (paid by you-know-who) won't bench apps favorable to AMD b/c they're paid not to. We end up with reviews that massively one-side a situation regardless of real performance.
I remember Athlon 64s losing to Pentium 4s back in the day in benches... now, I wonder how that was possible LOL


----------



## BlackOmega (Feb 4, 2012)

Where do people get off saying BD is fail? I looked through every page and, more often than not, it was beating Nehalem, and sometimes even beating the 2500K.

While sure, it's not superdooperawesome like a lot of people were hoping for, it's far from fail IMO.

EDIT: W1zzard, how come you guys don't test in 1080p? I'd dare say that's probably THE most common resolution used these days, yet it's always omitted in tests.


----------



## Aquinus (Feb 4, 2012)

BlackOmega said:


> Where do people get off saying BD is fail? I looked through every page and, more often than not, it was beating Nehalem, and sometimes even beating the 2500K.
> 
> While sure, it's not superdooperawesome like a lot of people were hoping for, it's far from fail IMO.
> 
> EDIT: W1zzard, how come you guys don't test in 1080p? I'd dare say that's probably THE most common resolution used these days, yet it's always omitted in tests.



That is because at higher resolutions, your bottleneck is almost always your video card. Bulldozer didn't fail; it just had a lot of hype for something that was decent at best in comparison to the 1100T and 1090T. Bulldozer does well enough where it has to and shines when SMP really matters. It's the first step towards something better. There aren't a whole lot of applications that use a lot of SMP, but there very well could be in the future.

This review kind of puts everything into perspective imho.
http://guru3d.com/article/radeon-hd-7970-cpu-scaling-performance-review

In all realism, BD isn't that bad. Keep in mind that a lot of these titles don't use a lot of cores yet, so there is a lot of horsepower BD still has waiting to be used. Nothing is stopping someone from transcoding video while playing a game without a problem. *That* is what AMD is trying to do.

Albeit Intel has better IPC, but that is only because SB has a shorter pipeline than BD. BD has some obstacles to overcome, but all in all it is more space friendly, so you can cram more cores into the same amount of die space.


----------



## BlackOmega (Feb 4, 2012)

Aquinus said:


> That is because at higher resolutions, your bottleneck is almost always your video card.


I know that, but thanks anyway. The reason I'm asking is: why would you review at a resolution that 7% use, as opposed to the resolution that is THE most commonly used, 1920x1080 (25%), which is followed by 1680x1050 at 17%? (source)


Aquinus said:


> Bulldozer didn't fail; it just had a lot of hype for something that was decent at best in comparison to the 1100T and 1090T. Bulldozer does well enough where it has to and shines when SMP really matters. It's the first step towards something better. There aren't a whole lot of applications that use a lot of SMP, but there very well could be in the future.
> 
> This review kind of puts everything into perspective imho.
> http://guru3d.com/article/radeon-hd-7970-cpu-scaling-performance-review
> ...


You're absolutely right, it didn't, but all of these Intel fanbois would lead people to believe that it's slower than a Socket 939 single core. When in reality, it's their fastest CPU to date. Regardless of whether the software can utilize it or not.


----------



## cadaveca (Feb 4, 2012)

BlackOmega said:


> Regardless of whether the software can utilize it or not.



For most people, though, this is what's most important.

The question gets asked: "Will this make what I do now faster?"

And the answer, of course, for most is "not really".

The hype let people down, but of course it's not a bad chip... it's just not "the best" either. I think many more people would be happier if the 8150's price matched the 2500K's, but it doesn't.

I've recommended to many Phenom II quad users that they upgrade to the 8150, and some have. Not one has been disappointed in the change.


----------



## ensabrenoir (Feb 5, 2012)

BlackOmega said:


> I know that, but thanks anyway. The reason I'm asking is: why would you review at a resolution that 7% use, as opposed to the resolution that is THE most commonly used, 1920x1080 (25%), which is followed by 1680x1050 at 17%? (source)
> 
> You're absolutely right, it didn't, but all of these Intel fanbois would lead people to believe that it's slower than a Socket 939 single core. When in reality, it's their fastest CPU to date. Regardless of whether the software can utilize it or not.



Aaaaahhhhhhhh, that's like having the fastest speed boat... in the middle of the desert. JUST ENJOY THE DARN THING!!!!!!!! There is no mighty morphing Power Ranger upgrade fix for it. It's not the Intel killer. By the time software catches up to it, something 10 times better will be available... I still use my MiniDisc and LaserDisc players. Why? Because I paid for them and they make me happy... get it? Apply.


----------



## NdMk2o1o (Feb 5, 2012)

Don't really see what all the fuss is about with BD; it compares and competes directly with Intel Nehalem, which before SB was the target. I would be happy with BD if I had one; it just so happens I have SB instead, lol. But seriously, it's still a half-decent chip, and if anything, with AMD the second revision will always be stronger, as they improve upon the first build. Nice review W1zz, thanks as always!!


----------



## Aquinus (Feb 7, 2012)

You also have to realize that single-threaded workloads aren't Bulldozer's strong suit. Do some media encoding with the 8150 and it will give any 1155 CPU (at the moment) a run for its money, and in some cases it gets close to 990X performance when it comes to media. If I look at framerates for any game and see them practically at 50-60 FPS, I wouldn't complain. Also, SC2 isn't as dependent on IPC as it is on memory bandwidth, which is what Intel's chips currently excel at. Just keep in mind that a properly tuned Bulldozer can crank out some impressive numbers.

Also, rumor has it that the next version of Bulldozer, "Enhanced Bulldozer," may have a quad-channel memory controller while still using AM3+. That could be a good selling point, as AMD processors are reasonable to replace without having to change all of your hardware. Intel's IPC is much nicer than AMD's, but AMD has something going for it because Bulldozer is a very scalable architecture. Once AMD trims off the fat, reduces the length of the pipeline, and gets its memory controller up to snuff, it will do better in single-threaded applications, and there will be more cores at the same time.

The future isn't single-threaded applications; just keep that in mind. Remember where we were 10 years ago, and 10 years before that.


----------



## Prima.Vera (Feb 14, 2012)

Damn. I have a four-year-old Core 2 Quad Q9650, which is on par with the i7-920, and it still beats the crap out of the Shitdozer. Shame AMD, shame!


----------



## Mussels (Feb 14, 2012)

Prima.Vera said:


> Damn. I have a four-year-old Core 2 Quad Q9650, which is on par with the i7-920, and it still beats the crap out of the Shitdozer. Shame AMD, shame!



show me where you got these numbers from, please


----------



## InnocentCriminal (Feb 14, 2012)

Yeah, I'll second that. I'd be interested to see how it compares with my Q9550.


----------



## Tatty_One (Feb 14, 2012)

He probably means that an overclocked Q9650 @ 4.5 GHz is near an overclocked i7 920 @ 4.2-4.3 GHz. I went to a 920 from a 9650 and found a decent increase in performance across the board, and that's with my old 9650 @ 4.5 GHz.


----------



## jaredpace (Feb 14, 2012)

Mussels said:


> show me where you got these numbers from, please


----------



## Tatty_One (Feb 14, 2012)

Can't read/see any of them lol... maybe it's just me, old eyes and that!


----------



## jaredpace (Feb 14, 2012)

Tatty_One said:


> Cant read/see any of them lol..... maybe just me, old eyes and that!



oh shoot, here ya go
http://www.hardware.fr/articles/842-20/jeux-3d-crysis-2-arma-ii-oa.html


----------



## ZenZimZaliben (Feb 14, 2012)

Great review. Makes me happy to see my i7-930 is still a great performer at the resolution I run, which is 2560x1600. Almost no difference, at least not enough to force me to upgrade. Here's to holding out for i9.


----------



## abundant threading (Feb 24, 2012)

*Thanks*

Nice review, thanks, but there is one thing I don't understand.
The FX matched the 2500K in the highest test; that's the FX bottlenecking the GPU more than the 2500K, OK, fine...

But if you look here: http://amdfx.blogspot.com/2012/02/dirt-3-revisited-again-by-request.html

They have set the AA to 8x, where here it is 4x.

The FPS are about a continuation of what they are in the test here, and suddenly, clock for clock, the FX overtakes the 2500K by 20% (avg FPS).

What is going on with that?


----------



## Outback Bronze (Feb 26, 2012)

It's actually very good to see the i7 920 still holding up with the big guns. It deserves credit. How old is it now, and what speed is it running at! I'm going to thank W1zzard for this one. Very informative review.


----------



## xenocide (Feb 26, 2012)

abundant threading said:


> Nice review, thanks, but there is one thing I don't understand.
> The FX matched the 2500K in the highest test; that's the FX bottlenecking the GPU more than the 2500K, OK, fine...
> 
> But if you look here http://amdfx.blogspot.com/2012/02/dirt-3-revisited-again-by-request.html
> ...



And where were the i5-2500K numbers pulled from? They clearly weren't taken from this actual review, which baselined using a 7970. Those appear to be from some Tom's Hardware review, judging by the color scheme and graph setup. Let's see how W1zzard's latest review ranks an i7-920 @ 3.8 GHz with a 6990:

If Nehalem won there, I assure you SB would have done even better. You cannot compare data from multiple data sets, since everyone uses different setups. That post is clearly biased, since it's from, you know, AMDFX.blogspot.


----------



## erocker (Feb 26, 2012)

abundant threading said:


> Nice review, thanks, but there is one thing I don't understand.
> The FX matched the 2500K in the highest test; that's the FX bottlenecking the GPU more than the 2500K, OK, fine...
> 
> But if you look here http://amdfx.blogspot.com/2012/02/dirt-3-revisited-again-by-request.html
> ...



I'd like to see another person do this benchmark with an 8150. I find it hard to believe.


----------



## Aquinus (Feb 26, 2012)

erocker said:


> I'd like to see another person do this benchmark with an 8150. I find it hard to believe.



You're going to notice a difference when you start adjusting image quality. When you start jacking up AA you're taxing the GPU more than the CPU... but if you have a nice video card, that might be exactly what you want. Once again I will reference Guru3d's CPU scaling review with the Radeon HD 7970: for heavily multi-threaded games such as Crysis 2 and Battlefield 3, you will see minimal improvement from upgrading the CPU beyond a Phenom II quad-core (a modern one, that is, not an AM2+ like mine; not to say my 940 doesn't handle Crysis 2 fairly well at full graphics with a 6870).

Seriously though, it's a review worth checking out if you haven't already.
http://www.guru3d.com/article/radeon-hd-7970-cpu-scaling-performance-review/

Bulldozer isn't bad, it's just not as good as Sandy Bridge. The sooner everyone realizes that, the less flaming there will be. Also, you can only improve a processor's IPC so much before you hit a limit. As software develops post-2012, you will notice a lot more of it using more threads, and for Bulldozer, an architecture that scales almost linearly with the number of cores (unlike Intel's HyperThreading), more cores will mean more speed in the future. Similar to video encoding on the FX-8150, where it keeps up with the 2600K very well.

Reference for video encoding: http://www.guru3d.com/article/amd-fx-8150-processor-review/14
(Yes, I know that the FX-8150 doesn't do as well on other benchmarks, but this is a practical application where the FX-8150 is more than adequate when all cores are being used.)
Also keep in mind that when you're playing a game that can't use all 8 cores, you have that extra power to do other things. Virtual machines? Folding? Video encoding? Yeah, it can do that. :|


----------



## Aquinus (Feb 26, 2012)

Since no one really wants to make a comparison, I grabbed CPU benchmark numbers from Guru3d and plugged them into Excel. I calculated the performance of the FX-8150 vs. the i7-2600K and i5-2500K.

If you take a look at the attached document, you will see Bulldozer (at stock speeds) excels over the i7-2600K in the following applications:

FlyRender
Espresso Transcode
Video Transcoding (H.264 (DTS5.1) to x.264 AC3 5.1)
CPU Hash Bench using SHA1
Sandra Memory Index (matches the i7)

When you overclock Bulldozer to 4.6GHz, it takes the lead in the following applications:
SiSoft Sandra FPU GFLOPS
Zlib Compression
VP8 Video Compression

Notice a trend? All the heavily multi-threaded applications seem to chug right on through on a Bulldozer. It only suffers in severely single-threaded applications, and even there it runs adequately fast anyway; if you're getting a BD, it is not for single-threaded performance, that is for sure.

Also keep in mind the price of the 2600K versus the FX-8150: the FX costs roughly 25% less, so anywhere it finishes less than 25% behind the i7-2600K you are still getting what you paid for. Bulldozer delivers very well for the price it is set at, and any instance where the 8150 overtakes the 2600K is simply great value on top.

You can all bash Bulldozer as much as you want, but numbers don't lie. For a brand-new, first-revision platform, it does pretty damn well.

Information provided is courtesy of Guru3d.
http://www.guru3d.com/article/amd-fx-8150-processor-review/1

Edit: I would also like to add that all of these benchmarks use DDR3-1600, not the FX-8150's native 1866. Guru3d also didn't try overclocking the 8150's base clock, which leaves memory-controller and cache-latency gains on the table.
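The spreadsheet comparison above boils down to simple ratio math. Here is a minimal Python sketch of that calculation; the scores below are made-up placeholders, NOT Guru3d's actual numbers:

```python
# Sketch of the spreadsheet math: relative performance of one CPU versus
# another from raw benchmark results. Placeholder numbers only.

def relative_perf(score_a: float, score_b: float, higher_is_better: bool = True) -> float:
    """Return how CPU A compares to CPU B as a percentage (100 = parity)."""
    if higher_is_better:
        return 100.0 * score_a / score_b
    # For timings (encode seconds, render seconds), lower is better.
    return 100.0 * score_b / score_a

# Hypothetical example: an encode time in seconds (lower is better).
fx_8150_time, i7_2600k_time = 120.0, 110.0
print(round(relative_perf(fx_8150_time, i7_2600k_time, higher_is_better=False), 1))
# → 91.7  (FX at ~92% of the 2600K's speed in this made-up case)
```

With real scores pasted in, the same function reproduces the "X% slower/faster" figures being argued over in this thread.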


----------



## abundant threading (Feb 26, 2012)

Aquinus said:


> You're going to notice difference when you start adjusting image quality. When you start jacking up AA you're taxing the GPU more than the CPU... but if you have a nice video card, that might be exactly what you want. Once again I will reference Guru3d's CPU scaling review with the Radeon HD 7970, and for heavily multi-threaded games such as Crysis 2 and Battlefield 3, you will see minimal improvement by upgrading the CPU beyond Phenom II quad-core (modern that is, not an AM2+ like mine, not to say my 940 doesn't handle Crysis 2 fairly well at full graphics with a 6870.)
> 
> Seriously though, it's a review worth checking out if you haven't already.
> http://www.guru3d.com/article/radeon-hd-7970-cpu-scaling-performance-review/
> ...



Yes, and may I suggest that the CPU's workload is not affected by screen resolution? So a low resolution like 1024x768 basically shows the maximum speed the CPU is capable of running that game at. Hence the consistent 120 FPS: the FX is simply topping out at the rate it can prepare frames, like the refresh rate of your monitor.

Crank the resolution up and the GPU needs to do more and more to keep up; enable sufficiently demanding anti-aliasing and anisotropic filtering and the GPU will no longer be able to keep up with the CPU. Just because this test shows the 2500K running 200 FPS compared with 120 FPS for the FX is not to say the 2500K is the more powerful CPU for THIS game.

If you look at this test more closely you can see where the ceiling of the 2500K and the ceiling of the FX meet, or not... at the very least they match in the highest test, if not the FX starting to overtake; it is computing more FPS in the last test (albeit very slightly). The FX's FPS has barely moved from where it was at low resolution, while the 2500K's has dropped by about 50%.
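The ceiling argument above can be sketched as a simple min() model, assuming the observed frame rate is capped by whichever of the CPU or GPU is slower; all numbers here are illustrative, not measurements:

```python
# Toy bottleneck model: observed FPS = min(CPU frame-rate ceiling, GPU
# frame rate at the chosen resolution). Numbers are illustrative only.

def observed_fps(cpu_fps_cap: float, gpu_fps_at_res: float) -> float:
    return min(cpu_fps_cap, gpu_fps_at_res)

fx_cap, sb_cap = 120.0, 200.0  # hypothetical CPU ceilings for one game
for res, gpu_fps in [("1024x768", 400.0), ("1920x1200", 150.0), ("2560x1600", 90.0)]:
    print(res, observed_fps(fx_cap, gpu_fps), observed_fps(sb_cap, gpu_fps))
# At the highest resolution both CPUs report the same FPS:
# the GPU ceiling hides the CPU gap.
```

This is why low-resolution runs expose the CPU difference while high-resolution, high-AA runs converge on the GPU's limit.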

If you look at another review site's (mentioning no names here) test of BF3, it shows an FX-4100 matching a 2500K and, as near as makes no difference, an FX-8150 matching a 2600K, with an i3-2100 not far behind, at high graphics settings/resolution.

I would not write the FX blog off; had a higher-resolution test been published here, we MIGHT be having a very different conversation.

@Aquinus, I have a 6-core Thuban; clock for clock it beats a 2500K in multi-threaded benchmarks, and an FX-8150 beats my Thuban in the same benchmarks. Compare them on a Linux OS, where thread allocation is far more efficient than in Windows, and AMD monsters Intel.

Cray is using 16-thread Bulldozer CPUs on a Linux-based OS for their new supercomputer.


----------



## BarbaricSoul (Feb 26, 2012)

Guys, erocker has a FX8150 chip and a 7970 gfx card. He has a good idea of what he's talking about.


----------



## Aquinus (Feb 26, 2012)

abundant threading said:


> Yes, and can i suggest the CPU's workload is not affected by screen resolution? so at a low resolution like 1024x768 is basically the maximum speed the CPU is capable of running that game at. Hence the consistent 120 FPS, the FX is simply topping out at the speed it can render the image at, like the refresh rate of your monitor.
> 
> Crank the resolution up, the GPU needs to do more, and more to keep up, enable sufficiently demanding anti-aliasing and anisotropic filtering the GPU will no longer be able to keep up with the CPU, just because this test shows the 2500K running 200 FPS compared with 120 FPS with the FX is not to say the 2500K is the more powerful CPU for THIS game.
> 
> ...



BF3 is very CPU-efficient in general; even an Athlon X4 handles BF3 reasonably at higher resolutions. 

(For those of you who don't know, at higher resolutions there is more stress on the GPU, not the CPU.)

You can't compare the Linux kernel and Windows NT (yes, ever since Windows NT, all versions of Windows have been based on the NT kernel), because they're completely different animals with different goals in mind... also, the average user won't care about *nix performance on their platform... unlike us.



BarbaricSoul said:


> Guys, erocker has a FX8150 chip and a 7970 gfx card. He has a good idea of what he's talking about.



Then let's see some numbers! 
Edit: Only if you have time; I don't want to be pushy, because I know how valuable time is and how long benchmarks can take.


----------



## abundant threading (Feb 26, 2012)

Yes, I know; as I said in my post, an i3 doesn't do too badly either... although it is the one whose ability to keep up has faltered. An i3-2100 is an i5-2500 with half as many cores... and a 45nm Athlon is basically a Phenom without L3 (what do you make of that?).

For me, this test simply doesn't tell us what's better for this game. There comes a point where the CPU can no longer keep up, or we could run these games on Pentium IIs; this test does not show whether the 2500K or the FX gives up first. It shows they are both as capable as each other at the settings and resolution this game will actually be played at. The 2500K is not the winner of this Dirt 3 or BF3 test as it stands; more data is needed.


----------



## cadaveca (Feb 26, 2012)

abundant threading said:


> test as it stands, more data is needed.



I'm not sure why, though? I mean, it's a given that the 8 threads of the 8150 will, in some instances, beat out the four threads of the 2500K.

But the 2500K costs less than the 8150, and in gaming, titles that benefit from the extra threads are few and far between.

The FX chips aren't bad... they are just badly priced for the parts they compete with. When most people are concerned about cost, that completely kills the FX's chance of success on a wide scale. That doesn't mean it's a failure... it's just not going to be the popular choice.

A few benchmarks aren't going to change that.


----------



## abundant threading (Feb 26, 2012)

cadaveca said:


> I'm not sure why though? I mean, it's a given that the 8 threads of the 8150, in some instances, will beat out the  four threads of the 2500k.
> 
> But the 2500K costs less than the 8150, and in gaming, the actual number of titles that benefit from the extra threads are few and far between.
> 
> ...



Um... well, the FX-8120 costs the same as the 2500K and is the same CPU at a lower clock speed; the FX-8120 is a down-clocked FX-8150. I would not buy an FX-8150; I would get the 8120, then boost the multiplier to the same level as the 8150, and there you have it... an 8150.


----------



## cadaveca (Feb 26, 2012)

abundant threading said:


> Um... well, the FX-8120 costs the same as the 2500K and is the same CPU with a lesser clock speed, the FX-8120 is a down clocked FX-8150. i would not buy an FX-8150 i would get the 8120 and then boost the multiplier to the same level as the 8150 and there you have it... an 8150



Great; unfortunately, overclocking voids your warranty, and that's enough for people, average people, to not bother. That makes the point a bit moot.

When electronic retailers sell "upgrade" warranties for considerable value, consumers on a whole put real monetary value on their warranties, and having made just one successful RMA, really bolsters that value.

Enthusiasts know the real story already. However, stock performance of an 8150 isn't what enthusiasts are after. They are after overclocked performance, and on Intel you can purchase a warranty that covers your OC, never mind that the cost of cooling that OC'ed system is much lower thanks to Intel's lower power consumption. So the average consumer and enthusiast alike are more likely to OC with Intel, provided salespeople do their jobs properly.


Either way, you can go with either camp and have a fairly decent system, each with it's own strength and weaknesses. With ALL things considered, to me, Intel is better with current generation stuff. For the same power on the high-end, you get x79, not a 2500k, so for me, that's where the real comparison is, and at that, Intel to me still offers more.

After all, A75 is AMD's entry platform right now, while 1155 is Intel's entry. Both come with onboard GPUs. AMD is more cost-effective, but lacks Intel's CPU processing grunt. X79 and FX... well, you get the picture.


----------



## abundant threading (Feb 26, 2012)

Oh, no argument from me; the FX is a disappointment to me, but I don't think it's as bad as some like to portray it, as you say in a roundabout way.

AMD have taken a big gamble here designing a completely new architecture. It's not paying dividends yet, but it is new and may pay off in the longer term; Intel had the same problems with P4, you may remember?

A more interesting discussion is the future Piledriver chips, which will use a resonant clock mesh on top of that new architecture.

That's exciting stuff, but perhaps not for this thread.


----------



## Aquinus (Feb 27, 2012)

cadaveca said:


> Great, unfortunately, overclocking voids your warranty, and that's enough for people, average people, to not bother. That makes that point a bit invalid.



You of all people should know that you can't detect whether a chip has been overclocked when it gets returned. It's an empty threat the CPU vendors can't enforce, a scare tactic to prevent people from returning fried CPUs.

As for the resonant clock mesh, I don't think it will lower power consumption as much as Intel's 3D tri-gate transistors. Like I said before, it is a new architecture, and for the die space one module takes, you can cram more cores into the same area. The future is multi-threading.

This really has run on, but I will say this: Intel has the CPU market figured out for now. AMD has the CPU market figured out for down the road. It really is as simple as that.

(Bulldozer makes a very nice server chip, and for workstations, dual 8-core Valencias at 3.3GHz are just plain sexy and cost-effective.)


----------



## abundant threading (Feb 27, 2012)

Aquinus said:


> As for the resonant clock mesh, I don't think it will lower power consumption as much as Intel's 3D tri-gate transistors. Like I said before, it is a new architecture, and for the die space one module takes, you can cram more cores into the same area. The future is multi-threading.
> 
> This really has run on, but I will say this: Intel has the CPU market figured out for now. AMD has the CPU market figured out for down the road. It really is as simple as that.
> 
> (Bulldozer makes a very nice server chip, and for workstations, dual 8-core Valencias at 3.3GHz are just plain sexy and cost-effective.)



I'm not talking about any sort of comparison; not everything needs to compare with Intel. 

PD will start at clock speeds around 4GHz and move up from there, with 8 or maybe 10 cores, better power consumption, and less heat. It might not be as efficient as Ivy, but a whole lot more cores and better OC potential than BD... that's something worth taking an interest in.


----------



## xenocide (Feb 27, 2012)

Your argument falls flat on its face when you consider this: who cares how well-threaded today's CPUs are if the payoff only arrives in the future? Intel scales their core count with what the average consumer uses; they could probably release 6- and 8-core iterations if they wanted, but why release a product only 0.1% of your user base could benefit from? By the time 8 threads is the norm for all applications, BD will be long gone. People have been praising multi-threaded apps since like 2005, but software to accommodate that has just been too slow to arrive.


----------



## trickson (Feb 27, 2012)

Looks like Sandy Bridge-E is killing everything! Man, Intel is still on top, and it looks like it will be there for some time to come! Intel FTW!!!


----------



## abundant threading (Feb 27, 2012)

trickson said:


> Looks like Sandy E Bridge is killing every thing! Man Intel is still on top and looks like it will be there for some time to come! Intel FTW!!!



Yes, and if it kills AMD, watch the price of Intel go waaaaaaay up. That would be nice for all of us, right?


----------



## trickson (Feb 27, 2012)

abundant threading said:


> Yes and if it kills AMD watch the Price of Intel go waaaaaaay up, that would be nice for all of us, right?



Who cares, as long as you get the best! AMD has only AMD to blame! If they want to do better, then they'd better step it up!


----------



## ensabrenoir (Feb 27, 2012)

This will never end...... when Ivy comes and renders everything else irrelevant, the "wait a few more years till software catches up" argument will continue...... If looked at from the right angle, even the crookedest (not sure if that's a word) line looks straight.


----------



## abundant threading (Feb 27, 2012)

trickson said:


> Who cares as long as you get the best! AMD has only AMD to blame! If they want to do better then they better step it up!



If you don't care about CPUs costing 2, 3, 4 times as much, you're an idiot.

Oh, and innovation will also stop.


----------



## trickson (Feb 27, 2012)

abundant threading said:


> If you don't care about CPUs costing 2, 3, 4 times as much, you're an idiot.
> 
> Oh, and innovation will also stop.



Are you high? Just how do you think this is going to happen? Man, AMD is still a head-to-head contender! They may not be outright on top, but contenders nonetheless. Put the pipe down and step away from the bong! Innovation will stop? Are you smoking pot?


----------



## Super XP (Feb 27, 2012)

> not an AM2+ like mine, not to say my 940 doesn't handle Crysis 2 fairly well at full graphics with a 6870.


It won't do well at high settings with your setup. I used to have the 940 @ 3.60GHz with the HD 6970 and could not go high PQ regardless. This was for games such as Metro 2033 and the Crysis series.

Now that I have an FX-8120 @ 4.40GHz, I can set the PQ to high (playable frame rates), but not fully maxed. Anybody claiming otherwise is full of shit.


----------



## abundant threading (Feb 27, 2012)

trickson said:


> Are you high? Just how do you think this is going to happen? Man AMD is still a head to head contender! They may not be out right on top but still contenders none the less. Put the pipe down and step away from the bong! Innovation will stop? Are you smoking pot?



I'm talking about Intel killing AMD: no competition, prices go up, investment and innovation stop... it's not hard to understand.

Every review site shows/portrays AMD as junk. It will kill AMD if things don't improve. AMD have already suggested they are pulling out of competing with Intel in x86 to concentrate on mobile. If PD does not pick things up, they will pull out, and that will free Intel of competition; that's not good. It may even kill the enthusiast market altogether, unless you're rich.


----------



## trickson (Feb 27, 2012)

abundant threading said:


> I'm talking if Intel kill AMD, no competition prices go up, investment and innovation stops..... its not hard to understand.
> 
> Every review site shows / portrays AMD's as junk. it will kill AMD if things don't improve.



Man, you are high! AMD will never be killed off by Intel! BD is NOT THAT FAR BEHIND AT ALL!! Look at what you are saying. Every review site shows or portrays AMD as JUNK? Really, that is what you got from this review? That AMD is JUNK? Man, please do us all a favor and post when you are sober, not high!

Yes, Intel is the reigning champ, but NOT BY MUCH AT ALL! Not in a way that will kill AMD or halt innovation or cause price hikes! You are seeing this as black and white. It is not that cut and dried, my friend. AMD has the lead in cores; it has the lead when it comes to APUs! So there is no validation in what you are saying. AMD is still a very hard contender, and Intel will never put them down.


----------



## abundant threading (Feb 27, 2012)

trickson said:


> Man you are high!  AMD will never be killed off by Intel! BD is NOT THAT FAR BEHIND AT ALL!! Look at what you are saying. Every review site shows or portrays AMD as JUNK? Really that is what you got from this review? Is that AMD is JUNK? Man please do us all a favor and post when you are sober not high!
> 
> Yes Intel is the crown champ but NOT BY MUCH AT ALL! Not in the way that will kill AMD or prevent more innovation or price hikes! You are see this as black and white. It is not that cut and dry my friend. AMD has the lead in cores it has the lead when it comes to APU'S! So there is no validation in what you are saying. AMD is still a very hard contender and Intel will never put them down.



Show me a single game review where the FX-8150 can match a 2600K, let alone a 2500K, which is cheaper; most reviews show the FX to be way behind even the 2500K. Intel already owns 86% of the market share; AMD has about 15%, and that's falling to Intel. AMD is losing money by the truckload because of their x86 CPUs and the investment they have tied up there.

AMD have said they are done competing with Intel and are looking to pull out of the enthusiast market.


----------



## trickson (Feb 27, 2012)




----------



## xenocide (Feb 27, 2012)

abundant threading said:


> AMD have said they are done competing with Intel and are looking to pull out of the enthusiasts market.



This is not technically accurate. They said they were done competing at the high end and were going to focus on things like APUs and mid-range/entry-level products. They will still compete with Intel there, but they won't be constantly trying to take the performance crown. To be honest, I think it was just a mind game on AMD's part to say that; I think they wanted everyone to lower their expectations so that if they do release a highly competitive product, people are blown away.

As for your claim about no competition, you are very wrong. If Intel killed AMD, they would be disassembled by antitrust watchdogs almost instantly. Even if that weren't the case, they would be forced to compete with themselves, a foe much scarier than AMD, lol. If AMD were not around and Intel never innovated, why would people buy new computers? Or upgrade to new hardware? Guess what would happen if people stopped buying new computer parts because upgrading brought no benefit and was far too costly: Intel would be dead in a few years.

People need to remember that if Intel wants people to keep buying their products, they have to keep making them better. AMD's CPUs aren't the only thing driving Intel to be competitive.


----------



## trickson (Feb 27, 2012)

xenocide said:


> This is not technically accurate. They said they were done competing at the high end and were going to focus on things like APUs and mid-range/entry-level products. They will still compete with Intel there, but they won't be constantly trying to take the performance crown. To be honest, I think it was just a mind game on AMD's part to say that; I think they wanted everyone to lower their expectations so that if they do release a highly competitive product, people are blown away.
> 
> As for your claim about no competition, you are very wrong. If Intel killed AMD, they would be disassembled by antitrust watchdogs almost instantly. Even if that weren't the case, they would be forced to compete with themselves, a foe much scarier than AMD, lol. If AMD were not around and Intel never innovated, why would people buy new computers? Or upgrade to new hardware? Guess what would happen if people stopped buying new computer parts because upgrading brought no benefit and was far too costly: Intel would be dead in a few years.
> 
> People need to remember that if Intel wants people to keep buying their products, they have to keep making them better. AMD's CPUs aren't the only thing driving Intel to be competitive.


Well said my friend! Well said! This is one thing AMD fanboys keep FAILING to see.


----------



## abundant threading (Feb 27, 2012)

How is Intel competing with themselves going to be any sort of problem?

APUs are aimed mainly at the mobile/low-end desktop market, and that is where AMD will move to. Anything the likes of you and me want would be provided only by Intel; they could set prices to whatever they want: high profit, lower labour, fab, production, and investment costs.

You don't seriously believe that with AMD not competing at this level prices will stay unchanged, do you? And what reason would they have to work on new stuff, other than boosting performance or efficiency every so often to keep people upgrading?

Competition is good... that's all.


----------



## trickson (Feb 27, 2012)

abundant threading said:


> Competition is good.... that's all.



Do you think for one second Intel doesn't know this? Just what is it about Intel that makes you think this is going to happen anyway? Remember back in the Athlon days when AMD was #1? How they priced their stuff so high you needed a grand or more for their coveted FX and FX Black CPUs? When AMD was on top, Intel still thrived with the P4 crap! P4 is what Bulldozer is now. AMD is going to be on top again; it will take some time, but never fear, there is going to be a time soon that AMD pulls the big guns out. 
I remember a time when AMD called out Intel. Yes, they challenged Intel to a CPU face-off; now look at what they got! They got SMACKED DOWN! Bitch-slapped like a two-dollar whore! And what is super funny is the DOOM and gloom the AMD fanboys are preaching! No, it is far from over, pal. 
Core wars, begun they have.


----------



## abundant threading (Feb 27, 2012)

You're making assumptions; AMD would do the same if Intel were out of the picture, I know that. Competition is good....

And you might be right; AMD may well have something up their sleeves that drops a vast boom right at the heart of Intel. They have done it before, as you alluded to, but while such a possibility exists, to me it is very unlikely.


----------



## Mussels (Feb 27, 2012)

I can't believe the amount of hate just because BD is a small amount behind Intel. This isn't like the old days where there were massive differences (NetBurst vs. A64); most people won't even notice, especially not on a single-GPU setup.


I wish you fanboys could just grow up. You don't need to pick a side and fight everyone who picked the other side; just buy whichever one suits you, and shut the hell up about it -.-


----------



## trickson (Feb 27, 2012)

Mussels said:


> i cant beleive the amount of hate just beacause BD is a small amount behind intel. this isnt like the old days where there was massive differences (netburst vs A64), most people wont even notice - especially not on a single GPU setup.
> 
> 
> i wish you fanboys could just grow up. you dont need to pick a side and fight everyone who picked the other side, just buy whichever one suits you, and shut the hell up about it -.-



I agree. I like them both, really. I just wish I could afford an AMD setup right now and put this old tired Q9650 to rest.


----------



## xenocide (Feb 27, 2012)

abundant threading said:


> Your making assumptions, AMD would do the same if Intel was out of the picture i know that, competition is good....



This is the kind of thinking that leads to duopolies where the consumer always loses. Look at ISPs in the United States: sure, nobody really has a monopoly, but they have unofficial agreements about pricing and availability that basically mean you are stuck with one ISP if you want decent high-speed internet, and they usually charge a fortune for any service they offer.

You said APUs are targeted toward mobile/low-end? AMD's strategy is to make APUs acceptable to everyone except enthusiasts. Trinity looks like a real game-changer in the laptop market, and that is the kind of stuff that is going to keep AMD in the running. Bulldozer is good at 8 cores, but when you go down to 6 or 4 it just becomes awful compared to even last-generation offerings.


----------



## Super XP (Feb 27, 2012)

abundant threading said:


> I'm talking if Intel kill AMD, no competition prices go up, investment and innovation stops..... its not hard to understand.
> 
> Every review site shows / portrays AMD's as junk. it will kill AMD if things don't improve, AMD have already suggested they are pulling out of competing with Intel in x86 and concentrate on mobile, If PD does not pick it up they will pull out, that will free Intel of competition, that's not good. it may even kill the enthusiast market all together, unless your rich.


I believe xenocide summed it up quite nicely. In regards to review sites portraying AMD as junk, I don't think so; that is going way overboard IMO. A lot of them are in shock because of Bulldozer's super hype.

AMD telling the media that they are no longer going to compete with Intel in the high end is nonsense. You can easily rephrase this as AMD won't be chasing Intel in the high end; there's no way AMD can take Intel head-on anymore at this time. Intel's R&D is massive.

What I believe AMD is saying in code is: we are no longer going to chase Intel in the high end. It does not say AMD is going to stop making high-end desktop CPUs. They will continue to supply to meet demand. Nothing wrong with mid-range, because the majority of gamers don't buy high end anyway, and that is where AMD caters to. 

Are the FX 8120 and FX 8150 AMD's high-end desktop CPUs? Yup, which is why I bought one


----------



## abundant threading (Feb 27, 2012)

The trouble is there isn't enough demand, and it's falling. The only reason AMD have not gone under already is because, unlike their CPUs, their GPUs are very good; they are fast, efficient, powerful (everything FX is not), people like them, and they are selling very well.

Shocked? Yes....

I was looking forward to FX. I have a 6-core Phenom II and I love it; it runs nicely at 4.3GHz on an inexpensive cooler, it can handle any game pretty well, and in multithreading the thing is, for its money, untouchable; it shrugs off just about anything you can throw at it. No, it doesn't bench as high in single-threaded x86 as an SB i5, but it's never far behind, and that can easily be excused considering how old the design is, so old that people have been yearning for something new for years.

Then finally... it came! They finally dumped the ancient K8 design; it was not unreasonable to expect core-for-core performance to have been brought up that little bit to match the 2500K, and 8 of them... WOW... yeah, what an anticlimax: it can't even match my Phenom II in any aspect.

For me, upgrading would be a step... no, a fall-and-break-your-leg... down, so the wait continues.


----------



## Daimus (Feb 27, 2012)

abundant threading said:


> Show me a single game review where the FX-8150 can match a 2600K let alone a 2500K which is cheaper, most reviews show the FX to be way behind even the 2500K



I found something:
http://www.hardwareheaven.com/revie...i7-2600k-review-deus-ex-human-revolution.html

http://www.hardwareheaven.com/revie...rocessor-vs-core-i7-2600k-review-f1-2011.html

http://www.hardwareheaven.com/revie...-core-i7-2600k-review-total-war-shogun-2.html


----------



## abundant threading (Feb 27, 2012)

Daimus said:


> Something found.
> http://www.hardwareheaven.com/revie...i7-2600k-review-deus-ex-human-revolution.html
> 
> http://www.hardwareheaven.com/revie...rocessor-vs-core-i7-2600k-review-f1-2011.html
> ...



Great, fantastic... there are 3 wins for FX. Let's have more of that, thanks.


----------



## Daimus (Feb 27, 2012)

abundant threading said:


> Great, fantastic... there's 3 wins for FX. lets have more of that, thanks



You asked for at least one review, and I delivered. With regard to competition, you're somewhat right: if there is no competition in the high end, prices will not fall.


----------



## abundant threading (Feb 27, 2012)

Yeah, I'm just getting greedy now  thanks for that.


----------



## Daimus (Feb 27, 2012)

abundant threading said:


> Yeah, I'm just getting greedy now



Why so sarcastic? It's not so bad, and as has already been said here, losing a little in games fits within the price difference.


----------



## Red_Machine (Feb 27, 2012)

ALL HAIL NEHALEM.


----------



## xenocide (Feb 27, 2012)

That review was the only one from the FX-8150 launch that showed the FX Setup winning, I remember seeing it cited dozens of times.  Something is telling about the fact that there are dozens of reviews that show SB setups winning and only a couple where FX is even on par... just saying...


----------



## Mussels (Feb 27, 2012)

xenocide said:


> That review was the only one from the FX-8150 launch that showed the FX Setup winning, I remember seeing it cited dozens of times.  Something is telling about the fact that there are dozens of reviews that show SB setups winning and only a couple where FX is even on par... just saying...



It's all about which games. If you look, DX11 games are where Bulldozer shines. Look at the reviews where Intel is ahead, and it's all DX9 games, where single-threaded performance matters.


----------



## abundant threading (Feb 27, 2012)

xenocide said:


> That review was the only one from the FX-8150 launch that showed the FX Setup winning, I remember seeing it cited dozens of times.  Something is telling about the fact that there are dozens of reviews that show SB setups winning and only a couple where FX is even on par... just saying...



Where there are 3, more will come. Perhaps it's not so much that the FX needs to catch up as that apps need to catch up to it.


----------



## Daimus (Feb 27, 2012)

I wonder why all the attention is on games. Many people use Bulldozer as a low-cost server; a friend of mine built 10 render farms based on BD and is very happy, because it cost him a lot less than a farm of similar productivity based on the competitor's chips.


----------



## Artas1984 (Mar 17, 2012)

It's all normal - remember that Phenom II was originally targeted against Core 2 Duo Wolfdale/Yorkfield processors, and seeing as how, clock for clock, the Phenom II is faster than FX, it's no wonder that FX cannot beat Nehalem or Westmere processors..

Phenom II * <->* FX
Core 2 Duo *-->* Nehalem/Westmere *--->* Sandy Bridge

AMD is 2 generations behind. With the Ivy Bridge launch, AMD will be 3 generations behind. Amazing...


----------



## Aquinus (Mar 17, 2012)

Artas1984 said:


> It's all normal - remember that Phenom II was originally targeted against Core 2 Duo Wolfdale/Yorkfield processors, and seeing as how clock/per clock the Phenom II is faster than FX, it's no wonder that FX can not beat Nehalem or Westmere processors..
> 
> Phenom II * <->* FX
> Core 2 Duo *-->* Nehalem/Westmere *--->* Sandy Bridge
> ...



Reviving a dead thread to say what everyone has already said multiple times? AMD is not 2 generations behind, and I'm sick of saying this. Software will not remain single-threaded, and AMD is looking toward the future. BD is fast *ENOUGH* on a single thread (yes, IPC needs improvement), but when it comes to pushing this chip on all cores, it's just as good as the 2600K, and when Piledriver and the successive chips that replace BD arrive, you will see that AMD has a much better platform in terms of scaling to more cores. You can only make circuitry so small, and eventually Intel will realize as well that optimizing for space will become a bigger concern than pushing single-threaded speeds to their limits.

Imagine if GPUs had kept using pixel and vertex pipelines. Yeah, they were faster, but they were more specialized and took up a ton of space. Shaders were slower, but they were general-purpose and you could shove a ton of them on a GPU. BD and CPU cores are no different; it just takes time.


----------



## ensabrenoir (Mar 18, 2012)

Dead horse alert


----------



## trickson (Mar 18, 2012)

This says it all!


----------



## XNine (Mar 18, 2012)

Quick question: how well do you think this would fare with LGA2011 and the Sandy Bridge-E 3820 CPU? Better than or similar to the 2500K setup that was tested?


----------



## TRWOV (Mar 18, 2012)

I'd think the 3820 should be at least on par with the 2500K. Not that much of a difference really, unless you're using 3 or 4 GPUs (the 3820 has more PCIe lanes)


----------



## XNine (Mar 19, 2012)

TRWOV said:


> I'd think the 3820 should be at least on par with the 2500K. Not that much of a difference really, unless you're using 3 or 4 GPUs (the 3820 has more PCIe lanes)



Awesome! Thanks for the input.    I am a new Intel convert.  Been using AMD for years, so it's just a whole new world coming over to Intel lol


----------

