# Are AMD Current CPUs Not Enough For Solid 60 FPS Anymore?



## Robert-The-Rambler (Sep 7, 2010)

I want to make this opening statement as brief as possible. I have been an AMD user for a long time and I have several AMD systems with a variety of processors: 2 systems have Phenom X3 8750s, 1 has a Phenom 9600, 2 have Phenom X4 9850s, 1 has a Phenom II 920, and 1 has an Athlon II 635 quad-core at 2.9 GHz.

I also have 2 systems with i7 920s. For a while I always thought that my i7 systems were overkill from a CPU standpoint, but ever since Ghostbusters came out I am starting to notice HUGE differences in performance in real gameplay in newer games. In that game none of my AMD systems can stay at a solid 60 FPS, even though all but one of the systems have dual Radeon 4850s and the Phenom II and Athlon II each have triple 4850s. If I lock the FPS at 30 things stabilize; otherwise the FPS is all over the map. The i7s seem to never drop below 60 FPS in that game with similar graphics setups to the AMD systems (dual 4890s, or 1 GTX 460 overclocked to 800/1600).

Dragon Age Origins is another game where, in my case, none of the AMD systems can sustain 60 FPS throughout the entire experience. I did stuff like disable FSAA and the result is largely the same. Something other than the GPU is bottlenecking the system. Sometimes it is great, but sections like just arriving at Ostagar drop into the 20s, where the i7s just stay fast. It is ridiculous just how much better the i7 platform is starting to handle the latest games.

I don't want this to be an AMD vs. Intel thread. I want this discussion to be a serious investigation of real-world gaming experiences with the latest AMD processors, or at least modern AMD processors, where we figure out which games seem to just hate your AMD processor. I tracked performance using FRAPS; I am anal, so I have it running on my screen pretty much all the time. I have found 2 AMD haters in Ghostbusters and Dragon Age Origins. What else is out there?
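For anyone repeating this kind of FRAPS tracking, average FPS alone hides the dips the OP describes. A minimal sketch (assuming a FRAPS-style frametimes log, i.e. cumulative per-frame timestamps in milliseconds; adjust the parsing for your tool) that reports how often a run actually blows the 60 FPS frame budget:

```python
# Sketch: quantify how "solid" a 60 FPS run is from a frametimes log.
# Assumes cumulative per-frame timestamps in milliseconds, the shape of a
# FRAPS "frametimes" CSV once the header and frame-index column are stripped.

def frame_durations_ms(timestamps_ms):
    """Per-frame durations (ms) from cumulative timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def fps_stats(timestamps_ms, target_fps=60.0):
    durations = frame_durations_ms(timestamps_ms)
    budget_ms = 1000.0 / target_fps  # ~16.67 ms per frame at 60 FPS
    slow = [d for d in durations if d > budget_ms + 0.01]  # small tolerance
    return {
        "avg_fps": 1000.0 * len(durations) / sum(durations),
        "min_fps": 1000.0 / max(durations),
        "pct_slow_frames": 100.0 * len(slow) / len(durations),
    }

# Synthetic example: steady 60 FPS pacing with one injected 50 ms stall.
stamps = [i * (1000.0 / 60.0) for i in range(120)]
stamps = stamps[:60] + [t + 50.0 for t in stamps[60:]]
stats = fps_stats(stamps)
print(stats)  # the average looks near 60, but min_fps exposes the hitch
```

A single 50 ms hitch barely dents the average, which is exactly why "60 FPS average" and "solid 60 FPS" are different claims.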


----------



## cadaveca (Sep 7, 2010)

If you play games, you want i7. That's old, old news, since i7 launched. Lots of people thought maybe Phenom II would fix this (myself included), but over time this has proven not to be the case... unless you want to overclock.

What's not old news is the exact reason why... but I personally am investigating what's causing the "CPU bottlenecks" many users are seeing, myself included.

For some reason, it seems the i7 can get more data to the GPUs. And then, when the CPU is not the bottleneck and the GPU is, microstutter is introduced... and we now have a very quantifiable way of measuring microstutter.

It's lose-lose... and I have no more to say on this subject until I can generate more data. Rather than guess, I'll let the numbers do the talking.
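Since the thread keeps coming back to microstutter, here is one simple way to put a number on it (my own illustrative metric, not the measurement cadaveca alludes to): compare each frame time to its neighbor. Even pacing scores near 0; the alternating fast/slow pattern typical of AFR multi-GPU scores high even when the average FPS looks fine.

```python
# A minimal microstutter proxy (illustrative, not an established standard):
# mean absolute frame-to-frame delta, normalized by the mean frame time.
# 0 = perfectly even pacing; ~1+ = severe alternating judder.

def stutter_index(frame_ms):
    """Mean |delta| between consecutive frame times, divided by mean frame time."""
    deltas = [abs(b - a) for a, b in zip(frame_ms, frame_ms[1:])]
    mean = sum(frame_ms) / len(frame_ms)
    return (sum(deltas) / len(deltas)) / mean

smooth = [16.7] * 100        # even ~60 FPS pacing
juddery = [8.0, 25.4] * 50   # same ~60 FPS average, alternating fast/slow

print(stutter_index(smooth))   # even pacing scores 0
print(stutter_index(juddery))  # same average FPS, very different experience
```

Both traces average the same FPS, which is the point: a FRAPS counter can read 60 while the delivery is anything but smooth.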


----------



## dir_d (Sep 7, 2010)

cadaveca said:


> If you play games, you want i7. That's old, old news, since i7 launched. Lots of people thought maybe Phenom II would fix this (myself included), but over time this has proven not to be the case... unless you want to overclock.



He is 100% correct, but a simple overclock on the Phenom II or Thuban can equal if not outperform an i7 in gaming; it's all about the northbridge overclock. Here is a half-motherboard, half-Thuban review from AnandTech (click). It's a mystery why AMD didn't set the stock CPU-NB to 2.4 or 2.6 GHz.


----------



## TheMailMan78 (Sep 7, 2010)

cadaveca said:


> If you play games, you want i7. That's old, old news, since i7 launched. Lots of people thought maybe Phenom II would fix this (myself included), but over time this has proven not to be the case... unless you want to overclock.
> 
> What's not old news is the exact reason why... but I personally am investigating what's causing the "CPU bottlenecks" many users are seeing, myself included.
> 
> ...



My mind cannot process that level of bullshit.


----------



## cadaveca (Sep 7, 2010)

dir_d said:


> He is 100% correct, but a simple overclock on the Phenom II or Thuban can equal if not outperform an i7 in gaming; it's all about the northbridge overclock. Here is a half-motherboard, half-Thuban review from AnandTech (click). It's a mystery why AMD didn't set the stock CPU-NB to 2.4 or 2.6 GHz.



I am still testing, but it seems the i7 overclocked still beats out AMD. The i7 costs more, so to me, it's no big deal... you get what you pay for.



TheMailMan78 said:


> My mind cannot process that level of bullshit.



Again, MM, I'll say this: you are trolling. Either post data that refutes it, or get out. Even Mussels and erocker agree the CPU is the bottleneck.

 Crossfired 5850's Issue


----------



## Steevo (Sep 7, 2010)

Strange, as I run almost all games at very high frame rates.

http://www.overclock3d.net/reviews/cpu_mainboard/amd_vs_intel_-_the_gaming_sweetspot/3


This and other tests show the 965 BE on par with a common i7 chip using the same GPU; the i7 does have more system memory and more bandwidth available.


----------



## erocker (Sep 7, 2010)

cadaveca said:


> If you play games, you want i7. That's old, old news, since i7 launched. Lots of people thought maybe Phenom II would fix this (myself included), but over time this has proven not to be the case... unless you want to overclock.
> 
> What's not old news is the exact reason why... but I personally am investigating what's causing the "CPU bottlenecks" many users are seeing, myself included.
> 
> ...



The scaling on my system was much better than the member's in the CrossFire problem thread. I think it was more than a problem with CPU bandwidth. I really should have tried running CrossFire with my CPU at stock to further the investigation.


----------



## Darknova (Sep 7, 2010)

cadaveca said:


> If you play games, you want i7. That's old, old news, since i7 launched. Lots of people thought maybe Phenom II would fix this (myself included), but over time this has proven not to be the case... unless you want to overclock.





TheMailMan78 said:


> My mind cannot process that level of bullshit.



I'm in total agreement with TheMailMan78. Considering that I own a rather old Phenom II (the first tri-core) and I can still play ALL games I throw at it smoothly (OK, maybe not the solid 60 the OP is after, but still smooth), I call bullshit as well.

To make such a broad, sweeping statement as you just did requires you to prove that the Phenom II, or any AMD chip, can't handle games at all, and I'm sure most AMD users on this forum can attest otherwise.

(Oh, and up until recently, I wasn't overclocked.)


----------



## cadaveca (Sep 7, 2010)

Steevo said:


> Strange as I run almost all games at very high frame rates.
> 
> http://www.overclock3d.net/reviews/cpu_mainboard/amd_vs_intel_-_the_gaming_sweetspot/3
> 
> ...



That was before the driver change (Oct. 2009). Those numbers do not reflect performance since the change at the driver level, when performance profiles were introduced as a separate part of the driver.


Darknova, I am currently generating performance numbers for about 300 games, and when I post the numbers, you can judge for yourself. I have been finding that anything over 5870 levels is a bit bottlenecked... I'm eager to see how the high-end 6-series performs, just because of this. AMD drivers currently have an issue keeping more than 1600 shaders loaded.


----------



## suraswami (Sep 7, 2010)

Robert-The-Rambler said:


> I want to make this opening statement as brief as possible. I have been an AMD user for a long time and I have several AMD systems with a variety of processors: 2 systems have Phenom X3 8750s, 1 has a Phenom 9600, 2 have Phenom X4 9850s, 1 has a Phenom II 920, and 1 has an Athlon II 635 quad-core at 2.9 GHz.
> 
> I also have 2 systems with i7 920s. For a while I always thought that my i7 systems were overkill from a CPU standpoint, but ever since Ghostbusters came out I am starting to notice HUGE differences in performance in real gameplay in newer games. In that game none of my AMD systems can stay at a solid 60 FPS, even though all but one of the systems have dual Radeon 4850s and the Phenom II and Athlon II each have triple 4850s. If I lock the FPS at 30 things stabilize; otherwise the FPS is all over the map. The i7s seem to never drop below 60 FPS in that game with similar graphics setups to the AMD systems (dual 4890s, or 1 GTX 460 overclocked to 800/1600).
> 
> ...



Just curious, did you try disabling Cool 'n' Quiet and running all the chips at full speed?  If the frames drop drastically that might be the culprit.

Also, if you can, put those 4890s in CF on the AMD 920 machine and the 4850s in CF on the i7 machine, and see if the issue is still there.


----------



## cadaveca (Sep 7, 2010)

erocker said:


> The scaling on my system was much better than the member's in the CrossFire problem thread. I think it was more than a problem with CPU bandwidth. I really should have tried running CrossFire with my CPU at stock to further the investigation.



He was running socket 775, you're on AMD, so it's not exactly the same platform... that has to be considered as well.

I can't make any real conclusions as to what exactly the bottleneck is, but it does seem CPU-related. Once I've got all the data, I can start to filter it and see what shows up. I currently think it may be L3, or NB-to-PCI-E, but it's hard to tell when NB speed increases also affect L3.


Really, I just want to be able to make accurate recommendations as to what specs are needed when putting systems together, also keeping resolution in mind.

In the end, I will also be testing dual-, tri-, and hexa-core CPUs, to see if they have any impact. But with all that said, it's going to take LOTS of work... more than any review does, benching so many apps and different configs... i7 will be included. It's going to take several months, maybe, and by that point the 6-series should be here, so I can just add them into the mix, and we can see how things have changed.

I have also ordered 3x HD 5830s, to see how shader count affects that scaling. It's too bad you are getting rid of CrossFire, as your numbers could validate or refute my own.


----------



## CDdude55 (Sep 7, 2010)

Man, I can just feel this thread getting locked not too long from now.

And it's not a matter of 'omg i has an AMD cpu im gonna get low frames!!!1'; there are so many things that factor into how many frames you'll get in a game. i7s perform better than the majority of AMD CPUs currently out, and that's a fact; anyone who tells you otherwise is either full of shit or a fanboy. AMD CPUs are great gaming CPUs and can get the job done most of the time. Do they perform as well as Intel's current platform? No. But if you go with AMD, are you going to be seeing below 60 frames in most to every game? No, that's a load of shit (unless you're running an older system).

Some games are more CPU dependent and some are more GPU dependent; some games use more RAM, some games take more space on an HDD. You see how every game can differ? It's just a matter of what your system can push out in those departments if needed by the game. i7s have the advantage of HT and more memory bandwidth; have you thought that could be why? i7s have a better architecture overall; have you thought that could be why?

It's nonsensical and frankly absurd for anyone to think that their low frames in games are automatically attributable to whether they have an Intel or AMD CPU. (Not directed towards you, Robert-The-Rambler.) But hey, maybe that's just me.


----------



## cadaveca (Sep 7, 2010)

If people can keep their emotions in check, there will be no lock. Personally, for me, the differences in tech between AMD and Intel are pretty huge... and raw FPS numbers don't tell the entire story.

I don't care who's better... but I'll test and see what the numbers say, and make conclusions based on that. All I can truly say at this point is that the stock AMD NB is not enough for more than a single 1600-shader GPU. 3x 4850 is 2400 shaders... there are gonna be issues there.


----------



## paulharrison123 (Sep 7, 2010)

Maybe I'm a little different, but my 1090T is perfectly fine in any game, giving well over 60 FPS on average (except Metro 2033 with advanced DOF on, but that to me is poor optimisation).

Not disputing the above, but I've not once had a single problem (then again, my chip is a 6-core and nearer the price of an i7).


----------



## cadaveca (Sep 7, 2010)

paulharrison123 said:


> Maybe I'm a little different, but my 1090T is perfectly fine in any game, giving well over 60 FPS on average (except Metro 2033 with advanced DOF on, but that to me is poor optimisation).
> 
> Not disputing the above, but I've not once had a single problem (then again, my chip is a 6-core and nearer the price of an i7).



It's not as simple as CDdude says... each app is going to be a bit different, and you really need to account for what VGAs are in the system, as well as other variables.

These bottlenecks are very obvious once you get into 4x VGAs... W1zzard's review reflects this as well... even the i7 isn't capable of fully pushing that much GPU grunt... scaling is horrible.

But, myself, I'm not exactly sure where the bottleneck is... both AMD and Intel are affected once the number of GPUs increases. It just seems Intel has a slight advantage before that limit is hit... and it's not a "blowing AMD out of the water" difference, either.


----------



## pantherx12 (Sep 7, 2010)

cadaveca said:


> If people can keep their emotions in check, there will be no lock. Personally, for me, the differences in tech between AMD and Intel are pretty huge... and raw FPS numbers don't tell the entire story.
> 
> I don't care who's better... but I'll test and see what the numbers say, and make conclusions based on that. All I can truly say at this point is that the stock AMD NB is not enough for more than a single 1600-shader GPU. 3x 4850 is 2400 shaders... there are gonna be issues there.




Why not overclock the northbridge then?


----------



## CDdude55 (Sep 7, 2010)

cadaveca said:


> If people can keep their emotions in check, there will be no lock. Personally, for me, the differences in tech between AMD and Intel are pretty huge... and raw FPS numbers don't tell the entire story.
> 
> I don't care who's better... but I'll test and see what the numbers say, and make conclusions based on that. All I can truly say at this point is that the stock AMD NB is not enough for more than a single 1600-shader GPU. 3x 4850 is 2400 shaders... there are gonna be issues there.



I'm not sure how well 3x 4850s scale, but that could be a factor. And it could as well be that the AMD CPU can't handle all that GPU power, so I agree.

I suggest he tries some other games to see what's what. But it shouldn't be surprising that an i7 is outperforming a Phenom II in the majority of games.


----------



## CDdude55 (Sep 7, 2010)

cadaveca said:


> It's not as simple as CDdude says... each app is going to be a bit different, and you really need to account for what VGAs are in the system, as well as other variables.



That's exactly what I was saying in that post.

There's tons of different factors that contribute to whether a game plays well on a system or doesn't.


----------



## cadaveca (Sep 7, 2010)

pantherx12 said:


> Why not overclock the northbridge then?



Yeah, that seems to have a LARGE impact at this point... but rather than just getting the performance I need, what interests me is WHY that boost is needed. And like I said, an NB increase also affects L3 cache speed... so it's hard to say what's really the issue here...



CDdude55 said:


> I'm not sure how well 3x 4850s scale, but that could be a factor. And it could as well be that the AMD CPU can't handle all that GPU power, so I agree.
> 
> I suggest he tries some other games to see what's what. But it shouldn't be surprising that an i7 is outperforming a Phenom II in the majority of games.



Like I've said, it's far too early for me to make any definite conclusions. I'm testing every game I own... I'm sure for many, there will be no difference...

Also, microstutter is becoming more important, it seems. Even single-GPU setups are getting affected when the GPU is at 100% load. So just because you get higher FPS doesn't mean your gameplay experience is going to be better...


----------



## pantherx12 (Sep 7, 2010)

Well, it sounds like it's nothing to do with the CPUs, and more to do with the northbridge then.

Try overclocking the northbridge and lowering the multi (aim for stock frequency, of course) to see if you get better performance; that'd be the way to test for certain whether it's the CPU or the northbridge.

I've never noticed this problem, as I always overclock and never use the multi to do it; I change the FSB.


----------



## CDdude55 (Sep 7, 2010)

cadaveca said:


> Yeah, that seems to have a LARGE impact at this point... but rather than just getting the performance I need, what interests me is WHY that boost is needed. And like I said, an NB increase also affects L3 cache speed... so it's hard to say what's really the issue here...
> 
> 
> 
> ...



Well, keep testing and report back with some good info if ya can.

And also, yes, microstutter does affect performance... but you can still generally see a bump in performance with better hardware, provided the microstuttering isn't too bad.


----------



## cadaveca (Sep 7, 2010)

pantherx12 said:


> Well, it sounds like it's nothing to do with the CPUs, and more to do with the northbridge then.



I am referring to the CPU-NB, part of the CPU. It might not be the CPU cores themselves... it might be memory/PCI-E bandwidth, might just be L3 speed...

Anyway, I know how to get better performance, and that's not what the issue is for me... I want to identify the bottleneck specifically, as hard as that may be. Just satisfying curiosity... plus, I do not feel that people should be forced to overclock, thereby invalidating warranties, just to make proper use of VGA products.

You know, really, if I can generate enough hard, irrefutable data, I can then take it to court. Wouldn't you love it if we could force OEMs to warranty overclocks?


(BTW, I'm just being silly with that last bit. I would like companies to give out specific system requirements in order to use specific technologies, though.).


----------



## crazyeyesreaper (Sep 7, 2010)

Cadaveca, if you want some extra performance numbers on some games, let me know. I've got quite a few games, not 300+, but I've got some titles and can probably compare notes on performance. Send me a PM if you want a hand and I'll list out what games I can test.


----------



## LAN_deRf_HA (Sep 7, 2010)

I believe the root of this issue is AMD's desire to maintain backwards compatibility. Hopefully AM3+ will really contribute to the performance gain of Bulldozer. Also, for those in doubt, a massive game difference: http://www.anandtech.com/bench/Product/109?vs=147


----------



## 1Kurgan1 (Sep 7, 2010)

I'm going to step in and say it here: if you are playing any modern games, even 3x 4850s isn't going to be enough to hold 60 FPS as your minimum. Regardless of what your processor is, that's just not enough GPU power. And that's for good-looking games; with great-looking games, even newer cards just aren't going to hold above 60 FPS constantly... the minimum is always going to be lower than 60.


----------



## theonedub (Sep 7, 2010)

LAN_deRf_HA said:


> I believe the root of this issue is AMD's desire to maintain backwards compatibility. Hopefully AM3+ will really contribute to the performance gain of Bulldozer. Also, for those in doubt, a massive game difference: http://www.anandtech.com/bench/Product/109?vs=147



Are those benches really accurate? I thought the performance gap was smaller; those results look pretty one-sided.


----------



## cadaveca (Sep 7, 2010)

theonedub said:


> Are those benches really accurate? I thought the performance gap was smaller; those results look pretty one-sided.



So far my results seem to mimic those AnandTech ones... in CPU-intensive apps AMD wins, but Intel wins on 3D. Even better, a lot of the time 8x/8x PCI-E CrossFire gives better performance than 16x/16x PCI-E... and I do not understand why.

Also, there is no mention of drivers used, etc... too many unknown variables to call those metrics "fact", so I agree they are questionable.


----------



## Dent1 (Sep 7, 2010)

Robert-The-Rambler,

The problem is on your side, believe me. Ghostbusters isn't my idea of an intensive game, and if you cannot maintain 60 FPS, it's on your end.

Download the Final Fantasy XIV and 3DMark Vantage benchmarks and let's see if there is a bottleneck or if it's driver related.


----------



## erocker (Sep 7, 2010)

Dent1 said:


> let's see if there is a bottleneck or if it's driver related.



That's what cadaveca is trying to point out. The bottleneck with ATI's latest drivers is the drivers themselves! Crazy stuff, but through some little testing I did, it indeed has merit... plus it's disappointing.


----------



## Dent1 (Sep 7, 2010)

erocker said:


> That's what cadaveca is trying to point out. The bottleneck with ATI's latest drivers is the drivers themselves! Crazy stuff, but through some little testing I did, it indeed has merit... plus it's disappointing.



To me this thread seems flame-baity. I cannot remember the last time I couldn't sustain a 60 FPS average on my crusty rig (by today's standards). Robert-The-Rambler has posted once and once only in this thread so far. He seems like the type of guy who posts stuff to get people worked up, then disappears, sits back, and watches us fools argue.

Where can I download the Dragon Age benchmark? I have an Athlon II X4 which I would like to test.


----------



## erocker (Sep 7, 2010)

Dent1 said:


> To me this thread seems flame-baity. I cannot remember the last time I couldn't sustain a 60 FPS average on my crusty rig (by today's standards). Robert-The-Rambler has posted once and once only. He seems like the type of guy who posts stuff to get people worked up, then disappears, sits back, and watches us fools argue.
> 
> Where can I download the Dragon Age benchmark? I have an Athlon II X4 which I would like to test.



Well, I can attest that with a mildly overclocked 5850 and my 965 BE at 4 GHz, I easily get 60 FPS in both Dragon Age and Ghostbusters. There are no benchmarks for them, unfortunately; you need to use FRAPS. Tonight I should do a test of these games with my CPU at stock and overclocked.


----------



## cadaveca (Sep 7, 2010)

Dragon Age is CPU-limited (I think cache-limited; unsure if it's memory, drive, or what...). I mentioned this a long time ago, using my 4890 rig as an example. I was getting over 60 FPS... barely... but the VGAs were only running at about 66% utilization. The CPU wasn't maxed out... the VGAs weren't either... but something was keeping the framerate down.


I look forward to seeing what you find, e.

I'm working through my Steam games, then I'll move on to others... Mass Effect has proven interesting, but my plan was to hit up Dragon Age afterwards.

As an aside, Afterburner also reports FPS, and can write it to a log file...



erocker said:


> That's what cadaveca is trying to point out. The bottleneck with ATI's latest drivers is the drivers themselves! Crazy stuff, but through some little testing I did, it indeed has merit... plus it's disappointing.



Yes, exactly. I also think that AMD is unaware of the problem, as I think most of their driver team is working on i7-based machines, and the i7 doesn't run into these limitations as quickly... but it is still a victim of the same behavior.
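The Dragon Age observation above (GPUs at ~66%, CPU not maxed, framerate still capped) suggests a simple triage rule worth writing down. A hedged sketch, with hypothetical utilization numbers and illustrative thresholds rather than anything calibrated:

```python
# Triage heuristic: if neither the CPU nor the GPU is pegged while FPS
# misses the target, the limiter is likely elsewhere (driver overhead,
# cache/NB, engine lock, I/O). Thresholds are illustrative only.

def likely_bottleneck(cpu_util, gpu_util, fps, target_fps=60.0, pegged=90.0):
    if fps >= target_fps:
        return "none"
    if gpu_util >= pegged:
        return "gpu"
    if cpu_util >= pegged:
        return "cpu"
    return "other (driver/memory/cache?)"

# Hypothetical reading in the spirit of the Dragon Age case above:
print(likely_bottleneck(cpu_util=55.0, gpu_util=66.0, fps=58.0))
```

The interesting outcome is the last branch: both meters well under 100% with FPS still short of target is exactly the pattern that points away from raw core or shader throughput.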


----------



## dir_d (Sep 7, 2010)

Cadaveca, did you read that AnandTech article I linked? It talked about the HT link bandwidth and how the stock 2 GHz is not enough to saturate the whole HT/PCI-E 4x bandwidth. 2.8 GHz+ is where we start seeing the PCI-E 4x bandwidth being saturated fully, thus resulting in better performance. Read it... it's interesting.


----------



## erocker (Sep 7, 2010)

dir_d said:


> Cadaveca, did you read that AnandTech article I linked? It talked about the HT link bandwidth and how the stock 2 GHz is not enough to saturate the whole HT/PCI-E 4x bandwidth. 2.8 GHz+ is where we start seeing the PCI-E 4x bandwidth being saturated fully, thus resulting in better performance. Read it... it's interesting.



Thing is, it's not the HT bandwidth. It's the northbridge, or more specifically the IMC frequency, that is causing the problems. It is also stock at 2 GHz like HT, and it's not enough.
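The link-speed claims in this exchange are easy to sanity-check with back-of-envelope arithmetic (my own numbers, assuming HyperTransport 3.0's 16-bit DDR link and PCIe 2.0's ~500 MB/s per lane per direction; actual sustained throughput will be lower than these raw figures):

```python
# Back-of-envelope bandwidth figures, one direction only.

def ht_bandwidth_gbs(link_mhz, width_bits=16):
    """HyperTransport is DDR: 2 transfers per clock on a width_bits-wide link."""
    return link_mhz * 2 * (width_bits / 8) / 1000.0

def pcie2_bandwidth_gbs(lanes):
    """PCIe 2.0: ~500 MB/s per lane per direction after 8b/10b encoding."""
    return lanes * 0.5

print(ht_bandwidth_gbs(2000))   # stock 2.0 GHz HT link, per direction (GB/s)
print(pcie2_bandwidth_gbs(16))  # one PCIe 2.0 x16 slot, per direction (GB/s)
```

On these raw numbers a stock 2 GHz HT link roughly matches a single PCIe 2.0 x16 slot per direction, which is why multi-GPU setups make the link (and the CPU-NB feeding it) interesting in the first place.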


----------



## Robert-The-Rambler (Sep 7, 2010)

*Here is what I'm looking for*

This is not a flame-bait thread. I've been a message board moderator before and I know what that is. I'm not even trying to start a debate about the difference in technologies. Everybody knows the i7 socket 1366 platform is the best-performing gaming platform. I just want to know if anybody else out there knows what other games I might have trouble trying to play with my AMD setups.

I'm looking for games where AMD CPUs, even the latest ones like the X6s, simply can't do 60 FPS the majority of the time with Vsync enabled. It might even be a platform limitation, but in order to use AMD CPUs you have to accept the limitations of the given platforms, whether they have DDR2 or DDR3. I stated an observation that I noticed when utilizing FRAPS. I was hoping some of you guys might also try just playing the actual games for a while, watching the FRAPS counter. I'm more interested in the real-world gameplay than benchmark results.

I also want to mention that the AMD setups I have are running 4 gigs of DDR2 800 RAM and one of them is running 4 gigs of DDR 1066 RAM.

P.S. Three 4850 1 GB cards in triple CrossFire can play Metro 2033 at Very High in DirectX 10 at reasonable FPS (not a solid 60, but playable). They are enough for most things. Nobody even mention Crysis; I may launch a cruise missile your way.


----------



## cadaveca (Sep 7, 2010)

dir_d said:


> Read it... it's interesting.



I did. And as erocker says, it's the IMC speed (CPU-NB) that they are testing, not HTT. I get very minimal gains from HTT clocking... there are some, but CPU-NB gains are far more drastic.


Now, again, increasing the CPU-NB increases L3 speed, and Intel's L3 speed, as well as memory bandwidth, are both higher than AMD's, and when I consider that Intel doesn't run into this problem as quickly, it's easy to just say it's those things that make the difference.

However, I am not 100% sure on this, so I am doing this testing, in hopes that maybe the truth will be exposed, so to speak.

In the end, it's not really THAT important... the majority of us are going to overclock, and when overclocked, this isn't such a big issue. But because we have to void our warranties to avoid this problem, I do not feel that the OEMs are doing us any good having technology on the market that requires overclocking... specifically SLI and CrossFire.

Of course, SLI seems to not be affected by this sort of issue so quickly... so that points a finger at ATI drivers... which we kinda looked at in the 5850 CrossFire thread.

Of course, I've already said all of this, as currently, that's all there is to say. SLI scaling is better than CrossFire scaling, and it seems a big part of that is CPU bottlenecks... not real bottlenecks, but ones created by how the AMD driver currently works.

And because it looks to be the drivers, I think it's even less important... as those drivers could change at any time... that's up to AMD. However, without knowing about the issue, it's gonna be hard for AMD to fix it... so in the end, my goal is to see if there is any real merit to this, and if there is, to put it in the public's eye, in hopes that it can be rectified.

And to you, Robert, clearly I see some of the same sort of behavior you describe... and I'm currently looking further into this. Who knows... in the end, I could be wrong... and that's fine... but no one will truly know until the numbers speak for themselves.


----------



## Robert-The-Rambler (Sep 7, 2010)

*But why does CrossFire kick ass on the i7 platform?*



cadaveca said:


> I did. And as erocker says, it's the IMC speed (CPU-NB) that they are testing, not HTT. I get very minimal gains from HTT clocking... there are some, but CPU-NB gains are far more drastic.
> 
> 
> Now, again, increasing the CPU-NB increases L3 speed, and Intel's L3 speed, as well as memory bandwidth, are both higher than AMD's, and when I consider that Intel doesn't run into this problem as quickly, it's easy to just say it's those things that make the difference.
> ...



Don't you think the performance issues could be hardware limitations? My two 4890s are crazy even at 2560x1600 in most games with an i7 920 at stock clocks on the accelerator. For example, part of what inspired this thread is that they simply stayed stuck at 60 FPS at max detail in Dragon Age Origins. Now, unfortunately, I could not read much of the text. Screw BioWare and EA for not handling the 2560x1600 resolution properly; you literally need a magnifying glass to read the text.

But again, I want to say that I am looking for the other games that hate AMD and simply will not perform "perfectly" at a solid 60 FPS with Vsync on.


----------



## Reventon (Sep 7, 2010)

A 1090T is also $600 cheaper than an i7-980X. Just sayin.


----------



## erocker (Sep 7, 2010)

Robert-The-Rambler said:


> But again, I want to say that I am looking for the other games that hate AMD and simply will not perform "perfectly" at a solid 60 FPS with Vsync on.



I don't know of any. Are you referring to an AMD CPU at stock speed or overclocked?


----------



## CDdude55 (Sep 7, 2010)

Reventon said:


> A 1090T is also $600 cheaper than an i7-980X. Just sayin.



And it's also significantly slower than a 980X. Just sayin.


----------



## Robert-The-Rambler (Sep 7, 2010)

*I don't think I ever enabled it*



suraswami said:


> Just curious, did you try disabling Cool 'n' Quiet and running all the chips at full speed?  If the frames drop drastically that might be the culprit.
> 
> Also, if you can, put those 4890s in CF on the AMD 920 machine and the 4850s in CF on the i7 machine, and see if the issue is still there.



I only enabled Cool 'n' Quiet on the X3s to get them to run cool at idle, since they are used mostly for surfing the web. My little ASRock motherboard even tells you how many watts you are consuming at idle; I think it was 5 or some ridiculous number.

Unfortunately I can't move any parts around. I did that so often in the past I am burned out on doing that any more. A recent nasty injury that required stitches has me apprehensive about doing much handy work inside my PCs.


----------



## Reventon (Sep 7, 2010)

CDdude55 said:


> And it's also significantly slower than a 980X. Just sayin.






> Operating Frequency
> 3.2GHz
> 
> Hyper Transports
> ...



It's still a good processor. Just sayin.


----------



## Robert-The-Rambler (Sep 7, 2010)

*I mean stock*



erocker said:


> I don't know of any. Are you referring to an AMD CPU at stock speed or overclocked?



If you have to overclock, then something is amiss. Overclocking is supposed to be for fun, not a necessity!

Anyhow I am going to try some games out and report back.

I will be using a rig with these specs.

- Windows Vista 64-bit Home Premium
- MSI K9A2 Platinum
- 4 GB of DDR2-800 RAM
- 3 Radeon 4850 1 GB cards in triple CrossFire
- 1 Asus Xonar DS
- 2 500 GB hard drives (not in RAID)
- 2 DVD-ROM drives and 1 DVD burner

I'll see if I can illustrate this with FRAPS.


----------



## CDdude55 (Sep 7, 2010)

Reventon said:


> It's still a good processor. Just sayin.



I agree. Just sayin.


----------



## crazyeyesreaper (Sep 7, 2010)

okay first things first

scaling with Tri Fire on 4850s on an older motherboard in Vista isnt going to be stellar trifire dosent scale well unless your pushing your CPU to the limits in order to feed those gpus and thats even with older drivers. before they changed the way they did things.

Also, anything that's not a Phenom II (i.e. the Athlon II line) has no L3 cache. That's about a 10% hit on average, no matter what, when equal cores are compared at the same clocks.

But yeah, with 3 GPUs, overclocking is a necessity.

Oh, and as far as GPU usage etc., I think cadaveca is onto something here, i.e. how the ATI driver uses CPU time. I've always noticed that at lower resolutions Nvidia's GPUs can walk all over the ATI cards depending on the game, but as the resolution goes up they even out. I think now I have a reason for why that is.


----------



## TheMailMan78 (Sep 7, 2010)

cadaveca said:


> I am still testing, but it seems the i7 overclocked still beats out AMD. i7 costs more, so to me, it's no big deal...you get what you pay for.
> 
> 
> 
> ...



Oh, I'm trolling? I'm not the one that came on to this forum and claimed to have been the creator of the TWKR chip. You walked right out of Mordor with that bullshit.

Second, I get 60+ FPS in ALL of my games with my current rig and... (gasp) it's all AMD?!

If you are not getting that with a 5850 or a 5870, then you have a driver issue or something is wrong with your rig. That's the bottom line. It has NOTHING to do with the CPU itself.

Here I built you a sand castle in honor of your BS. I call it Mt. Cadaveca.


----------



## cadaveca (Sep 7, 2010)

TheMailMan78 said:


> Oh I'm trolling? I'm not the one that came on to this forum and claimed to have been the creator of the TWKR chip. You walk right of Mordor with that bullshit.




That was a joke, MM. I guess you've never seen the Windows "It was MY idea" commercials?








TheMailMan78 said:


> If you are not with a 5850 or a 5870 then you have a driver issue or something is wrong with your rig. Thats the bottom line. It has NOTHING to do with the CPU itself.



Um, pay attention, dude... I'm saying it's all about the driver. Post reported for trolling and flamebaiting.



erocker said:


> That's what cadaveca is trying to point out. The bottleneck with ATi's latest drivers is the drivers! Crazy stuff, but *through some little testing I did, it indeed has merit*... plus it's disappointing.


----------



## TheMailMan78 (Sep 7, 2010)

cadaveca said:


> That was a joke, MM. I guess you've never seen the Windows "It was MY idea" commercials?
> 
> 
> 
> ...



Be sure to include the screen I posted showing a solid 60 FPS in your complaint.


----------



## cadaveca (Sep 7, 2010)

Just because you don't have any issues doesn't mean there isn't one. You clearly haven't read the thread, as this issue only presents itself @ stock CPU, with more than 1600 shaders to push. Again, you are trolling, trying to start a fight here, and I'm not falling for it.

Post yet again reported.


----------



## TheMailMan78 (Sep 7, 2010)

cadaveca said:


> Just because you don't have any issues doesn't mean there isn't one. You clearly haven't read the thread, as this issue only presents itself @ stock CPU. Again, you are trolling, trying to start a fight here, and I'm not falling for it.
> 
> Post yet again reported.



Mine is at stock. Explain that.


----------



## cadaveca (Sep 7, 2010)

EDIT:

It doesn't affect all apps. Again, clearly you aren't reading the thread, and are harassing me. Add a second card, and performance will tank.

CrossFire scaling is broken... due to driver inefficiencies. This behavior is only noticed when trying to push more than 1600 shaders. The driver seems to have issues keeping all shaders at work, and this can be rectified by increasing the CPU-NB clock. This may be due to a CPU bottleneck, or a CPU bottleneck created by the driver... currently there's no way to tell exactly.


Not too sure why I'm explaining this again, but there you go.


----------



## TheMailMan78 (Sep 7, 2010)

cadaveca said:


> EDIT:
> 
> It doesn't affect all apps. Again, clearly you aren't reading the thread, and are harassing me. Add a second card, and performance will tank.



No one is harassing you. If you can't take criticism then I'm sorry. However, I'll be happy to run a test on any game I currently own.


----------



## cadaveca (Sep 7, 2010)

It's harassment, as this info is already in the thread, and I'm repeating myself. Take the time to read the thread and you'd see that.

Do you have more than one 5-series card? If you do not, then no testing you do can help. Again, this info is already here.

Meet the specs that cause the issue, and sure, I'd love the help. But none of your "criticism" has anything to do with making the problem happen, so yes, you are trolling and harassing. erocker did some quick testing, as did another user in another thread, and there seems to be something here... what's going on, we don't know yet. But we can easily replicate the problem.

It takes two 5-series cards and a stock AMD CPU. Bench with one card. Then bench with two, then three, and finally, four.

Then increase only the CPU-NB clock and bench again... single card, dual card, triple card... quadfire. Monitor and graph CPU usage, GPU usage, and FPS. Also get the microstutter measurement app and run that as well.

Rinse and repeat for every 3D app out there.
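
The microstutter measurement mentioned above boils down to frame-pacing variance: with alternate-frame rendering, CrossFire can deliver frames in uneven pairs even at a healthy average FPS. A minimal sketch of such a metric (the formula and names here are illustrative, not whatever the actual measurement app computes):

```python
from statistics import mean

def microstutter_index(frametimes_ms):
    """Mean absolute frame-to-frame time change, normalized by the
    mean frametime. 0.0 means perfectly even pacing; higher values
    mean the same average FPS 'feels' choppier."""
    if len(frametimes_ms) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    return mean(deltas) / mean(frametimes_ms)

# Even ~60 FPS pacing vs. alternating 10/23 ms frames (classic AFR stutter):
smooth = [16.7] * 100
stuttery = [10.0, 23.4] * 50
```

Both example traces average roughly 60 FPS, but the alternating one scores far higher, which is why graphing frametimes, and not just FPS, matters for the multi-card runs.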


----------



## TheMailMan78 (Sep 7, 2010)

cadaveca said:


> It's harrasment, as this info is already in the thread, and I'm repeating myself. Take the time to read the thread, and you'd see that.
> 
> Do you have more than one 5-series card? If you do not, then no testing you do can help. Again, this info is already here.



It's a driver limitation, obviously, because it didn't exist before. Anyway, that wasn't the OP's question, and that was not your initial answer.

He asked why i7s were so much faster than AMD CPUs in gaming, and your answer was NOT "oh, because you are trying to crossfire, and they have a driver issue with scaling and AMD CPUs"

When someone posted proof from elsewhere that you were wrong, THEN you brought up drivers.


----------



## Robert-The-Rambler (Sep 7, 2010)

*Just ran Ghostbusters The History Museum*

The performance was much better than I had seen in the past (I had given up, it was so bad). Albeit it is only one level, and the worst performance was in the level with Stay Puft. I ran the game at max detail at 2560 x 1600, and while not a perfect 60 FPS, it did not drop as horribly as I had noticed in the past. Frankly it was enjoyable, and I could actually aim the damn positron collider without going cockeyed when the frames skipped all over the place. I'll have to see how the Stay Puft level works, but surely the History Museum is my favorite level.

I have to learn how to capture a video and post it on Youtube. That would make it easiest to see the Fraps counter and get an idea just what the hell I'm talking about.

Again, system settings are: Phenom II 920 at stock with 3 Radeon 4850 1 gig cards in triple CrossFire, Vista 64-bit, 4 gigs of DDR2 800 RAM.


----------



## cadaveca (Sep 7, 2010)

TheMailMan78 said:


> It's a driver limitation, obviously, because it didn't exist before. Anyway, that wasn't the OP's question, and that was not your initial answer.
> 
> He asked why i7s were so much faster than AMD CPUs in gaming, and your answer was NOT "oh, because you are trying to crossfire, and they have a driver issue with scaling and AMD CPUs"
> 
> When someone posted proof from elsewhere that you were wrong, THEN you brought up drivers.



Actually, I clarified my answer to point out that the CPU limitation is made evident by the driver... hence the link to the thread (post #5). Twist it how you will; even erocker knew what I was talking about, as he took part in the conversation that led to that conclusion (see the link in post #5). i7 doesn't run into this driver problem as quickly, and as such, i7 is the better gaming CPU. I didn't change my story, I merely explained how I came to that conclusion.



Robert-The-Rambler said:


> I have to learn how to capture a video and post it on Youtube. That would make it easiest to see the Fraps counter and get an idea just what the hell I'm talking about.


You should be able to get Fraps to output a graph, and that would be just as useful.
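
For anyone who doesn't want to capture video, Fraps' benchmark logging can write a frametimes CSV: a header row, then one cumulative-milliseconds timestamp per frame. A rough sketch of turning that into a per-second FPS series you could graph (the exact CSV layout is assumed here, so check your own log's header first):

```python
def fps_per_second(lines):
    """Given the lines of a Fraps-style frametimes log
    (header row, then 'frame_no, cumulative_ms' per line),
    count the frames rendered in each whole second of the run."""
    times = [float(line.split(",")[1]) for line in lines[1:] if line.strip()]
    buckets = {}
    for t in times:
        sec = int(t // 1000)                 # which second this frame landed in
        buckets[sec] = buckets.get(sec, 0) + 1
    return [buckets.get(s, 0) for s in range(max(buckets) + 1)]
```

Feeding the result to any plotting tool would show the same dips Robert describes, without needing a YouTube upload.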


----------



## LAN_deRf_HA (Sep 7, 2010)

Slightly related... I recall there was an odd driver issue not long ago that had Nvidia cards getting a boost when paired with an AMD CPU. I don't remember if it was 10% or 10 FPS, or if it was all configs or just SLI/CrossFire. Does anyone know if that issue was sorted out or equalized? I think it had to do directly with driver use of multiple CPU cores.


----------



## erocker (Sep 7, 2010)

I'll be adding to this post...

965BE @ 4GHz, RAM at 1400MHz CAS 6, and close to 2800MHz CPU/NB. Using a single 5850 @ 900/1000.

Dragon Age Origins "Final Battle": I get above 60 FPS (60-120) most of the time. In the part where you are walking towards a gate and there's a good 100 soldiers cheering you on as you go, my frames drop to about 45. 100 soldiers is very CPU dependent.

Installing Ghostbusters now... then I'll run both at stock settings.


----------



## Steevo (Sep 7, 2010)

One of the beauties of the internet is that anyone can share their "experiences"; some choose to present them as more than an opinion, stating them as fact without performing the standard "double blind" testing and submitting it for "peer reviewed" justification.


Have you performed the same tests with multiple drivers on clean installations of windows?
Have you tried multiple cards of the same type?
Have you tried multiple systems with each one in the same configuration?
Have you tried multiple vendors of GPU, CPU, and motherboard?
Have you submitted it to peer review for them to try the same settings and standardize your test scenarios?


Unless you fulfill all the above criteria, it is only your opinion. Unfortunately, testing the same card multiple times still leaves a single data point that might be the issue. A weak VRM could cause stuttering, a memory chip might cause retransmits of data, a damaged or bad trace might have more resistance than others. There are hundreds of factors that can cause repeatable issues with one card, making your experience and test results very different from TMM's, or mine.
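
The single-data-point warning above can be made concrete: with a handful of repeated runs per configuration, even a crude noise check can tell a real effect (say, raising CPU-NB) from run-to-run jitter. A sketch with made-up numbers and an illustrative threshold, not anyone's actual results:

```python
from statistics import mean, stdev

def differs_beyond_noise(runs_a, runs_b, k=2.0):
    """True if two sets of repeated benchmark runs (avg FPS per run)
    differ by more than k times their combined standard deviation,
    i.e. by more than plain run-to-run noise."""
    gap = abs(mean(runs_a) - mean(runs_b))
    noise = stdev(runs_a) + stdev(runs_b)
    return gap > k * noise

stock_nb  = [48.1, 47.6, 48.4]   # hypothetical runs at stock CPU-NB
raised_nb = [59.8, 60.2, 59.5]   # hypothetical runs at raised CPU-NB
```

A gap that survives a check like this on two independently built rigs is worth reporting; one FRAPS reading on one machine is not.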


----------



## cadaveca (Sep 7, 2010)

Steevo said:


> Have you performed the same tests with multiple drivers on clean installations of windows?



Yes, I did a fresh install in the other thread where this issue was brought up, testing all drivers since AMD made the change to separate profiles.



> Have you tried multiple cards of the same type?



Yes, I have 5x 5870.



> Have you tried multiple systems with each one in the same configuration?



Yes, I perform overclocked tests on one system, and stock on another... each is exactly the same, except the stock rig has a different PSU. But I can swap in the same PSU too.



> Have you tried multiple vendors of GPU, CPU, and motherboard?



One other member here, as well as erocker, has done/is doing the same tests.



> Have you submitted it to peer review for them to try the same settings and standardize your test scenarios?



We are at this point now. I'm generating the data, I will post, and then you all can make sure I've got it right.

I actually want to thank you for this awesome post, as this is the path I am headed down, but it takes time for it all to happen.




At this point, I've made my theory known, and we have some preliminary data that seems to warrant further investigation (see erocker working to replicate the problem).


Baby steps..as I've said, it's too early to make a final judgement, so your post is 100% spot on.


----------



## CDdude55 (Sep 7, 2010)

Steevo said:


> One of the beauties of the internet is that anyone can share their "experiences", some choose to do so as more than an opinion, and state such as a fact without performing the standard "double blind" and submitting it for "peer reviewed" justification.
> 
> 
> Have you performed the same tests with multiple drivers on clean installations of windows?
> ...



Researching is another option; not everyone has multiple 5-series cards and multiple CPUs to bench. It's about looking in the right places and then forming a correct opinion based on those actual facts. (Though no opinion is "incorrect", there are still factual opinions and non-factual ones.)


----------



## phanbuey (Sep 7, 2010)

I've been saying this for a while about the 1090T. AMD does suffer a performance hit when it comes to gaming, even vs. the i5 750.


----------



## erocker (Sep 7, 2010)

People, cool your jets... post what you need to post, but do not harass one another. I'd rather participate in this thread than moderate it right now. So if you do make me have to moderate it, justice will be swift and most likely harsh. Be nice. Now... back to the topic.


----------



## TheMailMan78 (Sep 7, 2010)

Fine, I'll play along.

One thing you guys need to check as you are doing this is to sync your Catalyst settings. What I mean by that is: make damn sure you are running the same settings. I'll post screens for my suggestion, as it's the best I can do since I do not have multiple 5-series cards.


----------



## Solaris17 (Sep 7, 2010)

TheMailMan78 said:


> My mind cannot process that level of bullshit.





Darknova said:


> I'm in total agreement with TheMailMan78. Considering the fact that I own a rather old Phenom II (first tri-core) and I can still play ALL games I throw at it smoothly (ok, maybe not the solid 60 the OP was after, but still smooth) I call bullshit as well.
> 
> To make such a broad sweeping statement as you just did requires you to prove that Phenom II or any AMD chips can't handle games at all, which I'm sure most AMD users on this forum can attest otherwise.
> 
> (oh, and up until recently, I haven't been overclocked)



While you guys call BS, I'm going to go enjoy my constant 60+ FPS.

But an interesting question I have: is this the same with all of Intel's iX chips, or just the i7s?


----------



## cadaveca (Sep 7, 2010)

phanbuey said:


> I've been saying this for a while about the 1090T. AMD does suffer a performance hit when it comes to gaming, even vs. the i5 750.



Well, even this is questionable at this point, as all the data I see that reflects this was gathered almost a year ago, before AMD made the big driver change.

After a year, if drivers didn't affect performance in one way or another, then something is wrong with the driver team.

So we have to ignore all previous testing, and start over with fresh data. Fortunately I have the time and resources to make it happen, but it's not going to happen overnight.


Maybe the 1090T is better now... even mobo BIOSes have changed a bit (AGESA code), so basic CPU performance may not be what it was either. Hence Steevo's post being important... I'm not approaching this with any bias towards any company. I just want to find out what's going on.


----------



## TheMailMan78 (Sep 7, 2010)

Run these.


----------



## erocker (Sep 7, 2010)

TheMailMan78 said:


> Run these.
> 
> http://img.techpowerup.org/100907/AA 2.jpg
> http://img.techpowerup.org/100907/AAMode 3.jpg
> ...



Looks like default settings to me. That's what I run.


----------



## cadaveca (Sep 7, 2010)

You can always just click "restore Factory Defaults", too.


----------



## TheMailMan78 (Sep 7, 2010)

erocker said:


> Looks like default settings to me. That's what I run.



They are not. All of those were adjusted with the exception of the A.I. setting. I leave that at default.



cadaveca said:


> You can always just click "restore Factory Defaults", too.



And you will end up killing performance.


----------



## cadaveca (Sep 7, 2010)

TheMailMan78 said:


> They are not. All of those were adjusted with the exception of the AI. I leave that at default.



So why would we use those settings? I'm open to any ideas, if you can justify them. How would restoring factory defaults ruin performance? I'm curious now...


----------



## TheMailMan78 (Sep 7, 2010)

cadaveca said:


> So why would we use those settings?



Because it leaves the option up to the application and not the driver. AA, AF, all that good stuff will be up to the app instead of being "forced" by the Catalyst.



cadaveca said:


> So why would we use those settings? I'm open to any ideas, if you can justify them. how would restoring factory defaults ruin performance? I'm curious now...



Because for years ATI, in all their wisdom, has liked to use brute force in a lot of cases that don't need it.


----------



## cadaveca (Sep 7, 2010)

TheMailMan78 said:


> Because it leaves the option up to the application and not the driver. AA, AF, all that good stuff will be up to the app instead of being "forced" by the Catalyst.



Well, the whole point is to compare each app on its own, as each will require different work from the driver.

Really, my point is to not approach this as an enthusiast, but as a normal consumer... so a fresh OS with a freshly installed driver and no modification of anything is the norm, not customized settings.



TheMailMan78 said:


> Because for years ATI in all their wisdom likes to use brute force in a lot of cases that do not need it.



Exactly. So custom settings that override this problem don't do anyone any favors. We want to expose the faults in AMD's driver, not cover them up.


----------



## erocker (Sep 7, 2010)

TheMailMan78 said:


> They are not. All of those were adjusted with the exception of the AI. I leave that at default.



No, I just checked. Those are default settings.


----------



## Robert-The-Rambler (Sep 7, 2010)

*Ghostbusters Still Runs Like Crap*

Try the Times Square level and you will be mortified at how badly AMD does there. Stock Phenom II 920, triple 4850 1 gig, 4 gigs of DDR2 RAM @ 2560 x 1600. It locks up, the frames drop from 60 down to 20 in an instant, and it is almost unplayable and certainly unenjoyable. It literally stops at times. That is my experience in Ghostbusters. Perhaps I will just move on to my Athlon II rig and test at 1080p. If that sucks the same, I'll try the 9850 with 2 1-gig 4850s.

I've been using Cat 10.5. Won't upgrade just yet since I had a bug where manual fan control wouldn't work after upgrading drivers. A 4850 X2 without manual fan control is a jet turbine under load.

P.S. The Athlon II 635 sucks in Ghostbusters, too.


----------



## TheMailMan78 (Sep 7, 2010)

erocker said:


> No, I just checked. Those are default settings.



No they are not man!


----------



## erocker (Sep 7, 2010)

TheMailMan78 said:


> No they are not man!



Ok then, I'm using _your_ settings even though for me, they are set at default. Tell me what isn't default?


----------



## cadaveca (Sep 7, 2010)

Actually, he's got an AA setting changed: it's set to 8x, then set to application default.

And I think running anything other than default settings is inaccurate.


----------



## erocker (Sep 7, 2010)

cadaveca said:


> actually, he's got an AA setting changed.



Multi Sample AA = Default

Use Application settings for the AA setting = Default


----------



## TheMailMan78 (Sep 7, 2010)

erocker said:


> Multi Sample AA = Default
> 
> Use Application settings for the AA setting = Default



Anything that says "Use Application settings" by default is off. I turn that on.


----------



## cadaveca (Sep 7, 2010)

His:








Mine:






Not saying it will make any difference, but he seems to want to be pleased here (not sure why we need to do it his way either).


----------



## TheMailMan78 (Sep 7, 2010)

Don't tell me we now have a bug where people get different default settings. I SWEAR I'll go green.


----------



## cadaveca (Sep 7, 2010)

TheMailMan78 said:


> Don't tell me we now have a bug where people get different default settings. I SWEAR I'll go green.



Oh yeah, believe it. Yet another issue I have... CCC uninstall doesn't remove old settings. Probably 90% of reported issues are due to this, and have nothing to do with user error. It may show default settings, but maybe the registry shows different. Overdrive and AA are affected most often, it seems. It also keeps the desktop arrangement (Eyefinity profile, desktop resolution, etc.).



Also, having the lock in the Overdrive panel unlocked changes things as well... I have not touched it.


----------



## TheMailMan78 (Sep 7, 2010)

cadaveca said:


> His:
> 
> http://img.techpowerup.org/100907/AA 2.jpg
> 
> ...



You don't need to do it my way at all. I was trying to give you a controlled environment for your experiment. I was trying to help. But in light of the fact that we all have different fucking default settings, screw it!


----------



## cadaveca (Sep 7, 2010)

Well, to me defaults HAVE to be used... we really need to eliminate every variable possible.

But your settings up there are not default, so that seems not the right approach. I get what you are after, though, and it's a good thing to ensure, for sure. That's why I'm doing all this testing myself, and bought a complete second rig to do so.


----------



## erocker (Sep 7, 2010)

TheMailMan78 said:


> Anything that says "Use Application settings" by default is off. I turn that on.



I don't know, man... All of my settings match your settings and I'm at default... I guess I'll post up some pictures too.


























Why do you do this to me when I'm trying to quit smoking? F****** WHY?!!!! Whatever your next system is... I'm getting just the opposite!


----------



## cadaveca (Sep 7, 2010)

erocker said:


> Why do you do this to me when I'm trying to quit smoking? F****** WHY?!!!!





You and me both...funny...I think we're doing pretty good, considering.


Deep breathing really seems to help me get over the craving, BTW.


----------



## erocker (Sep 7, 2010)

TheMailMan78 said:


> Anything that says "Use Application settings" by default is off. I turn that on.



Your sh*t is messed up then. I rest my case. Have fun with Nvidia.



cadaveca said:


> You and me both...funny...I think we're doing pretty good, considering.
> 
> 
> Deep breathing really seems to help me get over the craving, BTW.



I'm on the patch and about fifty cases of gum a day!


----------



## AphexDreamer (Sep 7, 2010)

My AMD CPU gets me a constant 60 FPS in all my games with Vsync on, with no microstutter. The only games where I don't get a constant 60 but see a constant 30 are Crysis and Crysis Warhead.

Of course my CPU is overclocked, but that's a given with most AMD CPUs (Phenom II): you can either unlock a core, overclock it by at least 500MHz or more, or both.

Anyone suggesting AMD can't do 60 FPS must still be on an Athlon, or hasn't heard of overclocking yet.


----------



## TheMailMan78 (Sep 8, 2010)

cadaveca said:


> Well, to me defaults HAVE to be used..we really need to eliminate every variable possible.
> 
> But your settings up there are not default, so seems not the right approach. I get what you are after, though, and it's a good thing to ensure, for sure.



Honestly, given the fact you all have different settings in your Catalyst, I have no idea how you are going to have a real experiment. You do not have a controlled environment. This is something W1zz should tackle, as he can provide both environments in the same setting.

For instance, the other day I sent a PM to erocker about his FPS in LRD2. I was getting 40+ more frames per second than him in the opening scene, and he's running dual GPUs. After further investigation we discovered he had one small setting different from mine (AF), and from that he got a HUGE jump in FPS.

And that's my point. If you guys want to do testing, then you have to have a stable environment.


----------



## cadaveca (Sep 8, 2010)

erocker said:


> I'm on the patch and about fifty cases of gum a day!



Cold turkey, here. I get super bitchy though, such that each time I try to quit, my wife buys me more. 

AphexDreamer, the point is that if you overclock, you have no warranty. We know it's required to overclock to get good performance... but it shouldn't be.




TheMailMan78 said:


> *Honestly, given the fact you all have different settings in your Catalyst,* I have no idea how you are going to have a real experiment. You do not have a controlled environment. This is something W1zz should tackle, as he can provide both environments in the same setting.




MM, that's why I have two exact-same rigs, and am really undertaking the testing myself. I will not be posting results for 300+ games in THIS thread... I will try to present them in a palatable manner.

*Also, the only person with whacked defaults is YOU... and you even said they are NOT default, so I'm gonna call you a troll again*. Sorry dude, but make up your mind whether they are default or NOT.

Once I have some results done, I will get people to verify, then continue.


----------



## erocker (Sep 8, 2010)

TheMailMan78 said:


> Honestly, given the fact you all have different settings in your Catalyst, I have no idea how you are going to have a real experiment. You do not have a controlled environment. This is something W1zz should tackle, as he can provide both environments in the same setting.
> 
> For instance, the other day I sent a PM to erocker about his FPS in LRD2. I was getting 40+ more frames per second than him in the opening scene, and he's running dual GPUs. After further investigation we discovered he had one small setting different from mine (AF), and from that he got a HUGE jump in FPS.
> 
> And that's my point. If you guys want to do testing, then you have to have a stable environment.



You jerk! I screwed up and accidentally had an in-game setting set to 8 instead of 16. I'm sorry!!!! I thought you had better FPS than me though... Either way, I changed it and the results were similar to yours.


----------



## AphexDreamer (Sep 8, 2010)

cadaveca said:


> AphexDreamer, the point is that if you overclock, you have no warranty. We know it's required to overclock to get good performance... but it shouldn't be.



Sure, it voids the warranty, but I've never had to use it. I've overclocked several AMD CPUs since the Athlon 64 (and up) and never had one go bad to this day. I guess ultimately the only variable that matters is you.

Overclocking is becoming a standard that people will come to expect, and although it's nice to have stock performance be great, my point is I don't need it to be.


----------



## cadaveca (Sep 8, 2010)

Yeah, you do have a point; however, there are many people with environments like mine that prevent any overclocking.

More so, because it seems the AMD driver team is working on overclocked systems, they are optimizing for overclocked systems rather than stock. If they made those gains at stock, the performance would be BETTER when overclocked.

Overclocking is NOT a standard. It may be perceived as "normal" , but that's something far different.


----------



## TheMailMan78 (Sep 8, 2010)

cadaveca said:


> Cold turkey, here. I get super bitchy though, such that each time I try to quit, my wife buys me more.
> 
> Aphex Dreamer, the point is that if you overclock, you have no warranty. We know it's required to overclock to get good performance...but it shouldn't be.
> 
> ...



Call me a troll all you want. Call my settings whacked out all you want. Bottom line is I get better FPS than you with a stock CPU. So who's the screw-up here?

Anyway, did it ever occur to you that my defaults might be different because I am running a single GPU and not a multi-GPU setup? Mr. TWKR.


----------



## saikamaldoss (Sep 8, 2010)

I always wonder why people compare a Phenom II with an i7... hmmm, the memory channel count is not the same. i7 has 3 and Phenom has only 2.

So on an i7 more data can be sent and received.

Phenom should be compared with the QX, the last-gen quads, not the i7, hmmm.

The competition for i7 will not be released by AMD until the next AMD processor, Bulldozer, with 4-channel memory.


----------



## MohawkAngel (Sep 8, 2010)

TheMailMan78 said:


> Don't tell me we now have a bug where people get different default settings. I SWEAR Ill go green.



You're already a weed smoker anyway!


----------



## EastCoasthandle (Sep 8, 2010)

Hey do me a favor:
Download a program called RamMap. This is a nifty program that allows you to empty the System Working Set (not something you use all the time) and the Modified page list. Give it at least 1 minute and try Ghostbusters again. No, the availability of memory in and of itself is not a problem here. No, this is not an implication that you don't have enough memory. Just try this for giggles to see if things improve or not.


----------



## erocker (Sep 8, 2010)

cadaveca said:


> *Also, the only person with whacked defaults is YOU..and you even said they are NOT default, so I'm gonna call you a troll again*. Sry dude, but make up your mind whether they are default or NOT..





TheMailMan78 said:


> Call me a troll all you want. Call my setting whacked out all you want. Bottom line is I get better FPS then you with a stock CPU. So who's the screw up here?
> 
> Anyway did it ever occur to you that my defaults might be different because I am running a single GPU and not a multi setup? Mr. TWKR.



Don't make me shut this thread down before I get my Ghostbusters results. Then I would have filled up my HDD and spent precious download time all for nothing. Then I'll get angry. An angry mod having a nic fit with his finger on the infraction button is not cool. So quit it! Ok, at 92%, almost done...


----------



## cadaveca (Sep 8, 2010)

Sorry, e, but he claimed I was contradicting myself, and then proceeded to do the same himself. Pot calling the kettle black, and he's derailed this a page already, with nothing to really add, except to try to get me going. I'll not fall for it though. Infract away if you wish... it will give me more time to test.


----------



## crazyeyesreaper (Sep 8, 2010)

Either way, erocker, cadaveca: send me a list of games you have that you plan to test, and I'll compare my results later on with the games I own. I'm still using 10.4a, and I'm NOT using the newer CrossFire profiles, so we shall see if that has any impact as well.


----------



## cadaveca (Sep 8, 2010)

crazyeyesreaper said:


> Either way, erocker, cadaveca: send me a list of games you have that you plan to test, and I'll compare my results later on with the games I own. I'm still using 10.4a, and I'm NOT using the newer CrossFire profiles, so we shall see if that has any impact as well.



Take a look @ Steam for cadaveca and you can see my games list... I'll be going through all 200+ there, as well as games within my EA Downloader and D2D... too many to list.



Might be better to wait until I can post some results... what I really need is people to run the exact same settings, same drivers, everything... to validate this... that means same RAM timings, same BIOS, same driver, everything...

If you want to run through the things Robert mentioned, that'd be great... specifically looking for games that don't hit 60 FPS... and then seeing what can fix it.


----------



## crazyeyesreaper (Sep 8, 2010)

I'll take a look. I'm supposed to have a week to 2 weeks of free time coming up, so while I can't test now, I certainly can then, and I can send you the info via PM per game with my exact settings etc.

I'll do some testing eventually.

My system is stock for the time being, with RAM at 1333MHz 7-7-7-20 1T.

When I do start testing, I'll begin with Dragon Age and play through the beginning 3 times, in single card and CrossFire, at stock; then I'll try my mild OC settings of 3600/2400 and see what difference that makes. Think of my data as a footnote to get a bit more scope from.


----------



## Athlon2K15 (Sep 8, 2010)

So we are going to have a benchmark war? AWESOME, count me in!!!


----------



## TheMailMan78 (Sep 8, 2010)

cadaveca said:


> Sorry, e, but he claimed I was contradicting myself, and then proceeded to do the same himself. Pot calling the kettle black, and he's derailed this a page already, with nothing to really add, except to try to get me going. I'll not fall for it though. Infract away if you wish... it will give me more time to test.



I never said that, Denial Son. I said we ALL have different settings and that we need a controlled environment. Now I'm off to enjoy ALL OF MY GAMES AT 60+ FPS. Have fun with your experiment, Dr. TWKR.


----------



## cadaveca (Sep 8, 2010)

Robert-The-Rambler said:


> Try the Times Square level and you will be mortified at how bad AMD does there. Stock Phenom II 920, triple 4850 1GB, 4GB of DDR2 RAM @ 2560x1600. It locks up, the frames drop from 60 down to 20 in an instant, and it is almost unplayable and certainly unenjoyable. It literally stops at times. That is my experience in Ghostbusters. Perhaps I will just move on to my Athlon II rig and test at 1080p. If that sucks the same I'll try the 9850 with 2 1GB 4850s.
> 
> I've been using Cat 10.5. Won't upgrade just yet since I had a bug where manual fan control wouldn't work after upgrading drivers. A 4850 X2 without manual fan control is a jet turbine under load.




Think you could host a save file or two somewhere?


----------



## LAN_deRf_HA (Sep 8, 2010)

saikamaldoss said:


> I always wonder why people compare a Phenom II with an i7... hmmm, the memory channels are not the same.. the i7 has 3 and the Phenom has only 2
> 
> so in an i7 more data can be sent and reserved
> 
> ...



People compare what's priced equally; that's why I often compare an i5 750 to a 1055T. That's what's fair any way you cut it. 

After everyone sorts out standard settings I'd recommend retesting with slight setting changes. See if a performance hit exists for one person but not another then compare the setups.


----------



## cadaveca (Sep 8, 2010)

LAN_deRf_HA said:


> After everyone sorts out standard settings I'd recommend retesting with slight setting changes. See if a performance hit exists for one person but not another then compare the setups.





Exactly. Thank you so much for understanding.


That is exactly what AMD should be doing, when it comes to drivers. But it kinda just dawned on me a while ago...if they did that, things wouldn't be so cheap, now would they?

Now imagine having to do this, every month. It's a bloody huge task...


That's what had me testing at stock speeds in the first place, and having this idea...what standard can you really have, other than stock speeds?


Now, I understand if AMD were running tests and developing with overclocked systems...it gets the work done faster. It's a big part of why I'm so sure that's what they are doing...

I don't care who is faster, who costs more...all I know is that if this is done right, everyone will benefit from it.


----------



## Mussels (Sep 8, 2010)

lol at mailmans trolling.



cadaveca said:


> I am still testing, but it seems the i7 overclocked still beats out AMD. i7 costs more, so to me, it's no big deal...you get what you pay for.
> 
> 
> 
> ...





the problem here *as stated in the other thread* is that you're running Crossfire. That adds more CPU load into the mix.

If you had one GPU your performance would end up being better, due to having less CPU overhead on your older/weaker CPU. (Remember, me and mailman have 2 more cores than you, and a newer CPU design.)


I'm only up to page 3 here and I can already see what's happened. Cad knows he's CPU limited and would rather blame AMD than blame himself for having too slow a CPU to run his video cards.

Mailman is trying to say that he has zero issues whatsoever, even at a stock CPU - because he doesn't have Crossfire.

The rest was just trolling and poor communication skills.


(now to post this and read the rest of the thread)


----------



## Athlon2K15 (Sep 8, 2010)

The easiest fix for all these issues is to buy an Intel CPU.


----------



## cadaveca (Sep 8, 2010)

Mussels said:


> lol



Uh, basically. Except that the platform as I chose it is the launch platform for the 5-series cards, top to bottom, but everything else is 100% bang on, I think.







And we'll see about the extra two cores...as soon as ASUS launches the CH4E.



AthlonX2 said:


> The easiest fix for all these issues is to buy an Intel CPU.




I don't know that yet...but I'll test and find out...not gonna get bitten again.


----------



## LAN_deRf_HA (Sep 8, 2010)

Mussels said:


> lol at mailmans trolling.
> 
> 
> Are AMD Current CPUs Not Enough For Solid 60 FPS ...
> ...



Protip: Open up notepad and write your midway-through-thread reply so you don't forget the points in your mind currently, then finish reading before actually replying. Not saying it will apply here, just less likely to say something that doesn't fit doing it that way.


----------



## Mussels (Sep 8, 2010)

LAN_deRf_HA said:


> Protip: Open up notepad and right your midway-through-thread reply so you don't forget the points in your mind currently, then finish reading before actually replying. Not saying it will apply here, just less likely to say something that doesn't fit doing it that way.



My quote broke in that post; I fixed it and it makes more sense now.


----------



## 3volvedcombat (Sep 8, 2010)

From what I've seen, there's not a lot of games you can run maxed out at 60fps, even with an HD 5970.

You can't talk about FPS and hate on a processor when 85% of the damn work is done by the video card.

If I was a video card, I would be sad because no one acknowledges I exist.

Seriously guys.

6-core Thubans are the same thing as Phenom IIs, just with 2 extra cores, people, so don't think it's going to MAGICALLY raise performance :shadedshu

A Phenom II can get 60fps solid; just clock it to 4.0GHz, sit back, and use a real video card. 

A Phenom I sucks balls. 

Any Athlon II or Athlon I, though I give it props, isn't ready for 60 fps. 

Of course an i7 is going to do great. It's got a silly architecture, it's going to clock nicely, and it's got more cores to handle other tasks and threads, so it's the best of the best at 280 dollars a pop.
An i7 is more expensive, when you add up the RAM/mobo/processor.
AMD is less expensive, when you add up the RAM/mobo/processor.
I've seen Phenom IIs excel at 60+fps. 
Don't complain if you use antivirus, run sick Windows Vista :shadedshu, or load up Windows 7 with a bunch of programs, making it Vista :shadedshu, and use a crappy video card.
45-60 fps for me is fine. 
For me, on my 1440x900 resolution with this GTX 470 and 4.0GHz quad, I get 60+ FPS in Crysis, many other games, GTA 4 sometimes - any game that isn't a crappy port or really heavy on the power like Mafia II or Metro 2033.

But anything else I get 60 fps solid. I mean, at my settings.

****EDIT****

After reading your post: HD 4850s aren't as fast as 4890s, or a GTX 460 at an 800MHz core.

Triple 4850s could be a stuttering, unstable, CPU-BOTTLENECKED disaster, in PCI-E bandwidth and drivers. 

4850s on that AMD system could have some driver or stuttering issues, and the motherboard could be the cause.

Those i7 systems have high-end X58 motherboards from when they were released, and with 2 4890s or an overclocked GTX 460 in them, they're going to be faster. By a long shot.


----------



## LAN_deRf_HA (Sep 8, 2010)

Mussels said:


> my quote broke in that post, i fixed it and it makes more sense now.



Too bad it didn't break in that ^ post. Now my moronic spelling is immortalized (right/write)


----------



## Mussels (Sep 8, 2010)

3volvedcombat said:


> From what I've seen, there's not a lot of games you can run maxed out at 60fps, even with an HD 5970.
> 
> You can't talk about FPS and hate on a processor when 85% of the damn work is done by the video card
> 
> ...



you forgot turbo mode 

you also contradicted yourself in the part I didn't quote, about background tasks. You think those extra two cores won't help with antivirus and such trying to interfere?


----------



## 3volvedcombat (Sep 8, 2010)

Mussels said:


> you forgot turbo mode
> 
> you also contradicted yourself in the part I didn't quote, about background tasks. You think those extra two cores won't help with antivirus and such trying to interfere?



That's true. Depends how greedy the antivirus CPU-core thief can be.


----------



## claylomax (Sep 8, 2010)

Found this: http://www.tomshardware.co.uk/phenom-versus-i7,review-31630.html


----------



## cadaveca (Sep 8, 2010)

claylomax said:


> Found this: http://www.tomshardware.co.uk/phenom-versus-i7,review-31630.html



Dated July 13, 2009...over a year old. Doesn't apply with today's drivers.


I'm kinda highlighting this problem for a reason...you aren't going to find this info, or anything that relates, unless it was published...say...after April 2010.


----------



## claylomax (Sep 8, 2010)

Just as well . . .


----------



## cadaveca (Sep 8, 2010)

claylomax said:


> Just as well . . .



Still an interesting article. I have other issues with it, such as them using different VGAs in each system, and only a 790X chipset in the AMD system, but they mention these points as well.

In the end, it seems that there are a few caveats for getting 60FPS outta your CPU, and the VGA(s) you use seem to play a very large role in determining what CPU is right for the performance you desire.

My problem relates to not being able to buy a processor with the necessary speeds required to push my VGAs. As Mussels mentioned, it's quite clear that in my case, limitations come into play that affect my overall system performance that might not be present in a different config...however, because I bought the best available at the time these cards were launched, this really highlights, to me, a failure in the launch of both Eyefinity and the 58xx-series VGAs, when used in CrossfireX.

Of course, the easy way to fix this is to overclock...however, this whole situation may highlight where the 6-series is headed, as to me, these "obvious" faults are things that they would look to address with successive generations.

The details we have now about these upcoming cards seem to reflect my thoughts here, but as they can only be accepted as rumour at this point, we can't really make any conclusions about that just yet.

But, if these rumours are true, then I must stand my ground on this position, and call any computing solution featuring this shader design, when grouped together in configs larger than 1600 shaders, a failure. This is based purely on the fact that overclocking your CPU can overcome these limitations, but at the same time, voids your warranty.


That may sound very "doom and gloom", but really, we don't hear of CPUs dying very often, so clearly warranty concerns from overclocking aren't that big of an issue...but to me, they are an issue nonetheless.


----------



## CDdude55 (Sep 8, 2010)

So are you guys still keeping on the trench coats to play Sherlock Holmes or is this mystery issue solved?


----------



## cadaveca (Sep 8, 2010)

I don't think it's really "solved", and I think you should have been able to tell that my posts over the past 2 months have been leading up to exactly this...but I was waiting to generate a lot of data to prove my point, rather than having someone else ask the same question...this just simply sped up what I was looking at already.

Anyway, I think that this is very important both now and in the future...the high-end enthusiast really needs to pay attention to this, I think, and really, as it seems to be a fact (but not 100% yet), it also highlights why certain changes exist from the 5-series to the 6-series.

So, when the 6-series launches, it should prove fruitful to make a comparison of the 5-series and 6-series in exactly these situations. But if this issue had not been exposed prior to the launch of the 6-series, it may not have even been really mentioned, or even investigated.


----------



## crazyeyesreaper (Sep 8, 2010)

Agreed, and yes, I'm wearing my trench coat.


----------



## Loosenut (Sep 8, 2010)

*subbed
I'm very intrigued by Cadaveca's theory and hopefully soon-to-be-released results.

Good luck in your research Dave


----------



## cadaveca (Sep 8, 2010)

LOL...it's not going to be any time soon, FYI. I'm currently testing offline games, and coming up with a list of apps that are affected.

But I also have to look at a lot of maps within each game...so this makes it a lot more time consuming than just trying to get some FPS numbers.

Robert (the OP) highlighted this too, in Ghostbusters...his VGAs have enough grunt to push that game maxed out, but certain scenarios lead to really poor performance, simply due to the number of VGAs he has. (Of course, the investigation revolves around finding out exactly when these limitations are exposed, and it may just turn out to be bad game programming...and not the number of VGAs...that's the question that needs to be answered.)


----------



## CDdude55 (Sep 8, 2010)

cadaveca said:


> I don't think it's really "solved", and I think you should have been able to tell that my posts over the past 2 months have been leading up to exactly this..



Sorry for not keeping track of your posts throughout the past two months.

And thanks for saving the tech world as we know it with your hard-hitting research and grab-the-bull-by-the-balls attitude.


----------



## cadaveca (Sep 8, 2010)

Bleh..I don't think, really, that AMD has missed this. Nor do I think I'm doing anything special...I just think I have noticed something that many have not, and given the responses around these issues, I might be right there.

Honestly though, my mentioning all these problems I've had, and how long they have been around for without suitable fixes, to me, more highlights what we should expect from the future gen. It's only because the 6-series is coming that I'm trying to make this stuff widely known...because we can use these things to judge how good the 6-series is...without including the competition in the mix.

I do not consider AMD and NV to be competitors at this point...their primary focus is too different. 

I've mentioned before that I expect a "final hurrah" driver for the 5-series before the 6-series launches. So maybe a lot of the outstanding issues will be fixed...but maybe, as I think, a lot of these issues are purely hardware-based, and cannot be solved by driver...time will tell.

And if they ARE hardware-based, they better damn well get fixed with these new chips.

Eyefinity launched...but you need 2x5870 for decent performance. And 2x5870 needs an overclocked CPU...that's a problem to me. To me, Eyefinity has launched...but doesn't work. I can only consider it working once all of these issues have been addressed, and once it doesn't require you to void your warranty to use it. Until that happens, and while AMD sells and promotes tech that requires you to overclock, I have to agree with the OP and this thread title...that no AMD CPU is capable of pushing 60FPS.


----------



## EastCoasthandle (Sep 8, 2010)

I'm not sure what direction this thread has taken since the OP.  However, if you are using CF or SLI with an AMD processor and you believe that you are not experiencing the level of smoothness and frame rate that others are getting, there is one possible solution.  Other than making sure you use RamMap to empty "System Working Set" and "Modified Page List", one can also use RadeonPro to increase Flip Queue Size from its default of 3 to 5.  Depending on the game, that may yield some improvement.


----------



## Loosenut (Sep 8, 2010)

@ Cadaveca  The more I read, the more I'm in awe of the burden you are voluntarily shouldering. 

Lol, you should apply for a gov't. grant...


----------



## cadaveca (Sep 8, 2010)

EastCoasthandle said:


> ~snip~




Thanks again for that info..I'll check it out.


What troubles me about these fixes is that while I myself might be aware of them, I do not think most others are.

That said, needing to use outside software to get basic performance says either AMD, or Windows, sucks big time. And because I cannot personally prove which of the two is at fault here, I'm left ignoring fixes like this, as they are far too advanced for a normal user. Things like this should happen automatically, and be transparent to the end user. It's not us that needs this info...it's AMD.



Loosenut said:


> @ Cadaveca  The more I read, the more I'm in awe of the burden you are voluntarily shouldering.
> 
> Lol, you should apply for a gov't. grant...





I don't think this is such a big deal as you make it out to be. But thanks anyway...and really, I don't need money to do this...nor do I expect to get anything out of it.

This is just me geeking out on the stuff I love.


----------



## saikamaldoss (Sep 8, 2010)

LAN_deRf_HA said:


> People compare what's priced equally; that's why I often compare an i5 750 to a 1055T. That's what's fair any way you cut it.
> 
> After everyone sorts out standard settings I'd recommend retesting with slight setting changes. See if a performance hit exists for one person but not another then compare the setups.



Yeah, you're right if the price is the same.. but does the X4 Phenom II cost the same as an i7 920?.. I didn't know that... hmmm


----------



## saikamaldoss (Sep 8, 2010)

LAN_deRf_HA said:


> Protip: Open up notepad and write your midway-through-thread reply so you don't forget the points in your mind currently, then finish reading before actually replying. Not saying it will apply here, just less likely to say something that doesn't fit doing it that way.



The way you talk..  :shadedshu  You make people feel bad


----------



## erocker (Sep 8, 2010)

If people want to come up with proof or results for this topic, fine. Make a thread and post them. Otherwise this thread is going nowhere fast. Stay on topic.


----------



## LAN_deRf_HA (Sep 9, 2010)

saikamaldoss said:


> The way you talk..  :shadedshu  You make people feel bad



I thought it was nice practical advice, and the "not saying it will apply here" was meant to emphasize it was more general advice, less an attack on the poster. On topic: I don't think we have it together enough as a community to organize this properly.


----------



## Robert-The-Rambler (Sep 9, 2010)

*Thanks Erocker and a Clarification*



erocker said:


> If people want to come up with proof or results for this topic, fine. Make a thread and post them. Otherwise this thread is going nowhere fast. Stay on topic.



I just want to restate that what I am really looking for is a list of games where, say at 1080p, it is impossible for AMD processors on any platform, even the fastest current ones at stock everything, to simply maintain a solid/locked 60 frames per second, even when your single-GPU or multi-GPU setup is capable of that performance, and where at times performance is maybe even terrible. I want people to get an accurate idea of whether buying certain AMD products is worth it, and, if you want certain games, what to actually expect. Why?

Because so often benches are done that don't give a true picture of what is going to happen when you actually game. Resident Evil 5 is one where the bench is way more stressful than the game, especially the fixed one. What if there were some other way of testing real world performance without identical short-term demo runs? Maybe a reviewer should actually play entire levels, track that performance on a graph, and report variances in frame rates with Vsync on. It doesn't have to be identical runs to get clarity if the test is long enough. Who cares about max frames if you are getting valleys on different platforms that alter the gameplay experience?

So often reviews of new gear are done on overclocked i7s to remove bottlenecks. That is fine and all, but what about others with lesser gear? Perhaps reviews should be done on the best that AMD has at stock vs the best that Intel has at stock, and then, with the methodology I mention, track the performance, giving a very clear picture of what to expect when you actually play the game. We do buy this stuff to actually play games. Surely we have the time to actually play the games and report general performance in charts, utilizing programs such as FRAPS, to tell us that maybe Ghostbusters or whatever it may be might really suck on this type of system, whatever it may be. Or that on an i7 we saw almost a flatline through the entire Times Square level, making it a much better choice for this title, for example.

I don't care about benchmarks so much anymore. I want raw data that actually illustrates real world gaming conditions and gives the real story of how your system is going to play any particular game. I don't want to necessarily turn this into an epic ramble, but I've been pondering this for a while. Something keeps gnawing at my mind about standard deviations in data, what it all means, and how it relates to the variance from the mean/average. If we lock the FPS at 60 and track performance while we play a game for an extended period, the best systems are going to have the lowest standard deviation, or variance from the mean. We can track that by simply saving the results using a program such as FRAPS, can't we? That is what I want to see. Real games played by real people with real data that tells you, in a general sense, exactly what to expect when you plan on purchasing a game - not just a representation of the game, but the actual game.
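The standard-deviation idea above can be sketched in a few lines. This is a minimal sketch, not tied to FRAPS's actual file format: the FPS samples are just typed in by hand, standing in for the per-second readings a FRAPS log would give you.

```python
# Sketch: summarize a run of per-second FPS samples the way Robert
# describes - mean, spread (standard deviation), worst dip, and how
# often the run fell below the 60 FPS target.
import statistics

def summarize_fps(samples):
    """Return (mean, stdev, minimum, percent of samples below 60)."""
    mean = statistics.mean(samples)
    stdev = statistics.pstdev(samples)  # population standard deviation
    below_60 = sum(1 for s in samples if s < 60) / len(samples) * 100
    return mean, stdev, min(samples), below_60

# Hypothetical run: mostly locked at 60, with a couple of valleys.
run = [60, 60, 59, 60, 42, 60, 60, 31, 60, 60]
mean, stdev, worst, pct = summarize_fps(run)
print(f"avg {mean:.1f}, stdev {stdev:.1f}, min {worst}, "
      f"{pct:.0f}% of time below 60")
# -> avg 55.2, stdev 9.7, min 31, 30% of time below 60
```

Two runs with the same average can still feel completely different to play; the standard deviation and the below-60 percentage are what expose the valleys Robert is talking about.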

Thinking about breasts is a whole lot simpler.


----------



## cadaveca (Sep 9, 2010)

I think the issue at hand is that performance can change so much every month with each driver release that it's really hard to keep it up to date, and it would really require many people working all month long, IMHO.

It would be nice if they could develop a nice little extension of STEAM, say, that actually kept track of exactly what you are looking for...and I think Valve themselves actually do this with Source-based games...I know they collect quite a large amount of data from players actually playing the game, down to things like where people died, ammo used, etc...and all the other things that they use for load-balancing.


----------



## Mussels (Sep 9, 2010)

It's got little to do with drivers, video cards, or CPUs... modern games just often have shitty coding, and slow down at various points of the game no matter your hardware.


----------



## TheMailMan78 (Sep 9, 2010)

Mussels said:


> It's got little to do with drivers, video cards, or CPUs... modern games just often have shitty coding, and slow down at various points of the game no matter your hardware.



That pretty much summed it up, Robert. I hate to say Mussels is right, but he is. Games nowadays are coded for consoles first and PC second, and depending on the developer and who his "sponsors" are, some hardware will ALWAYS perform better than other hardware. But with a simple driver update that could change. This is the nature of PC gaming.

In the end raw power will never be enough. Crossfire rigs have ALWAYS had problems scaling. This is nothing new. It's all in the coding. With that being said, I personally run a single ATI GPU with an AMD CPU, get 60 FPS in most of my games as a *minimum*, and I have seen i7 rigs get a lot less due to user error. Every system is different. Every user is different. Just learn to use what you have the BEST you can.


----------



## LifeOnMars (Sep 9, 2010)

I'm looking to build a gaming rig in the next month or so, for a 1680x1050 res monitor. I was set on getting a nice hexacore AMD setup as I have a fairly decent budget; this thread however has me worried, as I like to run my games smooth as silk. Is it wise to stick with an Intel i7, or what is new on the horizon from both companies? Any pointers would be great.


----------



## TheMailMan78 (Sep 9, 2010)

LifeOnMars said:


> I'm looking to build a gaming rig in the next month or so, for a 1680x1050 res monitor. I was set on getting a nice hexacore AMD setup as I have a fairly decent budget; this thread however has me worried, as I like to run my games smooth as silk. Is it wise to stick with an Intel i7, or what is new on the horizon from both companies? Any pointers would be great.



Dude this thread in short is just smoke and mirrors. With that being said I personally would wait a month or two if you can. New architecture is right around the corner. I mean within months. Wait and see.


----------



## CDdude55 (Sep 9, 2010)

LifeOnMars said:


> I'm looking to build a gaming rig in the next month or so, for a 1680x1050 res monitor. I was set on getting a nice hexacore AMD setup as I have a fairly decent budget; this thread however has me worried, as I like to run my games smooth as silk.



This thread is the last thing you should be reading in determining what rig to go for.

And I agree with MailMan, we should be seeing a new architecture coming out pretty soon, so I think it would be best to wait for that.


----------



## Steevo (Sep 9, 2010)

My 4850 and quad were rocking GTA4 just fine; not at 60 all the time with the settings I had, but it was still running awesome compared to others. 

Don't worry about running an AMD system; if it were that much of a bottleneck none of us would run them, but we do. I love mine still. I have enough power to run a TV with hardware-accelerated Netflix while I game on my monitor in most games.


----------



## Dent1 (Sep 9, 2010)

LifeOnMars said:


> I'm looking to build a gaming rig in the next month or so, 1680x1050 res monitor. I was set on getting a nice hexacore AMD setup as I have a fairly decent budget, this thread however has me worried as I like to run my games smooth as silk. Is it wise to stick with an intel i7 or what is new on the horizon from both companies, any pointers would be great?



The performance issues being discussed in this thread are specific to Ghostbusters and Dragon Age Origins only; they have no bearing on any other game, and from what I've read these two isolated games are down to a bug or bad coding, which is causing poor scaling in Crossfire with the AMD X4s. TheMailMan has an X6 and a non-Crossfire setup and he is getting solid FPS, unlike the OP.

Look at this logically: search Google and you'd see plenty of forums with people complaining that their i7 or SLI setup is getting poor frame rates in a specific game, but it wouldn't stop you from buying an i7, would it? Poor performance in an isolated game has no reflection on the i7's or Phenom II's performance.

All this "wait for AMD's next architecture" stuff is nonsense; do not avoid AMD, avoid Ghostbusters and Dragon Age Origins.

Anyways, this isn't something that concerns you, as 1680x1050 is considered to be a low resolution; the OP has a 30" screen @ 2560x1600.


----------



## erocker (Sep 9, 2010)

Dent1 said:


> The performance issues being discussed in this thread are specific to Ghostbusters and Dragon Age Origins only,



I can tell you right now, with my CPU at stock, I get a constant 60fps in Ghostbusters. In Dragon Age I am also consistently over 60fps, with the exception of right before the end battle, where there are hundreds of soldiers on screen and I dip into the 50s. It's not even a part where you have to battle, as you're just walking towards a gate. I'd be interested to see an i5 750 during this scene.


----------



## Dent1 (Sep 9, 2010)

erocker said:


> *I can tell you right now, with my CPU at stock, I get a constant 60fps in Ghostbusters. In Dragon Age I am also consistently over 60fps*, with the exception of right before the end battle, where there are hundreds of soldiers on screen and I dip into the 50s



For the benefit of others, can you confirm that you run an AMD processor?



erocker said:


> Well, it's in my signature...



Ok.


----------



## erocker (Sep 9, 2010)

Well, it's in my signature...


----------



## cadaveca (Sep 9, 2010)

Dent1 said:


> For the benefit of others can you confirm that you run a AMD processor?



He has no issues because he's got an overclocked CPU. *Which WASN'T Robert's question*...he was asking about CPUs running at stock, a condition that nobody here, except me, meets.



Robert-The-Rambler said:


> *I just want to restate that what I am really looking for is a list of games where say at 1080p it is impossible for AMD processors on any platform, even the fastest current ones at stock everything*, even when your single GPU or multiple GPU setup is capable of that performance, to simply maintain solid/locked 60 frames per second and maybe even at times performance is terrible.



It's a specific question, with a specific answer.


No, AMD CPUs, at stock, are NOT good enough for 60FPS when running more than one Cypress card. Yes, they are fine if you run a single Cypress-based card, or anything that performs less than that.

The question doesn't ask HOW you get better performance, and it definitely DOESN'T include overclocked CPUs. I wish everyone would really STOP saying that they are fine when overclocked...NO SHIT. That WAS NOT Robert's question. And it definitely DOESN'T include JUST Dragon Age and Ghostbusters.

Use one video card, and yes, things are generally fine...with a stock CPU, 3.0GHz and higher. Like Mussels mentioned...if it isn't, then the issue isn't really anything other than bad programming.

But, my point in all of this, is that no matter the programming, 2 or more Cypress-based cards NEED more than a stock AMD CPU. As such, if you are using multiple Cypress-based cards, you are better off with either an overclocked AMD CPU or an overclocked i7 CPU (and the i7 has the edge). And through my testing, this is due to a driver issue, in which both AMD and Intel CPUs run into a bottleneck, and why that bottleneck occurs isn't exactly obvious. Hence me doing the testing I am right now...I am just interested to find out whether it's the NB that needs overclocking...the CPU speed...maybe RAM...I don't know.


----------



## erocker (Sep 9, 2010)

cadaveca said:


> He has no issues because he's got an overclocked CPU. *Which WASN'T Robert's question*...he was asking about CPUs running at stock, a condition that nobody here, except me, meets.
> 
> 
> 
> ...



Sucks I'm just on one card now, but I think you can leave your CPU stock and just up the CPU/NB frequency, which I think is where the real bottleneck is. When Dirt 2 came out I know positively that I was running my CPU stock with CrossFire and I would get occasional slowdowns. Adjusting the CPU/NB from 2000 to 2400 fixed it.


----------



## Dent1 (Sep 9, 2010)

cadaveca said:


> He has no issues because he's got an overclocked CPU. *Which WASN'T Robert's question*...he was asking about CPUs running at stock, a condition that nobody here, except me, meets.



Didn't TheMailMan claim solid FPS running at stock?

The premise of the post was to comfort LifeOnMars, who is looking at the AMD X6. I was relating his situation to TheMailMan, who already owns an AMD X6 and has no issues.



cadaveca said:


> And it definitely DOESN'T include JUST Dragon Age, and Ghostbusters.



Maybe not, but Dragon Age and Ghostbusters are the two games in question in this thread. To add some balance, maybe we should list all the games in which Intel CPUs perform arguably inadequately?


----------



## cadaveca (Sep 9, 2010)

erocker said:


> Sucks I'm just on one card now but, I think you can leave you CPU stock and just up the CPU/NB frequency which I think is where the real bottleneck is. When Dirt 2 came out I know positively that I was running my CPU stock with CrossFire and I would get occasional slowdowns. Adjusting the CPU/NB from 2000 to 2400 fixed it.


Yeah, and I get that. However, Dirt 2 released long before the significant driver change...before this change, Crossfire scaling didn't seem to have this bottleneck.

So, with that in mind, any review out there that kinda looks at this issue...was done prior to that change in the drivers, and as such must be verified before being accepted as fact.

The same is true every month, when a new driver comes out...old reviews may not have accurate data...but because the changes were so great...I truly think that this specific period of time, when the driver was changed, is very important.



Dent1 said:


> Didn't TheMailMan claim solid FPS running at stock?
> 
> It's a specific question, with a specific answer.




And as I said, he's running a single card, so the situation is OK. The rest is MailMan trolling, IMHO, because he didn't like my answer. He ignored that I made the stipulation that it seems once these 5-D shaders are in greater numbers than 1600, AMD CPUs have issues feeding the cards data. Robert, with his 3x4850, is @ 2400 shaders, so runs into the same situation that I describe.

And yes, as I've said before, at stock, you bet Intel runs into the same problem...for some reason not as quickly. And because of this, I largely blame the driver alone...because, as erocker suggests here...things weren't so bad before....

I'm NOT singling out AMD CPUs here...which is where you seem headed...I really think it has nothing to do with anything other than the VGA driver. But again, I am NOT 100% sure on this.


Robert is looking for a list of these games that have these problems...WITH STOCK CPUS. My testing, so far, says ALL APPS have issues with Crossfire scaling...but not all are affected so much that they drop below 60FPS...hence me telling him that it's going to take some time before I can identify each one.

Everyone brings up good points that can help overcome this problem, but none of them are actually answering Robert's question. The response "Overclock, and it goes away" isn't a proper answer. The response "I have a single card, and no issues" doesn't apply.


And Robert, please feel free to correct me here if I am wrong in any of this.


----------



## CDdude55 (Sep 9, 2010)

But that's all stuff we already knew.

What exactly is the question that needs to be answered?


----------



## cadaveca (Sep 9, 2010)

CDdude55 said:


> What exactly is the question that needs to be answered?




Which apps, with stock AMD CPUs, cannot hold 60 FPS consistently (i.e., a 60 FPS minimum), both with single cards and with dual cards. If there are momentary dips in one small part of the game, I don't think he's too concerned with that. He mentioned Ghostbusters, and a specific level, that becomes unplayable. He's looking for the apps that become unplayable, and to me, specifically, the ones that are unplayable due to the CPU at stock alone.
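Since Robert is logging with FRAPS anyway, that criterion can be checked mechanically from a FRAPS-style frametimes CSV (rows of frame number, cumulative time in ms) instead of eyeballing the overlay. A rough sketch, just an illustration (the filename is a placeholder):

```python
import csv

def worst_second_fps(path):
    """Minimum frames-per-second over any full second of a FRAPS-style
    frametimes log (CSV rows of frame number, cumulative time in ms)."""
    stamps = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            try:
                stamps.append(float(row[1]))  # cumulative milliseconds
            except (IndexError, ValueError):
                continue  # skip the header or malformed rows
    if not stamps:
        return None
    counts = {}
    for ms in stamps:
        sec = int(ms // 1000)
        counts[sec] = counts.get(sec, 0) + 1
    # Drop the last (usually partial) second of the capture.
    full_secs = range(min(counts), max(counts))
    fps = [counts.get(s, 0) for s in full_secs]
    return min(fps) if fps else None

# A run that never dips below 60 FPS should satisfy:
#   worst_second_fps("frametimes.csv") >= 60
```

A game fails the "solid 60 FPS at stock" test whenever that minimum drops under 60 for the run.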


----------



## CDdude55 (Sep 9, 2010)

cadaveca said:


> Which apps, with stock AMD CPUs, cannot hold 60 FPS consistently (i.e., a 60 FPS minimum), both with single cards and with dual cards. If there are momentary dips in one small part of the game, I don't think he's too concerned with that.



But I don't understand. Like I said before, there are a ton of factors that can contribute to not breaking the 60-frame barrier; which apps and which AMD CPUs are affected depends on design, coding, drivers, and a TON of other factors. So I guess I'm not understanding where the testing comes in. I mean, of course it's possible to obtain 60 frames with an AMD CPU, but again, even then it comes down to more factors. What exactly are you testing to find out?


----------



## erocker (Sep 9, 2010)

I know for a fact that using the 10.4s with CrossFire 5850s, CPU at stock and a 2400 MHz CPU/NB, I would get over 60 FPS in Dirt 2. No question in my mind.


----------



## crazyeyesreaper (Sep 9, 2010)

It's the fact, CDdude, that say 10 months ago a 940BE + two 5850s might get 90 FPS, but today on a newer driver the same setup might only get 75 FPS average due to the changes AMD made in the driver. The point here is to pinpoint WHAT apps are most affected, what fixes it, etc. For example, a 3 GHz / 1800 MHz NB 940BE was doing fine, but now that CPU must be at 3400 MHz with a 2400 NB to get the same performance it used to have. Do you follow now, CDdude?

We're talking about games that used to run fine all of a sudden being more demanding on the system for no reason, causing performance loss that was never present until the driver changed back in, I think it was April or March, aka when CrossFire profiles started being released on their own... can't remember for sure.

I can honestly say my CrossFire performance in my old 5850 review, on a slower CPU and slower RAM, was better than it is today in the same games.


----------



## cadaveca (Sep 9, 2010)

CDdude55 said:


> But I don't understand. Like I said before, there are a ton of factors that can contribute to not breaking the 60-frame barrier; which apps and which AMD CPUs are affected depends on design, coding, drivers, and a TON of other factors. So I guess I'm not understanding where the testing comes in. I mean, of course it's possible to obtain 60 frames with an AMD CPU, but again, even then it comes down to more factors. What exactly are you testing to find out?



See Crazyeye's post...

Also, I don't think Robert cares WHY they don't perform...maybe he just wants to know so he can avoid them (no point in spending money on a game that you're not going to enjoy because it plays poorly)...maybe the actual cause doesn't matter to him.

You bring up a good point, though...maybe lowering an in-game setting can fix the problem...etc.



erocker said:


> I know for a fact that using the 10.4s with CrossFire 5850s, CPU at stock and a 2400 MHz CPU/NB, I would get over 60 FPS in Dirt 2. No question in my mind.



I will take a look without the NB OC...but basically, we could add that app to the list, as he's asking about stock CPU speeds.

It's funny, too, because I remember mention a while ago that AMD might increase NB/HTT speed to 2400 MHz for Thuban...but that proved not to be the case. There are some Opterons out now, though, with a 2200 MHz stock HTT. Maybe AMD is very aware of the "problem"...if you want to call it that. I said before this wasn't all doom and gloom...but it's an issue that means I personally need to spend more $$ on my PC.


----------



## pjladyfox (Sep 9, 2010)

I'm still going through this thread, so if this has been mentioned earlier I'll try to address it in an update once I'm done.

It's interesting that this came up, since I just recently finished researching something similar for work. Now, these thoughts are all my own, based solely upon the software and tools I was using, and should not be considered definitive by any means. Each piece of software is different, due either to design or to other factors, and because of this an absolute set of results is not really possible. So I share this with the hope that it will generate more discussion and maybe help others who enjoy benchmarking hardware.

With that out of the way: basically, I was trying to identify which factors were affecting some performance issues with a software title I was working on, so that I could configure a direct performance comparison between an Intel and an AMD CPU. After playing around with things, I noticed that the following factors had some influence on performance:

a. CPU speed - This one typically comes into play when the software in question is in a situation where there is a performance mismatch between the CPU and the video card, preventing both from working optimally. Now, this is a difficult thing to find, since each piece of software tends to react differently; compare, say, the Unreal and Source engines.

Once you have found performance parity, or basically a situation where any improvement is not noticeable without other tools, you move on to the next factor.

b. L2 cache size - This one, which you would think would be a no-brainer for many, caught me by surprise. Like a lot of people, I figured that if you had enough speed this would not matter, but that turns out not to be the case.

For example, I compared two different Athlon X2 4400+ processors against a stock Core2 Duo E4300. On both systems I used the same RAM and the same video card, but I was seeing a variance of about 3 to 5 percent. After talking with some people at work and at AMD, it turned out that the amount of L2 was causing the issue, with the older 4400+ having 2MB versus the newer 4400+'s 1MB, against the E4300's 2MB.

I then ran into the next issue...

c. L2 cache latency - This one was harder to detect in my case, since the variance was only about 1 to 2 percent, but I did eventually peg it as the only logical explanation for what I was seeing. Realistically, though, the earlier factors played a larger part than this did; it only really seemed to come into play when the software was more RAM-intensive than normal, such as Photoshop or 3D Studio Max, which use a LOT of RAM depending upon the operation.

d. L3 cache - This one, like the L2 cache, only really came into play when one of the CPUs being compared lacked it while the other had it. I did not get to play with this one much, since I did not have much hardware that compared directly well to begin with, and I ran out of time. But if you're ever picking a CPU, this can and should be a factor to consider.
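The cache effects in (b) and (c) can be illustrated with a quick pointer-chasing micro-benchmark: dependent reads through a randomly ordered chain defeat the prefetcher, so the average access time jumps each time the working set outgrows a cache level. A rough sketch in Python (the sizes and iteration counts are arbitrary, and interpreter overhead blunts the effect compared to native code):

```python
import random
import time

def ns_per_access(n_elems, iters=200_000):
    """Average nanoseconds per dependent read over a working set of
    n_elems list slots, following a single random cycle so each read
    depends on the previous one (no prefetch-friendly pattern)."""
    order = list(range(n_elems))
    random.shuffle(order)
    nxt = [0] * n_elems
    # Link the shuffled elements into one closed cycle.
    for a, b in zip(order, order[1:] + order[:1]):
        nxt[a] = b
    i = 0
    start = time.perf_counter()
    for _ in range(iters):
        i = nxt[i]
    return (time.perf_counter() - start) / iters * 1e9

# Small sets fit in L1/L2; large ones spill into L3 and then RAM,
# and the per-access time climbs accordingly.
for n in (1 << 12, 1 << 18, 1 << 22):
    print(f"{n:>9} slots: ~{ns_per_access(n):.0f} ns per access")
```

On a CPU without L3, the jump between the middle and large working sets tends to be steeper, which is consistent with what I was seeing above.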


----------



## cadaveca (Sep 9, 2010)

Thanks for "unvilifying" my posts in the past about CPU cache being important... I had many people (funny, some of the same people in this thread) say that cache didn't matter.

I also mentioned that L3 cache may be part of the issue here, as increasing NB speed in AMD chips also increases L3 speed too...


----------



## pjladyfox (Sep 9, 2010)

Mussels said:


> the problem here *as stated in the other thread* is that you're running crossfire. that adds more CPU load into the mix.
> 
> If you had one GPU your performance would end up being better, due to having less CPU power needs on your older/weaker CPU. (remember, me and mailman have 2 more cores than you, and a newer CPU design)
> 
> ...



I'm still poring through this myself, and the more I read, the more I'm agreeing with this, so I'm glad someone said it already. 

Both CrossFire and SLI, at least in my experience, are a LOT more CPU-intensive at all levels than many realize. While I give NVIDIA credit for their ability to scale evenly, I think they really did everyone a disservice by doing so, since many expect that scaling to hold true regardless of the CPU being used. I've lost count of how many times I've tried to explain this to someone on a tirade about why their rig is not performing like they think it should with a newer piece of software.

Don't get me wrong here, I'm not bashing any particular company, but I do find it funny that when NVIDIA does comparisons to CrossFire, in most cases I've seen it's never on the same platform as what AMD uses. Of course, the reason for this is so that they can quietly avoid the CPU bandwidth issue and continue to sell their bridge chip to Intel, which of course does not have the problem, seeing how the platform used does not have performance parity with the AMD system. But I'm getting off on another tangent that really is not pertinent to the discussion at hand. 



cadaveca said:


> Thanks for "unvilifying" my posts in the past about CPU cache being important... I had many people (funny, some of the same people in this thread) say that cache didn't matter.
> 
> I also mentioned that L3 cache may be part of the issue here, as increasing NB speed in AMD chips also increases L3 speed too...



I'm still poring through this thread, but I must admit I'm surprised this did not really get focused on. Still, a pretty interesting thread so far, so I'm glad it did not get locked early on. 



cadaveca said:


> Robert (the OP) highlighted this too, in Ghostbusters... his VGAs have enough grunt to push that game maxed out, but certain scenarios lead to really poor performance, simply due to the number of VGAs he has (of course, the investigation revolves around finding out exactly when these limitations are exposed, and it may just turn out to be bad game programming... and not the number of VGAs... that's the question that needs to be answered).



I must have glossed over this or missed it, but one thing I think needs to be considered here is that the base game itself really was not designed to scale beyond a single video card and a dual-core CPU. With that in mind, any artificial (read: SLI or CrossFire profile) performance improvement with this title will load the CPU more, and be affected more by L2 and L3 cache size than by CPU speed. To rule this out, a good comparison would be the following:

Intel Core i5 750: 2.67 GHz, L2 4 x 256 KB, L3 8 MB, 95 W (or a Core2 Quad Q9550, which would be a better match, I think)

vs.

AMD Athlon II X4 640: 3.0 GHz, L2 4 x 512 KB, no L3, 95 W

vs.

AMD Phenom II 945: 3.0 GHz, L2 4 x 512 KB, L3 6 MB, 95 W

I would be willing to bet that you would see performance issues on the Athlon II that neither the Phenom II nor the Core i5 would suffer from, mainly due to the addition of L3 cache, which would help any multi-GPU configuration and hinder those without it. I know the Core i5 in there will most likely have better performance than either of the AMD chips, but I'm not really sure, to be honest.


----------



## dir_d (Sep 9, 2010)

pjladyfox said:


> I'm still pouring thru this thread but I must admit I'm surprised this did not really get focused on. Still, pretty interesting thread so far so I'm glad it did not get locked early on.



I tried to focus on it, but they went another way with the discussion. I strongly feel that the CPU-NB is the key, especially since there is now conclusive data from Anandtech.


----------



## cadaveca (Sep 9, 2010)

dir_d said:


> I tried to focus on it, but they went another way with the discussion. I strongly feel that the CPU-NB is the key, especially since there is now conclusive data from Anandtech.



Oh, I agree 100% (as does erocker, clearly). The only point I was trying to make when you mentioned that previously was that the situation may not be exactly as Anandtech reports... because that review is so old, it might actually be worse!


----------



## crazyeyesreaper (Sep 9, 2010)

Indeed ^ it IS worse, cadaveca. I went back through my old 5850 CrossFire review, from back when I was using the RELEASE DRIVERS, and performance today is WORSE than it was back then...


A 940BE at 3.4 GHz, 2000 NB, 4 GB DDR2-800 at 5-5-5-15 2T back then gets better performance than a
965BE at 3.6 GHz, 2400 NB, 4 GB DDR3-1333 at 7-7-7-20 1T.
Performance is about 7-9% lower than it used to be for me. Granted, I WON'T notice it, as I use Vsync at all times, but when I did the review I ran with it off, and if I run those same games now, yeah... the difference is there and it's extremely noticeable.

Phenom II 940 + 5850 CrossFire mini review
For example, at 1680x1050, VERY HIGH in Crysis, DX10 mode, 4xAA, I got a 37 FPS average on the 940BE + DDR2; now, at the same settings, I'm only around 33 FPS with a 965BE + DDR3.


----------



## TheMailMan78 (Sep 9, 2010)

erocker said:


> I know for a fact that using the 10.4s with CrossFire 5850s, CPU at stock and a 2400 MHz CPU/NB, I would get over 60 FPS in Dirt 2. No question in my mind.



With that being said, I guess it's safe to say it's basically a driver problem and nothing more? I mean, look at it this way.
1. CPU is the same. Even faster if you OC.
2. GPU is the same.
3. RAM is the same.
4. Driver is different.

Hmmmm I wonder what the problem is. 
This isn't brain surgery.


----------



## crazyeyesreaper (Sep 9, 2010)

It's the driver, true, Mailman, but 10.3, 10.4, 10.5, 10.6, 10.7, and 10.8 all have the SAME issue. So that's six months of the same PERFORMANCE-breaking issue, and it's kind of a "shhhhh, don't say anything or AMD will freak" sort of thing.


----------



## TheMailMan78 (Sep 9, 2010)

crazyeyesreaper said:


> It's the driver, true, Mailman, but 10.3, 10.4, 10.5, 10.6, 10.7, and 10.8 all have the SAME issue



Not true. 10.4 did not have the issue. You want 60 FPS with CrossFire? Use those. Problem solved.


----------



## cadaveca (Sep 9, 2010)

TheMailMan78 said:


> Not true. 10.4 did not have the issue. You want 60 FPS with CrossFire? Use those. Problem solved.



Not true. The second card is useless, even with 10.4. But single-card performance is very high with 10.4 in comparison to other drivers, so it's not as evident.

10.4 is probably the best driver for this issue, but 10.8 is good too...single-card performance is lower, but CrossFire performance is a fair bit better than 10.4's.

For example, with 10.4 I can run BFBC2 with one card, no problem. Add a second card, enable 4xAA...performance TANKS. With 10.8, 4xAA works now...but FPS is still the same as a single card.

It's really weird, TBH.


----------



## crazyeyesreaper (Sep 9, 2010)

Uh, wrong, Mailman. I'm still using 10.4 and performance here is worse than with the 9.11 hotfix drivers that fixed CrossFire long ago. Problem is, 9.11 predates the updatable CrossFire profiles, so I can't roll back either, otherwise BC2 runs like shit. So it's pretty much a no-win situation right now.


----------



## TheMailMan78 (Sep 9, 2010)

Well, according to erocker, his was fine.


----------



## crazyeyesreaper (Sep 9, 2010)

And erocker back then was at 4 GHz+ with a 2600 or so NB. That's a far cry from STOCK.

I'm seeing WORSE performance today from a 965BE with 1333 MHz 7-7-7-20 1T RAM on 10.4, CPU at 3.6, NB at 2.4,


than I did 10 months ago on a 940BE at 3.4 GHz with a 2200 NB. And in the review I posted, stock and OC speeds made ZERO impact on CrossFire performance. So in 10 months it took a 600 MHz CPU clock speed increase and a 600 MHz NB speed bump to get the same performance I had back then in the same games.


----------



## TheMailMan78 (Sep 9, 2010)

crazyeyesreaper said:


> And erocker back then was at 4 GHz+ with a 2600 or so NB. That's a far cry from STOCK.
> 
> I'm seeing WORSE performance today from a 965BE with 1333 MHz 7-7-7-20 1T RAM on 10.4, CPU at 3.6, NB at 2.4,
> 
> ...



Nope.....



erocker said:


> I know for a fact that using 10.4's with CrossFire 5850's CPU at stock 2400mhz CPU/NB I would get over 60fps in Dirt 2. No question in my mind.


----------



## cadaveca (Sep 9, 2010)

TheMailMan78 said:


> Well according to erocker his was fine.



Maybe I notice it and he doesn't because I use 5870s rather than his 5850s. I don't know. But given the testing so far, that would make sense: more shaders = bigger bottleneck. I have 10% more shaders...might make for 10% more bottleneck...I don't know yet.


Like, I mean, really...adding a second card should enable 4xAA...fortunately, with 10.8 it does...but all previous drivers failed.

And still, with a stock CPU, BFBC2 is more often than not 45-60 FPS at 2560x1600. But it's one of those apps that plays fine @ 45 FPS...


----------



## TheMailMan78 (Sep 9, 2010)

Maybe something was not set up correctly then, and now bum drivers have added to the problem?


----------



## Nick89 (Sep 9, 2010)

I call BS. My 940BE OC'd to 3.4 with my 480 plays all my games EXCEPTIONALLY well.


----------



## cadaveca (Sep 9, 2010)

TheMailMan78 said:


> Maybe something was not set up correctly then, and now bum drivers have added to the problem?



You could be right. And that's a big reason why I haven't really mentioned this before...I try to complain only about problems that can be replicated, and not just by the two systems I have here...

I don't know what the hell is going on, and I've said this from the start. _It seems_ to be the driver, and _seems_ related to the number of shaders...but only _seems_...

For all I know, it could be because the "master" card in my system is in the bottom slot...could be some weird BIOS problem...I have two rigs that are exactly the same, save the PSU, so any of the components could be the cause, I suppose (except the PSU).

I think I have eliminated OS issues, driver install problems, and other such oddities...and other people who have completely different configs (Intel vs. my AMD), but use the same drivers, have the same problem...but it's very possible I overlooked something...

That's why I'm doing the testing: to try to isolate the problem. But even that seems wasteful, given that much older drivers, from last year, seem not to have this issue...so it's so very easy to just say "It's the driver"...



Nick89 said:


> I call BS. My 940BE OC'd to 3.4 with my 480 plays all my games EXCEPTIONALLY well.




You're overclocked. And you're using a single card.  :shadedshu So of course they play well. Multiple cards and stock CPUs, please.


----------



## Dent1 (Sep 10, 2010)

cadaveca said:


> And still, with a stock CPU, BFBC2 is more often than not 45-60 FPS at 2560x1600. But it's one of those apps that plays fine @ 45 FPS...



I play BFBC2 @ 1440x900 on a 19" LCD, and I average 60+ FPS, with maximums well over 100. 45 FPS is about the minimum I ever see, but the average is much higher.

This is both OC'd and stock, high detail, with my CF 4850s.

Compared to running a single 4850, I noticed that my maximum FPS increased the most; I wasn't seeing 100+ FPS maximums with a single card. With a single card my minimum and average were about the same, around 45 FPS. CF added about 10-15 FPS on average.




cadaveca said:


> You know, that's great, but Robert asked for results @ 1920x1080. I'm trying really hard not to sound like a jerk, but your results don't help.
> 
> Games @ 1080p, stock CPU. If it plays well, 60 FPS minimum, that's what he's looking for.



I know that I do not fulfil Robert's requirements, but this is still useful information for people who are sceptical about CF on more modest hardware.

One thing I have noticed: running a single 4850 whilst playing GTA IV gave me smooth gameplay at high settings, but since adding a second 4850 for CF, GTA IV's performance has dropped drastically, to the point where the game is almost unplayable and jerky even at low settings.


----------



## cadaveca (Sep 10, 2010)

Dent1 said:


> I play @ 1440x900 on a 19" LCD, and I average 60+ FPS, with maximums well over 100. 45 FPS is about the minimum I ever see, but the average is much higher.
> 
> This is both OC'd and stock, high detail, with my CF 4850s.



You know, that's great, but Robert asked for results @ 1920x1080. I'm trying really hard not to sound like a jerk, but your results don't help at all.

Games @ 1080p, stock CPU. If it plays well, 60 FPS minimum, that's what he's looking for (or rather, he asked about the games that DON'T make the cut).


Maybe Robert here works for AMD, and once he has this list, he will look at addressing the issue...

Maybe he just wants to know games to avoid buying...

I can't answer those two questions...



Dent1 said:


> I no that I do not fulfil Roberts requirements, but this is still useful information for people that are sceptical about CF on more modest hardware.





Also, you are under 1600 shaders. He's got a third card, and if I'm right about it being the driver and shader load, then your results don't directly relate to his...it might give him the idea to drop one card, though...I've said many times, it seems that somewhere over 1600 shaders this problem comes out.

So, because you aren't answering his question, but just posting for your own reasons, you are off-topic...


----------



## Nick89 (Sep 10, 2010)

I play BFBC2 at 1920x1200, everything maxed, with great FPS.

This thread is now about stock CPUs and CrossFire? I thought it was about how much the PIIs suck. >_>


----------



## cadaveca (Sep 10, 2010)

Well, it turns out that that is kind of the one situation where AMD CPUs AREN'T enough. In this config, it seems that yes, AMD CPUs suck. Unfortunately, that's exactly Robert's question...


But like everyone says, a simple NB overclock fixes it...you don't even need to increase CPU speed. Like Mailman says, a lot in this thread is smoke and mirrors...avoid the specific config, and it's not an issue.

For me, I'm stuck with that config, so I've been investigating this exact thing a LOT (hence the number of my own posts in this thread).


----------



## Nick89 (Sep 10, 2010)

Makes sense.


----------



## claylomax (Sep 10, 2010)

They suck with CrossFire configurations, not with an ATI single card, an NVIDIA single card, or SLI.


----------



## Dent1 (Sep 10, 2010)

cadaveca,

It seems that you are almost moulding this thread into whatever you want in order to discredit AMD.


Below is an extract from Robert's very first post in this thread:



Robert-The-Rambler said:


> I don't want this thread to be an AMD vs Intel Thread. I want this discussion to be a serious investigation of _real world gaming experiences _with the latest AMD processors or at_ least modern AMD processors_ where we figure out which _games seem to just hate your AMD processor_.



Robert wants to know "which games seem to just hate your AMD processor". I've got a "modern AMD processor", the Athlon II X4, and I can provide "real world gaming experience", yet I find that you keep dismissing my results. Robert never said to participate only if you own triple cards with 1600+ shaders. To be objective, a particular game can hate a CPU in a single-card environment too, e.g. Crysis.


----------



## cadaveca (Sep 10, 2010)

Uh...


Robert-The-Rambler said:


> *I just want to restate that what I am really looking for is a list of games where say at 1080p it is impossible for AMD processors on any platform, even the fastest current ones at stock everything*, even when your single GPU or multiple GPU setup is capable of that performance, to simply maintain solid/locked 60 frames per second and maybe even at times performance is terrible.



So, yes, I have specifically tried to push it in the direction you say...as that's the only time it is an issue.

And, as I said before, Intel chips run into the same problem...it really seems to be the driver, not necessarily the CPU. But because AMD's CPUs perform a bit worse than Intel's, and Robert asked about AMD chips, yes, you bet that is where my focus is. I am directly answering his question.

It's not about discrediting AMD...I feel it's bad that you need to overclock, because it voids the warranty, but overclocking, for most chips, is not an issue. That doesn't discredit AMD...I'm sorry if you feel that's the case, but that isn't my intention.


----------



## DRDNA (Sep 10, 2010)

cadaveca,
I find your insight refreshing, and I am glad you are not discouraged by the feedback in opposition to your findings.
I have enjoyed almost all the statements in this thread and will continue to monitor the ping-pong verbiage and insightful findings! If there are any tests you would like me to help with, I will, but the only rig I have with an AMD is an old-school one. I believe it's a 3400+ AMD CPU, and I don't have CrossFire in that rig, only a single 3450 AGP, if I remember correctly. Now, having said that, I also have the rig in my sig and can test on that as well if you have a need.


----------



## TheMailMan78 (Sep 10, 2010)

cadaveca said:


> Uh...
> 
> 
> So, yes, I have specifically tried to push it in the direction you say...as that's the only time that it is an issue.
> ...



Again there is no need to OC. Just roll back your driver.


----------



## Dent1 (Sep 10, 2010)

Cadaveca, regardless of what you have said or Robert may have said,

it doesn't take away from the fact that a specific game can hate a CPU in a single-card environment too, as seen in Crysis, GTA IV, Metro 2033, etc., regardless of whether it's @ 1080p or at a more modest resolution.


----------



## cadaveca (Sep 10, 2010)

TheMailMan78 said:


> Again there is no need to OC. Just roll back your driver.





Sure...but all the way back to 9.12 or so. 

Unfortunately, this doesn't work for me...no driver since the 5-series released fixes the multi-monitor 2D/3D cursor corruption. First the big cursor was the problem; now it's corruption. Even 10.8, which was supposed to fix the issue...

So going back only creates further problems...so many issues have been addressed in the past 9 months...

Again, Robert isn't asking for fixes; you are off-topic.



Dent1 said:


> Cadaveca, regardless of what you have said or Robert may have said,
> 
> it doesn't take away from the fact that a specific game can hate a CPU in a single-card environment too, as seen in Crysis, GTA IV, Metro 2033, etc., regardless of whether it's @ 1080p or at a more modest resolution.



Yes, and I have admitted as much. And if you have an example of this, that's what Robert wants to know.


----------



## erocker (Sep 10, 2010)

I'm thinking this thread is reaching its conclusion. For the past few pages it's been the same thing over and over again. If someone wants to bring some facts or results to the table here, that's fine, but making accusations or just saying something for the sake of saying it is getting worn out. Am I right in thinking this?


----------



## cadaveca (Sep 10, 2010)

Yes, please. I'm a little tired of repeating myself.


Seems that's what everyone wants anyway...well, maybe just a few people...but whatever...lock away, and I will PM my results directly to Robert. Nobody else seems interested in directly answering his question.


----------



## TheMailMan78 (Sep 10, 2010)

cadaveca said:


> Sure...but all the way back to 9.12 or so.
> 
> Unfortunately, this doesn't work for me... no driver since the 5-series released fixes the multi-monitor 2D/3D cursor corruption. First the big cursor was the problem; now it's corruption. Even 10.8, which was supposed to fix the issue...
> 
> ...


No, just to 10.4a.


----------



## CDdude55 (Sep 10, 2010)

This thread is definitely getting old and redundant.


----------



## Dent1 (Sep 10, 2010)

Unfortunately, I have only 2 of the 3 requirements to participate, a modern CPU and a CF setup; I do not own a 1080p monitor, so I cannot post any benchmarks that will be satisfactory. I will be following this thread closely, as I'm intrigued to know the results. 

If benchmarks are genuinely going to be posted, this thread should stay open.


----------



## DRDNA (Sep 10, 2010)

cadaveca said:


> I will Pm my results directly to Robert. Nobody else seems interested in directly answering his question.



Please PM them to me as well, if you would. I am still interested in your end results if this indeed gets locked. Thanks!


----------



## erocker (Sep 10, 2010)

Ok, then I guess we should all just find other threads and subjects to post in and about until that time comes. Why bother repeating ourselves?


Lol! 

     l  l
     vv


----------



## TheMailMan78 (Sep 10, 2010)

Ok, then I guess we should all just find other threads and subjects to post in and about until that time comes. Why bother repeating ourselves?


----------



## cadaveca (Sep 10, 2010)

erocker said:


> Ok, then I guess we should all just find other threads and subjects to post in and about until that time comes. Why bother repeating ourselves?
> 
> 
> Lol!
> ...



It only ever takes one bad apple to ruin the basket. I'm sorry that I believe in being honest and truthful and feel a need to help those with honest questions, and I'm sorry others don't want me to do so.

Really (and clearly, to me), I think it's time you hand out some infractions, and if I am one of those on the receiving end, I will wear my infraction with pride...


Until then, I'll let people spread FUD, if that's what you'd like, and will do as I said before...I'll talk to the OP directly. Oh, and I guess DRDNA as well. Why would I want to stop the spread of FUD...clearly I am wrong in wanting to do so.


----------



## erocker (Sep 10, 2010)

cadaveca said:


> It only ever takes one bad apple to ruin the basket. I'm sorry that I believe in being honest and truthful and feel a need to help those with honest questions, and I'm sorry others don't want me to do so.
> 
> Really (and clearly, to me), I think it's time you hand out some infractions, and if I am one of those on the receiving end, I will wear my infraction with pride...
> 
> ...



Actually, I think most of us would appreciate it if you made a thread with all of your results. That said, this thread is just getting to a point where it has lost its value, and people are just being light-hearted about it now, which is much better than things turning into the opposite. No need to take offense or be over-serious about it. I don't have the ability to close this thread, but it's been flagged to those who can.


----------



## pjladyfox (Sep 10, 2010)

cadaveca said:


> That's why I'm doing the testing, to try to isolate the problem. But even that seems wasteful, given that much older drivers, from last year, seem to not have this issue...so it's so very easy to just say "It's the driver"...



This actually brings up a pretty interesting situation I've encountered recently with Mechwarrior 3 on modern ATI and NVIDIA video cards, which in both cases could be traced back to the drivers. The frustrating part was that all attempts to engage either company about what is causing the issue, and to get it addressed, have fallen on deaf ears. The only options any Mechwarrior 3 player has are to stick to XP or Vista on hardware supported by the Catalyst 7.11 drivers, or to not use any driver past 186.18 with supported NVIDIA video cards. 

In any event, this entire thread just reinforces my decision, back when the HD 3870 was new, that running two or more video cards is a whole bunch of headache for very little reward. 

No offense to those of you who run such setups of course.


----------

