
Sapphire HD 4870 X2 2048 MB

Blegh -- some people. So your GTX 280 is better because it still plays at "playable levels" even though the 4870 X2 is faster? Funny how that didn't apply to ATI cards when Nvidia was in the lead. :p

I'm saying it is a better card for me because it runs every game I have on high settings, and I'm thinking of buying a second one. There is no way this card can come close to 2x 280; look at the scores of the ATI card, it's lacking some good drivers at the moment.
 
I was just wondering how you could mess up the COD4 benchmark so horribly, and not even comment on it.

You are actually not using both cores at all; the difference between the HD4870 and the HD4870x2 is under 6 fps, and that's just downright silly. Anyone with any experience at all with these HD4xxx cards would know you completely made a mess of this game bench, since COD4 has scarily good scaling in Crossfire. You should have seen about 130-140+ fps in 1920x1200, not 76 fps. When every review site on mother earth gets double the FPS compared to the GTX280/HD4870, and any owner of 2xHD4870 gets Crossfire to work and scale at about 2x, some alarm bells should start ringing, mate. I must say the work you did on COD4 with this HD4870x2 was really sloppy...

If you tested the HD4870x2 in single mode first, then turned on CFX and ran the bench again, no wonder you got results like these... Do COD4 again please, and do it right this time: that means Crossfire "ON", and do the mandatory reboot after you enable Crossfire... Getting Crossfire to work in this game is a walk in the park. I think you can do it, like everybody else did...
 
I was just wondering how you could mess up the COD4 benchmark so horribly, and not even comment on it.

You are actually not using both cores at all; the difference between the HD4870 and the HD4870x2 is under 6 fps, and that's just downright silly. Anyone with any experience at all with these HD4xxx cards would know you completely made a mess of this game bench, since COD4 has scarily good scaling in Crossfire. You should have seen about 130-140+ fps in 1920x1200, not 76 fps. When every review site on mother earth gets double the FPS compared to the GTX280/HD4870, and any owner of 2xHD4870 gets Crossfire to work and scale at about 2x, some alarm bells should start ringing, mate. I must say the work you did on COD4 with this HD4870x2 was really sloppy...

If you tested the HD4870x2 in single mode first, then turned on CFX and ran the bench again, no wonder you got results like these... Do COD4 again please, and do it right this time: that means Crossfire "ON", and do the mandatory reboot after you enable Crossfire... Getting Crossfire to work in this game is a walk in the park. I think you can do it, like everybody else did...

let me look into this

edit: seems the multi-gpu configuration setting in cod 4 was not set to enabled. if i didn't notice it, take a guess at how many end-users will forget about it and see their card not performing at its best

thanks for bringing this up, i will have revised cod 4 scores later today
 
HIS 4870 X2 arrives today.
 
let me look into this

edit: seems the multi-gpu configuration setting in cod 4 was not set to enabled. if i didn't notice it, take a guess at how many end-users will forget about it and see their card not performing at its best

thanks for bringing this up, i will have revised cod 4 scores later today

The other thing I have thought of, W1zz: did you enable Crossfire mode for Crysis? It has to be done in an autoexec; by default it will only run SLI automatically.

Not saying you haven't; it might just be really bad scaling in Crysis. I know it favours Nvidia over ATI in terms of performance anyway.

;)
 
re. crysis, yep that's possible. let me look into that as well
 
Taken from tweak guides:

r_MultiGPU [0,1,2] - This option controls whether Crysis enables additional overhead for rendering on multi-GPU systems (i.e. SLI and CrossFire setups). If set to 0 - which is the optimal setting for single-GPU systems - it disables multi-GPU support; if set to 1 it enables multi-GPU support for both SLI and CrossFire; if set to 2 it attempts to auto-detect whether a multi-GPU setup is present.

From experience, it defaults to 2.

You have to put the command:

r_MultiGPU=1

into an autoexec.cfg file (make one) in the main Crysis directory to enable Crossfire.
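
For reference, a minimal sketch of the finished file, assuming the "--" comment style Crysis's own system.cfg uses (the comment line is optional, the cvar line is all that matters):

-- autoexec.cfg in the main Crysis folder: force multi-GPU rendering on
r_MultiGPU=1

To check it took effect, open the in-game console (~ key by default) and type r_MultiGPU; the console should echo the current value back as 1.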


So stupid; anyone would think they didn't want multiple ATI GPUs to benefit in Crysis, making you mess around with CVars when SLI is handled automatically. :rolleyes: Not that the Nvidia logo at startup has anything to do with it...
 
For anyone who wants it, I have made an autoexec file to enable Crossfire in Crysis (it does nothing else).

Simply extract the cfg file and put it in the main Crysis directory. E.g. for me it goes in:

C:\Program Files (x86)\Electronic Arts\Crytek\Crysis

:toast:

NB

Do not use this if you do not have Crossfire or SLI, as it adds unneeded overhead and will reduce performance.
 


let me look into this

edit: seems the multi-gpu configuration setting in cod 4 was not set to enabled. if i didn't notice it, take a guess at how many end-users will forget about it and see their card not performing at its best

thanks for bringing this up, i will have revised cod 4 scores later today

No problem, mate!

For some (a lot, actually), performance in this game will break or seal the deal on a new purchase. So getting this one right, with twice the performance of Nvidia's best, will certainly give a huge boost to the HD4870x2, since this game is so popular and good FPS and visuals are very important to its players...

This will complete an otherwise very good review, and will certainly boost the average performance lead of the HD4870x2 by quite a lot, actually...
 
If it's not enabled by default (Crysis Crossfire), why use it? The hundreds of people who buy the R700 might not be versed in "changing a resource file to make it work like it should". Besides, it has to be mentioned in the test setup part of the review.
 
If it's not enabled by default (Crysis Crossfire), why use it? The hundreds of people who buy the R700 might not be versed in "changing a resource file to make it work like it should". Besides, it has to be mentioned in the test setup part of the review.

Yes, that was my first thought: 90% of gamers/PC users are not going to change, know how to change, or even understand the need to change the config file. But then I thought, well, if only around 4% of consumers use XFire/SLI, they are likely to be enthusiasts anyway, and doing this will give a better reflection of what the X2 can achieve in the game, certainly for TPU members in any case.
 
So getting this one right, with twice the performance of Nvidia's best

i'm definitely not seeing that. maybe it's the timedemo i am using .. some people will now say nvidia paid me to record an nvidia-biased timedemo.
 
If it's not enabled by default (Crysis Crossfire), why use it? The hundreds of people who buy the R700 might not be versed in "changing a resource file to make it work like it should". Besides, it has to be mentioned in the test setup part of the review.

I know it's bad. Games like GRAW have an option to turn it on as part of the game settings; Crysis should be the same. Mind you, they added a Vsync button in one of the patches. To remain unbiased and fair, they need to do the same with a Crossfire button.
 
i'm definitely not seeing that. maybe it's the timedemo i am using .. some people will now say nvidia paid me to record an nvidia-biased timedemo.

:D
 
nvidia is not even sending samples to us by the way, so stop with those accusations.

looks like the x2 reviewer driver gives some gains in cod 4 for all rv770 cards .. the new graphs include 4870 non-x2 with this driver
 
new cod4 graphs are up .. will be a few mins for them to go through the cache
 
new cod4 graphs are up .. will be a few mins for them to go through the cache

Will you be doing more Crysis ones too? Or are you sticking to the

"I shouldn't have to change CVars in a game"

line? Kinda bad on Crysis's part; I might write to them and see if they can include it in the next patch.

A Crossfire button in an Nvidia-sponsored game! :p
 
i'm looking into crysis right now, now that i'm done with cod 4.

edit: if crysis says "MGPU" in the screen overlay does that mean multigpu is enabled?
 
nice review, and a long one. looks to be a killer card, but i won't be getting one; i don't have a use for that much power


You and 95% of the rest of consumers.


That's the situational crux of cards like the 280 and the X2. They're just not needed.

However, if the 280 isn't needed, then the X2 is doubly not needed.

At an extra hundred dollars, twice the heat, and one hundred more watts... the X2 and the phrase "price/performance" should never EVER be in the same sentence.


It's a revolving door; we're going round and round. Cards like the 280, the X2, even the 4870/260 in some situations, are far more than we need, except for one game: Crysis. We can't keep chasing this elusive goal forever. And not everyone even likes Crysis, so...

What we're left with is overkill horsepower cards that don't live up to expectations and whose prices are "questionable", etc.

The X2 doesn't "dominate" anything; it doesn't even win 100% of the time.


So, despite all the conflicting reviews, let's say the X2 obliterated Crysis.

2560 res, 16x AF, 16x AA, transparency AA, Very High, etc. 60+ fps solid!

WHO THE FUKK CARES anymore?

Are we seriously going to weigh the price, technical, architectural, and efficiency value of a card on its performance in one 3D application? An application that many have stated is "coded poorly", etc.?



All I know is, cards are getting huge, they're not changing their architecture, and they're forcing us into one of two product choices.

A) A large plethora of products spanning nearly the full performance spectrum, with enough choices to leave you confused. Yet at the end of the day, 80+% of them will run your 3D applications without a problem.

or

B) Self-proclaimed "high end" products that do take you to the next level, so to speak, but are still tied down by the basic architectural limitations of both the GPUs and other PC components. Ultimately it's like running out once a year and buying a bigger motor for your car while only making slight adjustments here and there. The main thing is, you get a bulk increase in power. Yet the silly thing is that your only purpose in doing so is to try to break some old 1/4-mile record at your local dragstrip, even though, you know, it will be years before you do, and by then you won't care anymore.


That's what GPUs in the high end feel like. They serve very little purpose for real world applications, except one.



I just can't wait until physics on GPUs becomes a full-fledged industry standard, and they can start shrinking GPUs to the point of 100% integration coughlarabcough :)



Much praise to ATI for the accomplishment of the X2, but unfortunately, like the 3870x2, it's far from impressive.


EDIT: Ignore me. I'm a silly heart... a dreamer... and I've been watching too much "Uncle Buck"!
 
if crysis says "MGPU" in the screen overlay does that mean multigpu is enabled?

Yes, look forward to seeing the new benches.

:toast:
 
Yes, look forward to seeing the new benches.

:toast:

then that's how the current numbers were obtained. i will rerun to be sure, but i don't expect any changes
 
new cod4 graphs are up .. will be a few mins for them to go through the cache

Your timedemo is actually faster than guru3d's, by 10 fps on the HD4870, and the difference between the GTX280 and HD4870 is also about the same. What you are not getting is Crossfire scaling on the same level as the other review sites: they are at ca. 1.8x, while you have 1.4x.
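
To put rough numbers on that: scaling is just X2 fps divided by single-card fps. Assuming a single HD4870 lands around 72 fps in this timedemo (the ballpark implied earlier in the thread):

1.4 x 72 fps = ~101 fps (what the review shows)
1.8 x 72 fps = ~130 fps (what the other sites see, in line with the 130-140 fps quoted earlier)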

This might be down to your motherboard, and it's most likely just a driver flaw, as it seems Crossfire needs to be written to support it, or it just doesn't scale as well on it.

I'm a GTX280 owner myself, and as a reviewer too, I have no thoughts at all about you being biased towards Nvidia, or anyone else for that matter. I'm just pointing out that you are not getting the average scaling; the reason for that, as stated above, is most likely the combination of P35/CFX/drivers. I've tested 2xHD4870 myself and had scaling of above 1.8x on an X38 motherboard, so something is not right with P35 and this HD4870x2. I've also tested the HD3870x2 on a P35, and it also lacked scaling compared to X38, but not all games scaled badly, just some, like COD4.
 
then that's how the current numbers were obtained. i will rerun to be sure, but i don't expect any changes

Really? Weird, unless ATI have worked that out and there is a fix in their drivers for it...?

In which case, if the numbers are the same, Crysis has appalling Crossfire scaling. Even though it is a "Works best on Nvidia" game, you would expect a bigger improvement than that when running both GPUs.
 
i forced mgpu to off now, let's see if there is any difference

edit: no major difference between mgpu forced on and off
 
This might be down to your motherboard, and it's most likely just a driver flaw, as it seems Crossfire needs to be written to support it, or it just doesn't scale as well on it.

then ati fails even more. they say the x2 cards are supposed to work best on any chipset. ati themselves had no complaints about the p35 chipset i'm using. also i don't see any relation between chipset and rendering performance unless we are talking hypermemory or other things that actually use the pcie bus

tech-report doesn't see 1.8x scaling either in 1920x1200
 