
Sapphire HD 4870 X2 2048 MB

i forced mgpu off now, let's see if there is any difference

edit: no major difference between mgpu forced on and off

So basically Crysis sees no real benefit from a 4870X2 over a single 4870.

That's why I ended up getting a GTX260 over a 9800GX2. Dual GPU is great when it works, but a waste of money when it doesn't. That's why I want Nvidia's next graphics card to be an even better single GPU, not two PCBs slapped together again.

Thanks for the reviews W1zz!
 
You and 95% of the rest of consumers.


That's the situational crux of cards like the 280 and the X2. They're just not needed.
-----------
What we're left with is overkill horsepower cards that don't live up to their expectations and whose prices are 'questionable'.

The X2 doesn't 'dominate' anything; it doesn't even win 100% of the time.
-----------
WHO THE FUKK CARES anymore?
-----------
All I know is, cards are getting huge, they're not changing their architecture, and they're forcing us into one of two product choices.
-----------
I just can't wait until physics on GPUs becomes a full-fledged industry standard, and they can start shrinking GPUs to the point of 100% integration *cough* Larrabee *cough* :)

Well, I guess you got your 280, and once the euphoria wore off you became jaded :)

Great, one teraflop, then 2, then 5, and so on... who really cares about that? I enjoy old games like Arcanum and Fallout 100 times more than I did playing Crysis. And those games run on an 800 MHz processor with integrated graphics, even...

Regarding your car analogy: we now have the 300 km/h cars, great, but we lack the highways to go that fast.
 
Your timedemo is actually faster than Guru3D's, by 10 fps on the HD4870, and the difference between GTX280 and HD4870 is also about the same. What you are not getting is CrossFire scaling on the same level as the other review sites: they are at ca. 1.8x, while you have 1.4x.

This might be down to your motherboard, and it's most likely just a driver flaw, as it seems CrossFire needs to be written to support it, or it simply doesn't scale as well on it.

I'm a GTX280 owner myself, and a reviewer too, and I have no thoughts at all about you being biased towards Nvidia, or anyone else for that matter. I'm just pointing out that you are not getting the average scaling, and the reason for that, as stated above, is most likely the combination of P35/CFX/drivers. I've tested 2xHD4870 myself and had scaling above 1.8x on an X38 motherboard, so something is not right with P35 and this HD4870X2. I've also tested the HD3870X2 on a P35, and it also lacked scaling compared to X38, but not all games scaled badly, just some like COD4.
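For anyone following along, the scaling factor being debated is just dual-GPU fps divided by single-GPU fps. A minimal Python sketch of that arithmetic, with illustrative fps values rather than actual review numbers:

```python
# CrossFire scaling factor: dual-GPU fps over single-GPU fps (2.0x = perfect).
def scaling(single_fps: float, dual_fps: float) -> float:
    return dual_fps / single_fps

# Illustrative values: a single HD4870 at 60 fps vs. an HD4870X2 at 84 fps
print(f"{scaling(60.0, 84.0):.2f}x")  # -> 1.40x, the figure being questioned
```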

I've looked at a few reviews and all are about the same. Look at this one, done with a Gigabyte X48-DQ6: http://www.overclockersclub.com/reviews/sapphire_4870x2/6.htm. The HD 4870 X2 was just hype.
 
As long as it remains at its current price it's a horrid price/performance ratio. Also, the 55nm GTX280 that is supposed to be clocked significantly higher will likely topple it; the GTX280 isn't too far behind the R700 in all honesty, so ATI has some gloating room for now. Anyone remember what Nvidia does?

x1800XT -> 7800GTX 512 -> x1900XTX -> 7900GTX -> x1950XTX -> 7950GX2

same here

HD3870 X2 -> 9800GX2 -> HD4870 -> GTX280 -> HD4870 x2 -> GTX280 55nm

Nvidia doesn't like to lose

I just love how your explanation of why the 4870 X2 isn't a great card is to use a card that hasn't even been made yet (GTX280 55nm). If we're going to play that game, then I'll go ahead and say the GTX280 55nm is crap because of the HD 5880 X4.

And seriously -- seeing as how the 55nm 9800GTX+ was a colossal snore, I don't see why a 55nm GTX280 would be that much different than a 65nm one. :shadedshu
 
Techpowerup 1920x1200 4xAA/16xAF (GTX280 fps - HD4870X2 fps, gain)
74.0-98.4 COD4 33%
84.5-119.6 Call of Juarez 42%
262.6-337.8 Company of Heroes 29%
37.9-34.2 Crysis -11%
104.4-137.7 Enemy Territory 32%
268.9-252.8 FarCry -6%
122.0-192.0 F.E.A.R. 58%
144.8-203.0 Prey 40%
54.1-161.6 Quake4 199%
143.3-239.5 Splinter Cell 67%
63.9-104.6 S.T.A.L.K.E.R. 64%
61.8-58.5 Supreme Commander -5%
73.1-74.1 Team Fortress 2 1%
132.6-151.9 UT3 15%
52.0-55.0 World in Conflict 6%

The HD4870X2 is on average 32% faster than the GTX280 at 1920x1200 AA/AF according to your own numbers. What was the basis for the 14% you have?

Guru3d 1920x1200 4xAA/16xAF (GTX280 fps - HD4870X2 fps, gain)
62.0-117.0 COD4 89%
50.0-54.0 Frontline: Fuel of War 8%
63.0-102.0 S.T.A.L.K.E.R. 63%
103.0-159.0 F.E.A.R. 62%
99.0-84.0 GRAW2 -18%

Guru3d rates the HD4870X2 as on average 37% faster than the GTX280.

Anandtech 1920x1200 4xAA/16xAF (GTX280 fps - HD4870X2 fps, gain)
60.2-94.2 Race Driver: GRID 56%
33.2-56.3 Age of Conan 70%
51.9-76.0 Oblivion 47%
99.0-139.8 ET: Quake Wars 41%
35.8-39.8 Crysis 11%

Anandtech rates the HD4870X2 as on average 45% faster than the GTX280 at 1920x1200 4xAA 16xAF.
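A side note on the arithmetic, since it is one reason summaries of the same data disagree: averaging the per-game percentage gains gives a different number than averaging the fps ratios. A short Python sketch using the TPU fps pairs quoted above:

```python
# Average the per-game gains two ways: arithmetic mean of the percentage
# gains vs. geometric mean of the fps ratios. Same data, different headline.
from math import prod

# (GTX280 fps, HD4870X2 fps) per game, as listed in the TPU table above
results = [(74.0, 98.4), (84.5, 119.6), (262.6, 337.8), (37.9, 34.2),
           (104.4, 137.7), (268.9, 252.8), (122.0, 192.0), (144.8, 203.0),
           (54.1, 161.6), (143.3, 239.5), (63.9, 104.6), (61.8, 58.5),
           (73.1, 74.1), (132.6, 151.9), (52.0, 55.0)]

ratios = [x2 / gtx for gtx, x2 in results]
arith = sum(r - 1 for r in ratios) / len(ratios)   # mean of % gains
geo = prod(ratios) ** (1 / len(ratios)) - 1        # mean of fps ratios
print(f"arithmetic mean: +{arith:.0%}, geometric mean: +{geo:.0%}")
# -> roughly +38% arithmetic, +31% geometric with these numbers
```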
 
that's the average over all benchmarks at all resolutions

Wouldn't it be better to have a % number per resolution? Slapping it all together does not tell the whole story IMHO. I imagine people would find it interesting to know that the higher the resolution, the bigger the difference between the HD4870X2 and the GTX280. After all, this is a high-end part, and running it at anything below 1680x1050 should be considered a crime........

The relative performance and performance per watt/$ would look very different if you applied them to each resolution tested. A lot of work, but worth it, methinks ;)
 
nvidia is not even sending samples to us by the way, so stop with those accusations.

Hmm. I always wondered why you were slow reviewing some cards and fast with others, making me go out to find reviews elsewhere. You have to buy them or wait until someone donates one? Come on Nvidia, give him the cards!!!

Will you be doing more Crysis ones too? Or are you sticking to the

"I shouldn't have to change CVars in a game"

stance? Kinda bad on Crysis's part; I might write to them and see if they can include it in the next patch.

A CrossFire button in an Nvidia-sponsored game! :p

No more patches, remember? Just Warhead.

Your timedemo is actually faster than Guru3D's, by 10 fps on the HD4870, and the difference between GTX280 and HD4870 is also about the same. What you are not getting is CrossFire scaling on the same level as the other review sites: they are at ca. 1.8x, while you have 1.4x.

This might be down to your motherboard, and it's most likely just a driver flaw, as it seems CrossFire needs to be written to support it, or it simply doesn't scale as well on it.

I'm a GTX280 owner myself, and a reviewer too, and I have no thoughts at all about you being biased towards Nvidia, or anyone else for that matter. I'm just pointing out that you are not getting the average scaling, and the reason for that, as stated above, is most likely the combination of P35/CFX/drivers. I've tested 2xHD4870 myself and had scaling above 1.8x on an X38 motherboard, so something is not right with P35 and this HD4870X2. I've also tested the HD3870X2 on a P35, and it also lacked scaling compared to X38, but not all games scaled badly, just some like COD4.

Might be the CPU. I think the latest ATI cards require, or get more benefit from, a faster CPU or a quad than Nvidia's cards do. It's nothing that I can confirm, or that I have tested, just something I figured out looking at the latest reviews.

It's something that some reviewer could test (come on Wizz ;)). At least I'm very interested in the results. It could be interesting to see whether the two GPU architectures are so different that they have very different CPU requirements, and it would definitely be interesting for end users.

It would also demonstrate that the reviews differ because of that and not because of any kind of bias. Pretty much everyone on TPU knows the system used in the bench plays a big role, but until the HD4000/GTX cards I never had the impression that the influence of the system could differ much between architectures; it would just act like a constant multiplier for all cards. Knowing both architectures, it is logical for ATI cards to have a bigger driver overhead, but I never found it to make a difference in the past. Now I think there could be something to it. It's how I explain the differences between reviews. I'd love to see it confirmed.
 
Hmm. I always wondered why you were slow reviewing some cards and fast with others, making me go out to find reviews elsewhere. You have to buy them or wait until someone donates one? Come on Nvidia, give him the cards!!!

NVIDIA doesn't, Zotac does ;)

Just a friendly request, make your next NV card a Zotac.
 
Wouldn't it be better to have a % number per resolution? Slapping it all together does not tell the whole story IMHO. I imagine people would find it interesting to know that the higher the resolution, the bigger the difference between the HD4870X2 and the GTX280. After all, this is a high-end part, and running it at anything below 1680x1050 should be considered a crime........

The relative performance and performance per watt/$ would look very different if you applied them to each resolution tested. A lot of work, but worth it, methinks ;)

i have been thinking about that, but A LOT of people want to look at one graph and know it all. maybe i could make 4 graphs (one per resolution) each for perf summary, perf/$ and perf/w, plus one additional graph summarizing the 4 resolutions.

i wrote me some nifty programs to do all the work, otherwise i'd spend all my life just processing the numbers. we are looking at over 1500 individual benchmark runs displayed in this review. yep, that many! would you have thought that? for example, if you were drawing our graphs by hand and it took you 10 seconds per bar, you would spend over 4 hours just making the graphs.
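W1zzard doesn't describe what his programs actually look like, but the idea is easy to sketch. A hypothetical Python example that turns a table of results into one relative-performance chart per resolution; the card names are real but every number here is made up:

```python
# Hypothetical sketch of this kind of automation: one relative-performance
# bar chart per resolution, generated from a table of average fps results.
import matplotlib.pyplot as plt

# {resolution: {card: average fps across the test suite}} - made-up numbers
data = {
    "1920x1200": {"HD4870X2": 137.0, "GTX280": 104.0, "HD4870": 89.0},
}

for res, scores in data.items():
    baseline = scores["GTX280"]  # express everything relative to one card
    cards = sorted(scores, key=scores.get)
    rel = [100.0 * scores[c] / baseline for c in cards]
    plt.figure()
    plt.barh(cards, rel)
    plt.xlabel("Relative performance (GTX280 = 100%)")
    plt.title(f"Performance summary, {res}")
    plt.savefig(f"perf_{res}.png")  # one graph per resolution
```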
 
I just love how your explanation of why the 4870 X2 isn't a great card is to use a card that hasn't even been made yet (GTX280 55nm). If we're going to play that game, then I'll go ahead and say the GTX280 55nm is crap because of the HD 5880 X4.

And seriously -- seeing as how the 55nm 9800GTX+ was a colossal snore, I don't see why a 55nm GTX280 would be that much different than a 65nm one. :shadedshu

It already isn't a great card; the price/performance ratio blows. As for the 9800GTX+ being a snore, that's neither here nor there: the 55nm G200 is supposed to be clocked higher, and the GTX280 isn't that far away in the first place. All it takes is a factory overclock to claim the throne again. The HD4870X2 is too hot, consumes too much power, and doesn't live up to the hype. It's also not significantly faster than what's available that won't superheat your house. The fact is, in under a month you'd pay for this card twice simply because of the power draw and the AC bill to keep your house cool.
 
i have been thinking about that, but A LOT of people want to look at one graph and know it all. maybe i could make 4 graphs (one per resolution) each for perf summary, perf/$ and perf/w, plus one additional graph summarizing the 4 resolutions.

i wrote me some nifty programs to do all the work, otherwise i'd spend all my life just processing the numbers. we are looking at over 1500 individual benchmark runs displayed in this review. yep, that many! would you have thought that? for example, if you were drawing our graphs by hand and it took you 10 seconds per bar, you would spend over 4 hours just making the graphs.

I would certainly want a graph summarizing resolutions. After all, for most people it is not important that card X performs extremely badly at 2560x1600 if they only play at 1680.
 
Wouldn't it be better to have a % number per resolution? Slapping it all together does not tell the whole story IMHO. I imagine people would find it interesting to know that the higher the resolution, the bigger the difference between the HD4870X2 and the GTX280. After all, this is a high-end part, and running it at anything below 1680x1050 should be considered a crime........

The relative performance and performance per watt/$ would look very different if you applied them to each resolution tested. A lot of work, but worth it, methinks ;)

If that's W1zzard's average across something like 18 benchmarks, is that not more informative than a higher average across, say, just 6 benchmarks?

I agree with your comments about resolution; however, there are some in these forums who have ordered the card and have 19-inch screens, so the masses need to be catered for too, I think.
 
Zehnsucht, not everyone has a fixed-resolution LCD. My CRT Trinitrons scale just fine up to 2048x1536.

1500+ individual benchmarks, W1zzard? Now that's the professionalism I'm talking about, and the reason I rely on TPU reviews when it comes to my hardware purchases.
 
i have been thinking about that, but A LOT of people want to look at one graph and know it all. maybe i could make 4 graphs (one per resolution) each for perf summary, perf/$ and perf/w, plus one additional graph summarizing the 4 resolutions.

i wrote me some nifty programs to do all the work, otherwise i'd spend all my life just processing the numbers. we are looking at over 1500 individual benchmark runs displayed in this review. yep, that many! would you have thought that? for example, if you were drawing our graphs by hand and it took you 10 seconds per bar, you would spend over 4 hours just making the graphs.

Well, one graph would be sweet if it told the absolute truth, but it does not. One good example is the 9800GTX+ vs. the HD4870X2 in F.E.A.R. at 1024x768: here the HD4870X2 is only 50% faster at 3 times the price, but when you move to 1920x1200 it's close to 300% faster. It would completely change the perf/watt and perf/$ picture if you excluded 1024x768/1280x1024, or made one for 1920x1200. The average performance advantage of the HD4870X2 over the GTX280 more than doubles when moving up the resolution ladder.
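To make that concrete, a small Python sketch of fps per dollar at the two resolutions. The fps values are illustrative numbers chosen only to match the percentages above, and the prices are assumptions (about $550 for the X2 and a third of that for the 9800GTX+, per the "3 times the price" remark):

```python
# Perf-per-dollar at two resolutions. Prices and fps are assumptions chosen
# to match the post above (X2 50% faster at 1024x768, ~300% at 1920x1200).
prices = {"9800GTX+": 183.0, "HD4870X2": 550.0}
fear_fps = {
    "1024x768":  {"9800GTX+": 100.0, "HD4870X2": 150.0},  # ~50% faster
    "1920x1200": {"9800GTX+": 40.0,  "HD4870X2": 160.0},  # ~300% faster
}

for res, fps in fear_fps.items():
    value = {card: fps[card] / prices[card] for card in fps}
    ratio = value["HD4870X2"] / value["9800GTX+"]
    print(f"{res}: X2 delivers {ratio:.2f}x the fps per dollar")
# -> 0.50x at 1024x768 (half the value), 1.33x at 1920x1200
```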
 
If that's W1zzard's average across something like 18 benchmarks, is that not more informative than a higher average across, say, just 6 benchmarks?

I agree with your comments about resolution; however, there are some in these forums who have ordered the card and have 19-inch screens, so the masses need to be catered for too, I think.

I don't think the masses with 19" monitors are lining up to buy an HD4870X2 or a GTX280. Some might, but those should not count :banghead:

His average includes 1024x768 and 1280x1024, and that's half of his resolutions. And when half of the benches hardly show any difference between the cards at all, the numbers become meaningless.
 
It already isn't a great card; the price/performance ratio blows. As for the 9800GTX+ being a snore, that's neither here nor there: the 55nm G200 is supposed to be clocked higher, and the GTX280 isn't that far away in the first place. All it takes is a factory overclock to claim the throne again. The HD4870X2 is too hot, consumes too much power, and doesn't live up to the hype. It's also not significantly faster than what's available that won't superheat your house. The fact is, in under a month you'd pay for this card twice simply because of the power draw and the AC bill to keep your house cool.

The 9800GTX+ was clocked higher, and how much higher do you think they will clock the "oh so mighty" 55nm GTX280? It probably won't take the performance crown. People like me would not buy a GTX280, but would go with an HD4870X2 instead, because we like ATI and the card is faster... even if it's only by a small amount. You say the 4870X2 is hot and consumes a lot of power, but what about the GTX series? They aren't exactly the coldest-running, lowest-power cards either. Stop trolling on AMD threads... and open an Nvidia fanclub.
 
You're right, they are not the coolest, but they run cooler than even the 4850 does. You can buy the HD4870 X2, but most users will go with bang for buck. And the fact that by your own admission you wouldn't buy an Nvidia card proves my point: it's the ATI nutjobs that will get this card. I use a 3850 in my computer; it's a decent little card, and when I bought it it was good price/performance.
 
well, it's nice that this has become a fanboi load of nonsense.

look at the scores: in most games the 4870X2 beats the GTX280. done, end of story.

wait, here, let me go through the NV fanboi complaints.

uses more power

response from me:
you're dropping $550 on a video card, i doubt you will be using a shitty PSU

doesn't scale at low res

response from me:
why the sam hell are you using a low-res monitor?

GTX280 55nm beats it

response from me:
umm, no, i bet it won't until it gets clocked higher. funny thing about die shrinks: by themselves they don't change performance, hence a 130nm P4 @ 3GHz with the same cache, FSB etc. performs the same as a 65nm one @ 3GHz.

did i miss any? if i did, just say so
 
It already isn't a great card; the price/performance ratio blows. As for the 9800GTX+ being a snore, that's neither here nor there: the 55nm G200 is supposed to be clocked higher, and the GTX280 isn't that far away in the first place. All it takes is a factory overclock to claim the throne again. The HD4870X2 is too hot, consumes too much power, and doesn't live up to the hype. It's also not significantly faster than what's available that won't superheat your house. The fact is, in under a month you'd pay for this card twice simply because of the power draw and the AC bill to keep your house cool.

The HD4870X2 is 35-45% faster than a GTX280 at 1920x1200 4xAA 16xAF on average, and even faster at higher resolutions. To get the throne back, Nvidia would need a lot more than a clock bump from 55nm; a lot more, and way beyond what a shrink can provide.

If you gamed 24 hours a day for a whole month, the HD4870X2 would cost you less than $15 extra on the power bill, so the extra cost with normal gaming is a third of that at most. And the heat it produces is about the same as five 60W light bulbs.
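A back-of-envelope check of that claim, as a Python sketch. The extra draw and the electricity rate are my assumptions, not figures from the post:

```python
# Monthly power-bill delta for gaming 24h/day. Inputs are assumptions:
# ~190W extra board power vs. a GTX280 under load, and $0.10/kWh.
extra_watts = 190.0          # assumed extra draw under load vs. GTX280
rate_per_kwh = 0.10          # assumed electricity price, $/kWh
hours = 24 * 30              # gaming 24 hours a day for a month

extra_cost = extra_watts / 1000.0 * hours * rate_per_kwh
print(f"${extra_cost:.2f} extra per month")  # -> $13.68, under the $15 cap
```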
 
The HD4870X2 is too hot, consumes too much power, and doesn't live up to the hype. It's also not significantly faster than what's available that won't superheat your house. The fact is, in under a month you'd pay for this card twice simply because of the power draw and the AC bill to keep your house cool.

You know, I recall people saying the exact same thing about the GTX 280 when it came out, yet lots of people like you defended it back then... :rolleyes:
 
Stay on topic and leave personal attacks out of this please.
 
I do have to admit that when I look at reviews, I usually only look at the numbers for the res I use the most (1920x1200), then I go back to look at the others. A card like this is completely pointless at 1680x1050 and below, but it owns the world at 1920x1200 and above. That's what makes the difference to me. Some people just want the best, fastest, biggest, hottest, bloodsucki-est :D thing that's out there at any given time. I've come to categorize myself as one of those ppl :toast:
 
I don't think the masses with 19" monitors are lining up to buy an HD4870X2 or a GTX280. Some might, but those should not count :banghead:

His average includes 1024x768 and 1280x1024, and that's half of his resolutions. And when half of the benches hardly show any difference between the cards at all, the numbers become meaningless.

I agree, but we disagree on the resolutions. I don't think a bench should be eliminated because we guess that the majority of X2 owners will be using 16xx-and-above monitors, when it is a plain fact that the most common resolutions remain those of 17- and 19-inch owners. Now, WE know that it might be fairly foolish to pay this much for a card to game only at those lower resolutions, but the people out there who might just buy this card are not all hardware enthusiasts, and whilst I would agree a large proportion of X2 buyers will be enthusiasts, not all of them will be.
IMO reviews are not just for enthusiasts; if they were, they would only be reaching 5-10% of users.
 