# Sapphire HD 4870 X2 2048 MB



## W1zzard (Aug 10, 2008)

Today AMD released their new HD 4870 X2 graphics cards. They are based on two RV770 GPUs on one PCB. With a total of 1600 shaders and 2 GB GDDR5 memory the card has excellent chances to achieve the goal "fastest graphics card in the world".



----------



## Apocolypse007 (Aug 12, 2008)

This is what I expected. The card finally regains the crown for ATI at a reasonable price premium (about the same as two 4870's). And I'm sure it won't stay over $500 for long.

Nice review as always, Wiz.


----------



## MikeJeng (Aug 12, 2008)

Now there is no reason to buy a GTX280 unless it drops to $300.


----------



## zithe (Aug 12, 2008)

The 2gig is noticeably better than the 1gig. I'm glad they released a 2gig.


----------



## wolf2009 (Aug 12, 2008)

zithe said:


> The 2gig is noticeably better than the 1gig. I'm glad they released a 2gig.



That's not what's happening. The 1 GB result is with one GPU disabled; it shows the performance of a single HD 4870 1 GB.


----------



## Zehnsucht (Aug 12, 2008)

Nice review!


----------



## zithe (Aug 12, 2008)

Thanks for pointing that out. I should go to bed now.... (night lol)


----------



## Animalpak (Aug 12, 2008)

It's a true monster, with very impressive results at high resolutions. I can't believe it; ATI has done awesome work.

You need at minimum a 24" LCD panel to use with it.

THIS IS CROZZFAIAHHHHHH !!!


----------



## p_o_s_pc (Aug 12, 2008)

Nice review, and a long one. Looks to be a killer card, but I won't be getting one; I don't have a use for that much power.


----------



## Megasty (Aug 12, 2008)

Sweet card. "Nice" to see that it still has the windowed 3D problems, though. It will be an excellent buy when the price comes down.


----------



## Whilhelm (Aug 12, 2008)

Awesome performing card. Finally ATI gets back on top. Hopefully the price will drop to $499 after launch, as they said they wouldn't price a card over $500.

Can't wait to get one.


----------



## wolf2009 (Aug 12, 2008)

p_o_s_pc said:


> Nice review, and a long one. Looks to be a killer card, but I won't be getting one; I don't have a use for that much power.



Me too, I am running at 1280x1024, LOL. A Palit HD 4850 non-reference 1 GB (if there is any benefit) will do!


----------



## DonInKansas (Aug 12, 2008)

380W power draw at load?  You could power a small village with that!


----------



## sneekypeet (Aug 12, 2008)

That is a huge load, but I am glad to see the OCZ made it through testing!


----------



## Azazel (Aug 12, 2008)

I want one... but I'm broke...


----------



## AddSub (Aug 12, 2008)

Not too impressive. A terrible overclocker, much like single 4870/50 setups. Hot and power hungry. I figured as much when they announced the preliminary specs recently; this review just confirms it all with some charts.

Although all this could be corrected by appropriate pricing, I still wouldn't go for it, since AMD must have cut corners on component and build quality, what with all their financial troubles.


----------



## farlex85 (Aug 12, 2008)

They took the crown, but rather sloppily it would seem.......


----------



## Laurijan (Aug 12, 2008)

The 380W under peak load means that I would need a new PSU if I want this card... I think my 750W will not be enough if my Q6600 is overclocked.


----------



## btarunr (Aug 12, 2008)

Didn't AMD promise 2x the performance/watt compared to the R680?

Bad. Didn't meet my expectations.


----------



## Duxx (Aug 12, 2008)

btarunr said:


> Didn't AMD promise 2x the performance/watt compared to the R680?
> 
> Bad. Didn't meet my expectations.



Yeah, I agree with you here. I expected a little more, but it's damn sexy looking nonetheless.


----------



## candle_86 (Aug 12, 2008)

It's not worth it over the GTX280. The GTX280 is cheaper, faster at times and slower at others, offers about 86% of the performance overall, consumes less power, and all in all looks like the better card. Unless this thing comes down in price, the GTX280 is the better buy. Now it's Nvidia's turn for a trump card.


----------



## mrw1986 (Aug 12, 2008)

Looks like I might be buying a GTX280 after all... this card is not as impressive as I'd hoped!


----------



## bugmenot (Aug 12, 2008)

*...*

1024x768? Is this 1998?
If you spend $500 on a graphics card with a $100 monitor, you have bigger problems than framerate.
And why was there no 2560x1600?

http://www.guru3d.com/article/radeon-hd-4870-x2-review-crossfire/12
http://www.hexus.net/content/item.php?item=14928&page=11
http://www.hardwarecanucks.com/foru...deon-hd-4870-x2-2gb-video-card-review-17.html


----------



## W1zzard (Aug 12, 2008)

bugmenot said:


> 1024x768? Is this 1998?
> If you spend $500 on a graphics card with a $100 monitor, you have bigger problems than framerate.
> And why was there no 2560x1600?
> 
> ...



Don't look at 1024x768 if you don't like to see those results.

No 2560x1600 because those huge displays are too expensive to buy; only 3 people have donated $10 each so far.


----------



## alexp999 (Aug 12, 2008)

Nice review w1zz! Though to be quite honest I expected it to perform slightly better than that! Seems it still draws a hell of a lot of power. Still glad I purchased my GTX 260. Maybe things will improve with a few driver updates...?


----------



## InnocentCriminal (Aug 12, 2008)

W1zzard said:


> Don't look at 1024x768 if you don't like to see those results.
> 
> No 2560x1600 because those huge displays are too expensive to buy; only 3 people have donated $10 each so far.



Well that answers my question. Shame really as these cards are designed for that sort of resolution.


----------



## CY:G (Aug 12, 2008)

Thanks W1zz for the review. I'm a little underwhelmed; still trying to decide whether to wait for nVidia's answer or plunge for the X2...


----------



## alexp999 (Aug 12, 2008)

CY:G said:


> Thanks W1zz for the review. I'm a little underwhelmed; still trying to decide whether to wait for nVidia's answer or plunge for the X2...



I've got my GTX 260, then if Nvidia comes out with a single-GPU answer, I might do step-up. That's if you can do step-up in the UK...?

What I don't get is why the 48xx series gets so hot, hotter than the GT200 series, when GT200 is 65 nm and 48xx is 55 nm.


----------



## snuif09 (Aug 12, 2008)

Cool, now I can get super overkill for my screen.


----------



## Winterwind (Aug 12, 2008)

bugmenot said:


> http://www.hardwarecanucks.com/foru...-radeon-hd-4870-x2-2gb-video-card-review.html


That review used the Catalyst 8.8 Beta.
X2 and CF performance is much better with Catalyst 8.8b than with Catalyst 8.7.
Sad that the TPU review is with 8.7.


----------



## W1zzard (Aug 12, 2008)

8.7? 8.8 beta? What's the version number? 8.52.2?

edit:

hwcanucks: ATI Catalyst Beta Driver Package # 8.52.2.080722a (8.8 beta) (HD4870 X2) 
tpu: ATI: Catalyst 8.7, 3870 X2 & 4870 X2: 8.52.2

here


----------



## CY:G (Aug 12, 2008)

alexp999 said:


> I've got my GTX 260, then if Nvidia comes out with a single-GPU answer, I might do step-up. That's if you can do step-up in the UK...?
> 
> What I don't get is why the 48xx series gets so hot, hotter than the GT200 series, when GT200 is 65 nm and 48xx is 55 nm.



That's a good question...

I think I might have to wait for nVidia's 260 GX2, though who knows if it's actually going to happen... They might need to go to 55 nm before that...


----------



## Hayder_Master (Aug 12, 2008)

Best review, W1zzard.

Guys, if anyone wants to double-check, here is another link:

http://www.tweaktown.com/reviews/1541/1/sapphire_radeon_hd_4870_x2_in_crossfirex/index.html


----------



## computertechy (Aug 12, 2008)

Cheers for the review, mate.

Looks damn sexy! I just love the whole black idea.

But it didn't perform as expected!

Maybe another driver release in a month! (rolls eyes)


----------



## Scrizz (Aug 12, 2008)

man.......
I'll stick with my 4850 for now; I'm more than impressed with it.


----------



## Tatty_One (Aug 12, 2008)

MikeJeng said:


> Now there is no reason to buy a GTX280 unless it drops to $300.



Hmmmm, well, as W1z said, this is the worst price/performance ever seen. In the UK these cards are £375-£400, that's $725-$775. Taking into account that they are 13-15% faster across the board than the GTX280, and that you can get one of those for £270, that would put the X2's bang-for-buck price at about £300-£310, so I do hope the prices fall!

A very quick card, but not such good value in the UK once again... can you get one in the US for $500?


----------



## alexp999 (Aug 12, 2008)

Tatty_One said:


> Hmmmm, well, as W1z said, this is the worst price/performance ever seen. In the UK these cards are £375-£400, that's $725-$775. Taking into account that they are 13-15% faster across the board than the GTX280, and that you can get one of those for £270, that would put the X2's bang-for-buck price at about £300-£310, so I do hope the prices fall!
> 
> A very quick card, but not such good value in the UK once again... can you get one in the US for $500?



They are available for pre-order on ebuyer:

Powercolor is £344.98

Sapphire is £354.98

$549 -> £288 -> £338 (inc VAT)
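That arithmetic can be sketched as follows; the ~$1.90/£ exchange rate and the 17.5% UK VAT rate are assumptions (typical mid-2008 values), not figures from the thread:

```python
# Sketch of the $549 -> GBP conversion above.
# Assumed: ~$1.90 per GBP and the then-current 17.5% UK VAT rate.
USD_PER_GBP = 1.90
UK_VAT = 0.175

def usd_to_gbp(usd_price):
    """Return (ex-VAT, inc-VAT) GBP prices for a USD sticker price."""
    ex_vat = usd_price / USD_PER_GBP
    return ex_vat, ex_vat * (1 + UK_VAT)

ex_vat, inc_vat = usd_to_gbp(549)
print(f"£{ex_vat:.0f} ex VAT, £{inc_vat:.0f} inc VAT")  # roughly £289 / £340
```

Rounding and the exact daily exchange rate account for the small gap to the £288/£338 figures quoted above.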


----------



## newtekie1 (Aug 12, 2008)

Terrible price to performance.  This card just isn't worth the money, I was really hoping they would keep the price under $500 like they promised.  The price will go down in a few weeks, once the hype dies, but for right now, this card is a terrible buy.  You can just go out and get two 4870's for $500 and save yourself $50.


----------



## mdm-adph (Aug 12, 2008)

Laurijan said:


> The 380W under peak load means that I would need a new PSU if I want this card... I think my 750W will not be enough if my Q6600 is overclocked.



Aye, but isn't that 380W _system_ load under peak? You really think your 750W couldn't handle that, even with a quad-core?



newtekie1 said:


> Terrible price to performance.  This card just isn't worth the money, I was really hoping they would keep the price under $500 like they promised.  The price will go down in a few weeks, once the hype dies, but for right now, this card is a terrible buy.  You can just go out and get two 4870's for $500 and save yourself $50.



Hell, or even two 4850's for $300.  If they can get the price down on this card, it'd be a lot better.  Shame, shame, ATI -- you promised $500.  At that price it probably woulda beat the GTX 280 in price/performance.


----------



## ShadowFold (Aug 12, 2008)

The HD 4850X2 is gonna be the best


----------



## newtekie1 (Aug 12, 2008)

ShadowFold said:


> The HD 4850X2 is gonna be the best



Not really, at $400 it too is overpriced compared to just getting two 4850s.  It would need to be in the $350 range or lower for it to be worth the price.


----------



## Tatty_One (Aug 12, 2008)

alexp999 said:


> They are available for pre-order on ebuyer:
> 
> Powercolor is £344.98
> 
> ...



Right, still damn pricey though. I looked at Overclockers, Novatech and Dabs... if you add VAT to the US price, you must first deduct their sales tax, which I assume is included in their price?

Ahhh, forgot to say: if the X2 is 15% faster and it overclocks 6%... then if a GTX280 owner can overclock theirs by more than 21%... does that make the GTX faster??
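That overclocking question is simple arithmetic, sketched below; it assumes performance scales linearly with clock speed, which rarely holds in practice:

```python
# Back-of-the-envelope check of the overclocking question above.
# The 15% lead and 6% overclock are the thread's figures; linear
# clock-to-performance scaling is an assumption.
x2_stock_lead = 1.15   # X2 vs. a stock GTX280
x2_oc_gain = 1.06      # X2 overclocks ~6%

x2_overclocked = x2_stock_lead * x2_oc_gain   # ~1.219x a stock GTX280
gtx280_oc_needed = x2_overclocked - 1.0       # overclock needed to catch up

print(f"GTX280 would need a ~{gtx280_oc_needed:.0%} overclock to match")
```

So the 21% guess is roughly right: under that (generous) scaling assumption, a GTX280 would need about a 22% overclock just to match an overclocked X2.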


----------



## zithe (Aug 12, 2008)

newtekie1 said:


> Terrible price to performance.  This card just isn't worth the money, I was really hoping they would keep the price under $500 like they promised.  The price will go down in a few weeks, once the hype dies, but for right now, this card is a terrible buy.  You can just go out and get two 4870's for $500 and save yourself $50.



Retail chooses the final price, not ATI/AMD. ATI/AMD kept its promise I think.


----------



## Tatty_One (Aug 12, 2008)

zithe said:


> Retail chooses the final price, not ATI/AMD. ATI/AMD kept its promise I think.



Lol... retail chooses the final price based on the price they are charged by the card maker, whose price is determined by ATI/AMD.


----------



## computertechy (Aug 12, 2008)

Tatty_One said:


> Lol... retail chooses the final price based on the price they are charged by the card maker, whose price is determined by ATI/AMD.




touché


----------



## LiveOrDie (Aug 12, 2008)

I'll stay with my GTX 280. This card looks like it wasn't all it was cracked up to be; even in the DX10 review, in most games the 280 comes close or just edges past it. Yeah, you might get 40 more FPS in CoD4, but who cares; 120 FPS is playable, I think, lol. I'd say the 280 is the better card because it comes so close with only one GPU.

DX10 review


----------



## mdm-adph (Aug 12, 2008)

Live OR Die said:


> I'll stay with my GTX 280. This card looks like it wasn't all it was cracked up to be; even in the DX10 review, in most games the 280 comes close or just edges past it. Yeah, you might get 40 more FPS in CoD4, but who cares; 120 FPS is playable, I think, lol. I'd say the 280 is the better card because it comes so close with only one GPU.
> 
> DX10 review



Blegh -- some people.  So your GTX 280 is better because it still plays at "playable levels" even though the 4870 X2 is faster?  Funny how that didn't apply to ATI cards when Nvidia was in the lead.


----------



## mdm-adph (Aug 12, 2008)

zithe said:


> Retail chooses the final price, not ATI/AMD. ATI/AMD kept its promise I think.



Then we just have to get the cards direct from ATI again.  However, if I recall, they always charged out the wazoo for them.  :shadedshu


----------



## Tatty_One (Aug 12, 2008)

mdm-adph said:


> Blegh -- some people.  So your GTX 280 is better because it still plays at "playable levels" even though the 4870 X2 is faster?  Funny how that didn't apply to ATI cards when Nvidia was in the lead.



I hear what you're saying, but better and faster can mean two different things. If he meant that he considers the GTX280 to be "better" for him because, just like the X2, it does everything he wants of it without struggling, but does it cheaper, with less heat and power draw, then that's his choice... he certainly couldn't say his is "faster"... we know the answer to that one!


----------



## mdm-adph (Aug 12, 2008)

Tatty_One said:


> I hear what you're saying, but better and faster can mean two different things. If he meant that he considers the GTX280 to be "better" for him because, just like the X2, it does everything he wants of it without struggling, but does it cheaper, with less heat and power draw, then that's his choice... he certainly couldn't say his is "faster"... we know the answer to that one!



Well, if he would've said it was better *for him*, I woulda agreed, but he didn't. 

Hell, if that's the case, my HD 3650 is "best" for me, because it takes no power at all and plays games fine on 1024x768.


----------



## chinese_farmer (Aug 12, 2008)

So, is someone going to donate a 30" monitor for W1zzard to test on? Otherwise there's seriously no point in this review without 2560x1600 and 4xAA, or else you don't see the true power of the card.

Just saying, because I took a look at some other review sites testing higher resolutions!


----------



## PCpraiser100 (Aug 12, 2008)

They should've used a mobo with PCIe 2.0, 'cause the bandwidth on that card could be a bottleneck and beyond, lol.


----------



## robspierre6 (Aug 12, 2008)

AddSub said:


> Not too impressive. Terrible overclocker, much like the single 4870/50 setups. Hot and power hungry. I figured as much when they announced the preliminary specs recently. This review just confirms it all with some charts.
> 
> Although, all this can be corrected by appropriate pricing I still wouldn't go for it, since AMD must have cut corners on the component and build quality front, what with all their financial troubles.



I just read the worst review of the 4870x2 on this site.
I think the one who posted the numbers is probably an Nvidia guy.
The 4870x2 beats the GTX 280 in SLI.

http://www.techreport.com
http://www.tomshardware.com


----------



## btarunr (Aug 12, 2008)

robspierre6 said:


> I just read the worst review of the 4870x2 on this site.
> I think the one who posted the numbers is probably an Nvidia guy.



He's the author of ATITool. I find it a realistic review.


----------



## W1zzard (Aug 12, 2008)

PCpraiser100 said:


> They should've used a mobo with PCIe 2.0, 'cause the bandwidth on that card could be a bottleneck and beyond, lol.



PCIe 2.0 bandwidth has no effect on performance, as numerous tests all over the web have shown.


----------



## W1zzard (Aug 12, 2008)

robspierre6 said:


> I just read the worst review of the 4870x2 on this site.
> I think the one who posted the numbers is probably an Nvidia guy.
> The 4870x2 beats the GTX 280 in SLI.



I'm not an Nvidia or ATI guy; I just wrote up my experience with the card. That's why there are multiple reviews on the net... read them all and come to your own conclusions.


----------



## Megasty (Aug 12, 2008)

robspierre6 said:


> I just read the worst review of the 4870x2 on this site.
> I think the one who posted the numbers is probably an Nvidia guy.
> The 4870x2 beats the GTX 280 in SLI.



There's nothing wrong with this review. You have to remember that this is an ATI launch card; everything that comes with that means immature drivers and unused technologies. The card still pwns everything out there, which is a major accomplishment in itself. I thought it would lose to NV in many of the games just because of the usual driver issues, but that wasn't the case. The card will improve at AMD's own pace. Multiple reviews are what give us the best indication of what this card can do.


----------



## Darkrealms (Aug 12, 2008)

robspierre6 said:


> I just read the worst review of the 4870x2 on this site.
> I think the one who posted the numbers is probably an Nvidia guy.
> The 4870x2 beats the GTX 280 in SLI.


Even this review did not show it to be the killer I was hoping for.


W1zzard said:


> I'm not an Nvidia or ATI guy; I just wrote up my experience with the card. That's why there are multiple reviews on the net... read them all and come to your own conclusions.


*Thanks for another thorough review W1zz!*


Megasty said:


> There's nothing wrong with this review. You have to remember that this is an ATI launch card; everything that comes with that means immature drivers and unused technologies. The card still pwns everything out there, which is a major accomplishment in itself. I thought it would lose to NV in many of the games just because of the usual driver issues, but that wasn't the case. The card will improve at AMD's own pace. Multiple reviews are what give us the best indication of what this card can do.


I hope you are right about the drivers. This wasn't quite the launch I was hoping for.


I was really hoping for some higher performance over the 9800GX2 and the GTX280. At least those two cards traded blows with it, and neither was consistently on the same level as the 4870x2.

Remember, people, this is a dual-GPU card; Nvidia would never charge only $550 for something like that. I just hope it's enough to either make Nvidia drop prices or come out with the next gen (including die shrinks; listening, Nvidia?) or an X2 of their own.


----------



## robspierre6 (Aug 12, 2008)

W1zzard said:


> pcie 2.0 bandwidth has no effect on the performance as numerous tests all over the web have shown


Where did you get that from?
At least confirm your info before posting it.
And yes, this is the worst review of the 4870x2.


----------



## farlex85 (Aug 12, 2008)

robspierre6 said:


> Where did you get that from?
> At least confirm your info before posting it.
> And yes, this is the worst review of the 4870x2.



So what if it's the worst review that you've seen? Wiz makes valid points and gives it what he thinks it deserves. Go write your own review if you want. The tech world is very subject to hype and bias, Wiz conducts some of the most unbiased reviews of any I've seen around, based almost completely on numbers until the end where he gives his 2 cents. He has given raving reviews to ati and nvidia cards. Check your "infos" before posting.


----------



## W1zzard (Aug 12, 2008)

farlex85 said:


> Since you seem to like tom's hardware



He is just spamming links to other sites to boost their search-engine rating.


----------



## farlex85 (Aug 12, 2008)

W1zzard said:


> He is just spamming links to other sites to boost their search-engine rating.



Good call. That was sneaky, I didn't pick up on it, what have I done, I fell right into the trap, noooo........


----------



## W1zzard (Aug 12, 2008)

I think even Tom's tested PCIe 2.0 vs. 1.1 bandwidth a while back; look on their site.

PCIe bandwidth is the available bus bandwidth. The bus is only used when data is being moved between GPU<->CPU, or between GPUs in a multi-GPU configuration (not on an X2 card; those have their own bridge chip). So PCIe 2.0 essentially only gives you an advantage during game loading, when texture and level data is sent from HDD -> memory -> CPU -> GPU. If you have something like 32 MB of VRAM and use HyperMemory, it is also useful (not the case here either).
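As a rough illustration of why the bus mainly matters for one-off transfers like level loading, here is a sketch using theoretical x16 peak rates (~4 GB/s for PCIe 1.1, ~8 GB/s for PCIe 2.0); real transfers run well below these, and the 512 MB texture figure is a hypothetical, not a measured value:

```python
# Best-case time to push a level's worth of textures across the bus.
# Rates are theoretical x16 peaks; real-world throughput is lower.
LINK_GB_PER_S = {"PCIe 1.1 x16": 4.0, "PCIe 2.0 x16": 8.0}

def upload_ms(megabytes, gb_per_s):
    """Theoretical milliseconds to transfer `megabytes` at `gb_per_s`."""
    return megabytes / 1024.0 / gb_per_s * 1000.0

textures_mb = 512  # hypothetical level's worth of textures
for link, rate in LINK_GB_PER_S.items():
    print(f"{link}: {upload_ms(textures_mb, rate):.0f} ms")
```

Halving a roughly 125 ms load-time transfer is imperceptible, which is consistent with per-frame benchmarks showing no FPS gain from PCIe 2.0 for this card.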


----------



## W1zzard (Aug 12, 2008)

farlex85 said:


> Good call. That was sneaky, I didn't pick up on it, what have I done, I fell right into the trap, noooo........



That's actually a good idea... go spam TPU links on other hardware sites.


----------



## Polaris573 (Aug 12, 2008)

Or at least digg it.

http://digg.com/hardware/AMD_s_HD_4870_X2_2048_MB_Fastest_graphics_card_in_the_world


----------



## Megasty (Aug 12, 2008)

farlex85 said:


> Good call. That was sneaky, I didn't pick up on it, what have I done, I fell right into the trap, noooo........



I remember that article from a month ago. They didn't have anything new on the launch card, just the engineering sample. Those numbers are just too out of sync with reality. Even if that card pulled out all the stops, it still couldn't come close to those numbers. Maybe driver updates will bring the card to those levels, but not with a premature ES.


----------



## Chewy (Aug 12, 2008)

bugmenot said:


> 1024x768? Is this 1998?
> If you spend $500 on a graphics card with a $100 monitor, you have bigger problems than framerate.
> And why was there no 2560x1600?
> 
> ...




How many people actually play at 2560x1600? Very few. Looking at the 1680x1050 and 1920x1080 results, those elite few, who I'm sure know enough about computers and their components, would know this card will only move further up in performance compared to the competition... maybe not in Nvidia-coded games, especially Crysis.

A couple more driver releases and this card will do even better... aren't they still on beta drivers? Or did they release new ones yet?


----------



## Tatty_One (Aug 12, 2008)

robspierre6 said:


> I just got the worst review of the 4870x2 from this site.
> I think the one who postd the numbers is probably a nvidiaguy.
> The 4870x2 beats the 280gtx in sli.



Perhaps because this is one of the few truly "unbiased" reviews aimed at the mass user... as only 4% of PC users have an SLI rig, that's fairly immaterial to the masses... mainstream users won't be buying the X2 or 280 in any case; these are enthusiast cards that take only about 10-15% of the market share.


----------



## trt740 (Aug 12, 2008)

An overclocked GTX 280 will come very close to those results. Still nice to see AMD on top, but not for long; a 55 nm GTX 280 could easily make up 14 percent. Yet, to be fair, with driver improvements it will get faster.


----------



## trt740 (Aug 12, 2008)

MikeJeng said:


> Now there is no reason to buy a GTX280 unless it drops to $300.



Not sure you're right. At $384 at the low end, a GTX 280 is about $175 cheaper than a 4870x2, and when overclocked it will beat the 4870x2 in some things and come very close in others, saving you some money. Still, if you want the best, AMD is the KING. ALL HAIL THE KING!!!!!!!!!

Also, after reading this review, which shows the 4870x2 destroying a GTX 280 http://www.hardwarecanucks.com/foru...-radeon-hd-4870-x2-2gb-video-card-review.html I wonder why the difference between these reviews is so big? I always thought Hardware Canucks was an unbiased review site, but I trust W1zzard 100 percent, so I wonder how these numbers are so different?

W1zzard, any insight on this review I posted? I believe the owner of that site belongs to Extreme and is a good guy. I'm not spamming; I really only support this forum, but I'm scratching my head.


----------



## Tatty_One (Aug 12, 2008)

trt740 said:


> Not sure you're right. At $384 at the low end, a GTX 280 is about $175 cheaper than a 4870x2, and when overclocked it will beat the 4870x2 in some things and come very close in others, saving you some money. Still, if you want the best, AMD is the KING. ALL HAIL THE KING!!!!!!!!!



Queen.....I thought she was female, my ATi cards have always refused to do what I tell them and played me up like a woman!


----------



## L|NK|N (Aug 12, 2008)

chinese_farmer said:


> So, is someone going to donate a 30" monitor for W1zzard to test on? Otherwise *there's seriously no point in this review without 2560x1600 and 4xAA, or else you don't see the true power of the card.*


Actually, there is a point, because I for one have limited space, let alone funds, for a 30" monitor, so I appreciate more "real world" numbers, especially since the majority of gamers are still gaming at 1680x1050 or below!



chinese_farmer said:


> Just saying, because I took a look at some other review sites testing higher resolutions!


What would be the point if everyone had identical testing configurations?



robspierre6 said:


> I just read the worst review of the 4870x2 on this site.
> I think the one who posted the numbers is probably an Nvidia guy.
> The 4870x2 beats the GTX 280 in SLI.


Wow. Just wow. Not only are you ASSuming, you are questioning the most dedicated individual in this community. Let him send YOU the bill. 

Great review, W1zzard.


----------



## 3dchipset (Aug 12, 2008)

Awesome work, MrW1zzard.

I'm so glad you tested a ton of games, not just the typical ones that ATI or NVIDIA tell you to. (Yes, they have review guides that usually share what software they "recommend" using during testing.)

I'm also loving what you've done by disabling one GPU; technically you get to see the "potential" of a 1 GB 4870. Great review! Keep up the non-traditional games for testing, and not the "recommended" ones!

3DChipset.com


----------



## trt740 (Aug 12, 2008)

Tatty_One said:


> Queen.....I thought she was female, my ATi cards have always refused to do what I tell them and played me up like a woman!



I should have known. Darn Brits and Queens... ah heck, all *hail the Queen!!!!!!*


----------



## Darkrealms (Aug 12, 2008)

Tatty_One said:


> Queen.....I thought she was female, my ATi cards have always refused to do what I tell them and played me up like a woman!


ROFL!


----------



## Megasty (Aug 12, 2008)

Oh god, Sparta is a queen and hell has frozen over; we're all friggin' doomed.


----------



## WarEagleAU (Aug 12, 2008)

Awesome review. I will admit I was a tad disappointed, but only because I had my hopes up uber high. It's still the fastest, and still a beast. And ridiculously priced ::Roll:: The GTX280/260 were ridiculously priced as well, but they came down once ATI released their monster 4850/4870 cards.


----------



## KainXS (Aug 12, 2008)

Man, I wish I could afford one right now.


----------



## AddSub (Aug 12, 2008)

TechReport is okay, at least as far as their testing methodology and accuracy are concerned, but Tom's Hardware lost my respect when they did that whole "Hot Spot" article/video back in 2001, the one where they removed the heatsinks from running CPUs to see what exactly happens and how various CPUs cope. Pure hardware porn. An example of the special sort of junk review and brand-name mud-slinging that real professional sites should stay away from.

Also, Guru3D reviews? I've never seen Guru3D give a thumbs down to any GPU. For Guru3D it's all good and thumbs-up, short of a video card starting a fire and burning down their town or something! Guru3D = easily impressed.

Very thorough review W1zzard. Real professional. This type of stuff keeps me coming back to TPU.


----------



## AsRock (Aug 12, 2008)

farlex85 said:


> *So what if it's the worst review that you've seen?* Wiz makes valid points and gives it what he thinks it deserves. Go write your own review if you want. The tech world is very subject to hype and bias, Wiz conducts some of the most unbiased reviews of any I've seen around, based almost completely on numbers until the end where he gives his 2 cents. He has given raving reviews to ati and nvidia cards. Check your "infos" before posting.





Funny, it's the "worst" and it still beat the 280, lol... Thanks for your time, W1zz.


----------



## W1zzard (Aug 13, 2008)

I clearly stated that it's the fastest VGA card in the world, yet there are other things that also make up a good video card (at least for some users).

If you have infinite money, don't care about power or noise, don't want to overclock and never use windowed 3D, then get two 4870 X2's.

Alternate suggestion: go out, find a bunch of nice girls and enjoy your infinite money - better than gaming.


----------



## KainXS (Aug 13, 2008)

W1zzard said:


> Alternate suggestion: go out, find a bunch of nice girls and enjoy your infinite money - better than gaming.



I second that


----------



## paulo7 (Aug 13, 2008)

Nice girls????? Never seen one of them ;-) Now sexy ones, that's what you should be talking about! Only two users have donated then? Pretty crazy; you must make money off advertising? Or does that only cover the cost of running TPU?


----------



## robspierre6 (Aug 13, 2008)

Maybe it's because of the system setup they used. But this review actually destroys the card.
I mean, have you people checked the 4870x2 reviews at TechReport and THG?

Does Nvidia pay TechPowerUp to do so?

"Just wondering"


----------



## Kursah (Aug 13, 2008)

Great review W1Z, I wish I had time to read it this morning, but had to work early...

I've been interested in all the hype I've heard about this card; so many GTX 2xx vs HD 48xx threads and bickering. This card was supposed to quell it all with the supreme conquering of all cards. While it's impressive, I'm also disappointed... but I know power and speed come at a price: not only cash, but size, heat and power consumption. I'm interested to see what happens when sideband is enabled, whether it will truly give a performance boost or is a technology implemented a generation or two too early, something that may only be used later on, after this card fades into the sunset.

Good job, ATI, for making the King Kong graphics card of 2008! Those who've been waiting for it, I'm sure, will be happy and content with the performance. Even 1 GB/1 GPU performance was pretty good; kind of interesting that a 4870 512 MB tended to be faster by a tad, though. That's probably because the bridge chip and other dual-GPU design choices aren't as efficient in single-GPU mode? Who knows; even at that, still good performance.

I can see the bench hounds the world round going crazy, lol! Hope this card treats all of you well who decided to purchase one. Me, I'll be fine with my GTX260; sure, it's not as fast as the X2, but it doesn't need to be for what I do, and it's also not as far behind as some had claimed either. But as with every new product, future drivers and support can still change the game!

Again, great job on the review, W1Z. ATI, good job too; a tad sloppy IMO, but it got the job done, and as long as there aren't mass failures and driver bugs, it'll be a true winner that is sure to drop in price within a couple of months. Good to see ATI on top of the heap, even if it took two GPUs at this point, and even though I'm using NV cards at the moment. This is what the market needed; the coin flip has to go well for the other guy every once in a while to get the competition stirring a little more. It's not quite as uber as many hoped, but it's not a total loss either: it came, it still conquered (mostly), and I'm sure it has more to offer later on!


----------



## ShadowFold (Aug 13, 2008)

robspierre6 said:


> Maybe it's because of the system setup they used. But this review actually destroys the card.
> I mean, have you people checked the 4870x2 reviews at techreport and thg?
> 
> Does nvidia pay techpowerup to do so?
> ...



Well, why the hell are you here then? Go over there and stop posting the links; it's that simple, dude.


----------



## Darkrealms (Aug 13, 2008)

W1zzard said:


> i clearly stated that its the fastest vga card in the world, yet there are other things that also make up a good video card (at least for some users).
> 
> if you have infinite money, dont care about power or noise, dont want to overclock and never use windowed 3d then get two 4870 x2's.
> 
> *alternate suggestion: go out, find a bunch of nice girls and enjoy your infinite money - better than gaming*


Wow that is sig worthy, LoL.


robspierre6 said:


> Maybe it's because of the system setup they used. But this review actually destroys the card.
> I mean, have you people checked the 4870x2 reviews at thg and techreport?
> 
> Does nvidia pay techpowerup to do so?
> ...


The best Nvidia does is provide testing samples, same as ATI (err, actually that would be the manufacturer, huh; guess they don't do anything then . . .)
Try some actual links, not just repeatedly hitting their homepages. . . .


----------



## AsRock (Aug 13, 2008)

Kursah said:


> Great review W1Z, I wish I had time to read it this morning, but had to work early...
> 
> I've been interested behind all the hype I've heard of this card, so many gtx 2xx vs hd48xx threads and bickers, this card was supposed to quelm it all, with the supreme conquering of all cards. While it's impressive, I'm also dissapointed...but I know power and speed come at a price..not only cash, but size, heat, power consumption. I'm interested to see what happens when side-band is enabled, whether or not it will truly give a performance boost or is a technology implemented a generation or two too early, and something that may only be used later on after this card fades in the sunset.
> 
> ...



The only reason I see the 4870 X2 not looking as good is NV dropping their prices, because you know they would have lost sales otherwise. I'll stick with my ways, and I would still prefer to drop the $500 on the 4870 X2 rather than the 280, what with NV ripping people off until it's time for an ass whooping.

I think the card's got a hell of a lot more performance left to give if companies would get off their asses and support it more.


----------



## robspierre6 (Aug 13, 2008)

Well, I think the review didn't show what the card is capable of. Nothing against the wizard; it was probably because of the system setup (the PCI-Express 1.1 slot and the CPU).
Anyway, I believe the wizard did his best with the card.


----------



## AsRock (Aug 13, 2008)

robspierre6 said:


> Well, I think the review didn't show what the card is capable of. Nothing against the wizard; it was probably because of the system setup (the PCI-Express 1.1 slot and the CPU).
> Anyway, I believe the wizard did his best with the card.



The only thing I have against W1z's video card reviews is that he does not include the 2900 in them.

One thing's for sure: you can trust them more than, say, a lot of others. If you don't like them, move along.


----------



## theonetruewill (Aug 13, 2008)

robspierre6 said:


> Where did you get that from?
> At least confirm your info before posting it.
> And yes, this is the worst review of the 4870x2.



Oh goodness, what is that noise? Sounds like sirens to me; wait, could it be? Oh wow, it's the 'Twat alert!' Yay!....................moron

Thanks for the review- shame about the power draw, it is a little insane. I definitely want to see Nvidia's response.


----------



## PCpraiser100 (Aug 13, 2008)

Megasty said:


> There's nothing wrong with this review. You have to remember that this is an ATi launch card. Everything that's included with that are immature drivers & unused technologies. The card still pwns everything that's out there which is a major accomplishment in itself. I thought it would lose to NV in many of the games just because of the usual drivers issues but that wasn't the case. The card will grow on AMD's own accord. Multiple reviews is what give us the best inclination of what this card can do.



Yeah, in fact the R700 hasn't even unleashed its full potential yet, as drivers are still an issue. Soon ATI's cards will leave a scar on Nvidia instead of a bruise, once they get the driver problems solved.


----------



## candle_86 (Aug 13, 2008)

As long as it remains at its current price, it's a horrid price/performance ratio. Also, the 55 nm GTX280 that is supposed to be clocked significantly higher will likely topple it; the GTX280 isn't too far behind the R700 in all honesty, so ATI has some gloating room for now. Anyone remember what Nvidia does?

x1800XT -> 7800GTX 512 -> x1900XTX -> 7900GTX -> x1950XTX -> 7950GX2

same here

HD3870 X2 -> 9800GX2 -> HD4870 -> GTX280 -> HD4870 x2 -> GTX280 55nm

Nvidia doesn't like to loose


----------



## btarunr (Aug 13, 2008)

The irony... nobody cried "horrible price/performance" so loudly when the 8800 Ultra sold for $700 while the 8800 GTX sold for $500, or even when the GTX 280 gave you nearly the same performance (slightly higher/debatable) as the 9800 GX2 for $650 at launch.


----------



## candle_86 (Aug 13, 2008)

Everyone did, BTA; everyone saw how overpriced the Ultra was, and most hated it.

As for the 9800GX2 vs the GTX280, it's the simple fact that the GTX280 is a single card, uses less power, and allows SLI; the 9800GX2 does not.


----------



## btarunr (Aug 13, 2008)

candle_86 said:


> The 9800GX2 vs the GTX280 though is the simple fact the GTX280 is a single card and uses less power and allows SLI the 9800GX2 does not



l2read   9800GX2 is very much a single card, and allows SLI



candle_86 said:


> NVidia doesn't like to loose



On a lighter note, "I'd like to loosen myself up" means "I need to pee/dump"... applying that to NVIDIA... erm... nevermind.   I think you mean 'lose', not 'loose'.


----------



## Black Hades (Aug 13, 2008)

Very good review; quite the impressive card, but I'm waiting for the 1 GB version of the HD4870 with enhanced cooling. Now that is the weapon of choice for the masses. 

P.S.
Why doesn't somebody ban this robspierre6 dude already? He's still at it, spamming like crazy even after being warned:shadedshu


----------



## AsRock (Aug 13, 2008)

Black Hades said:


> Very good review, quite the impressive card ,but I'm waiting for the 1Gb version of HD4870 with enhanced cooling. Now that is the weapon of choice for the masses.
> 
> P.S.
> *Why doesnt somebody ban this robspierre6 dude already, he's still at it spamming like crazy even after being warned:shadedshu*



Yeah, please do, before the thread gets closed.


----------



## InnocentCriminal (Aug 13, 2008)

theonetruewill said:


> Oh goodness what is that noise? Sounds like sirens to me, wait could it be? Oh wow it's the 'Twat alert!" Yay!....................moron



That made me laugh.


----------



## Tatty_One (Aug 13, 2008)

robspierre6 said:


> Maybe it's because of the system setup they used. But this review actually destroys the card.
> I mean, have you people checked the 4870x2 reviews at thg and techreport?
> 
> Does nvidia pay techpowerup to do so?
> ...



You continue to go around every 4870x2 thread "trolling" the same comments with the same links (as well as all the other video card threads), you continue to select the reviews that most support what you want to hear, and you continue to ignore what members are saying to you. So, if you will forgive me just this once, I will repeat basically what about 30 people have already said to you in various threads.......

Most reviews have different test rigs. Most reviews use varied and sometimes different benches and games, bench at a range of resolutions (not always the same), and use different levels of AA/AF at those resolutions, so you are bound to get differing results. Now, W1z tends to use mid-range hardware for the testbed (because the majority of PC users have mid-range setups), he uses mostly (but not exclusively) common resolutions with sensible detail levels, and he tests and compares over a HUGE range of benches and games, probably more than you will see anywhere else. The idea behind this is simple: you can pick out the resolutions you play at, the detail levels you like, and the games you play from a W1z review. It's more or less all there, and this allows YOU to make a sensible conclusion about how the card will perform for YOU. This, IMO, is so much better than a site knowing, for example, that a card's strengths are without AA/HDR enabled, and so not enabling it in order to show the card as a strong performer, or vice versa. I am not suggesting that your favorite reviewers do that, just giving an example; the more widespread a reviewer is in his use of resolutions, games, benches and detail levels, the better and more informed the review, in my opinion.

These comments are not intended to defend W1z's reviewing skills; he certainly does not need me to do that. To call a person biased or on the payroll of NVidia is slanderous in itself, especially if you knew his background and history in development and his work on ATITool, among numerous other things for the hardware community.

So, at the risk of receiving an infraction, might I suggest that if you clearly don't like the reviews here, or many of the comments posted by other TPU members in the countless video card threads you troll, and you are so supportive of one or two other hardware sites, you go take a quick jog along to them and join their clearly fantastic sites, and stop boring the ar*e off me by trolling anything anti-NVidia you can dream up.    I am all for healthy debate, and I certainly don't think NVidia has all the answers; I also like to see ATi giving them a  from time to time, because that is what caused me to spend an awful lot less than I would have done on the GTX260 I just bought. But you, my friend, are neither constructive nor unbiased, and that, IMO, contributes little to these forums.


----------



## LiveOrDie (Aug 13, 2008)

mdm-adph said:


> Blegh -- some people.  So your GTX 280 is better because it still plays at "playable levels" even though the 4870 X2 is faster?  Funny how that didn't apply to ATI cards when Nvidia was in the lead.



I'm saying it is a better card for me because it runs any game I have on high settings, and I'm thinking of buying a 2nd one. There's no way this card can come close to 2x 280s; look at the scores of the ATI card, it's lacking some good drivers atm.


----------



## Ourasi (Aug 13, 2008)

I was just wondering how you could mess up the COD4 benchmark so horribly, and not even comment on it.

You are actually not using 2 cores at all; the difference between the HD4870 and HD4870x2 is under 6 fps, and that's just downright silly. Anyone with any experience at all with these HD4xxx cards would know you completely made a mess of this game bench, since COD4 has scary good scaling in Crossfire. You should have seen about 130-140 fps+ in 1920x1200, not 76 fps. When every review site on mother earth gets double the FPS compared to the GTX280/HD4870, and any owner of 2x HD4870 gets Crossfire to work and scale at about 2x, some alarm bells should start ringing, mate. I must say the work you did on COD4 with this HD4870x2 was really sloppy..

If you tested the HD4870x2 in single mode first, then turned on CFX and ran the bench again, it's no wonder you got results like these... Do COD4 again please, and do it right this time; that means Crossfire "ON", with the mandatory reboot after you enable Crossfire... Getting Crossfire to work in this game is as easy as a walk in the park. I think you can do it, like everybody else did....


----------



## W1zzard (Aug 13, 2008)

Ourasi said:


> I was just wondering how you could mess up the COD4 benchmark so horrible, and not even comment on it.
> 
> You are actually not using 2 cores at all, the difference between HD4870 and HD4870x2 is under 6fps, and thats just downright silly. Anyone with any experience at all on these HD4xxx card would know you completely made a mess of this gamebench, since COD4 have scary good scaling in Crossfire. You should have seen about 130-140fps+ in 1920x1200, not 76fps. When every reviewsite on mother earth get double the FPS compared to GTX280/HD4870, and any owner of 2xHD4870 get Crossfire to work and scale at about 2x, some alarmbells should start ringing, mate. I must say the work you did on COD4 with this HD4870x2 was really sloppy..
> 
> If you tested the HD4870x2 in single mode first, then turned on CFX, and ran the bench again, no wonder you got results like these... Do COD4 again please, and do it right this time, and that means Crossfire "ON", do the mandatory reboot after you enable Crossfire... Getting Crossfire to work in this game is easy as a walk in the park. I think you can do it, like everybody else did....



let me look into this

edit: seems the multi-gpu configuration setting in cod 4 was not set to enabled. if i didnt notice it, take a guess at how many end-users will forget about it and see their card not performing at its best

thanks for bringing this up, i will have revised cod 4 scores later today


----------



## J-Man (Aug 13, 2008)

HIS 4870 X2 arrives today.


----------



## alexp999 (Aug 13, 2008)

W1zzard said:


> let me look into this
> 
> edit: seems the multi-gpu configuration setting in cod 4 was not set to enabled. if i didnt notice it, take a guess at how many end-users will forget about and see their card not performing at its best
> 
> thanks for bringing this up, i will have revised cod 4 scores later today



The other thing I have thought of, W1zz: did you enable Crossfire mode for Crysis? It has to be done in an autoexec; by default it will only run SLI automatically.

Not saying you haven't; it might just be really bad scaling in Crysis. I know it favours Nvidia over ATI in terms of performance anyway.


----------



## W1zzard (Aug 13, 2008)

re. crysis, yep thats possible. let me look into that as well


----------



## alexp999 (Aug 13, 2008)

Taken from tweak guides:



> r_MultiGPU [0,1,2] - This option controls whether Crysis enables additional overhead for rendering on multi-GPU systems (i.e. SLI and CrossFire setups). If set to 0 - which is the optimal setting for single-GPU systems - it disables multi-GPU support; if set to 1 it enables multi-GPU support for both SLI and Crossfire; if set to 2 it attempts to auto-detect if a multi-GPU setup is present (system).



From experience, it defaults to 2.

You have to put the command:

r_MultiGPU=1

into an autoexec.cfg file (make one) in the main Crysis directory to enable Crossfire.


So stupid; anyone would think they didn't want multiple ATI GPUs to benefit in Crysis, having to mess around with CVars when SLI is enabled automatically. Not that the Nvidia logo at startup has anything to do with it...
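For anyone who'd rather script the step above than edit the file by hand, here's a minimal sketch. The helper name `write_multigpu_cfg` and the idea of parameterizing the path are mine; the file name (autoexec.cfg), its location, and the r_MultiGPU values come from the tweak-guide quote above.

```python
# Sketch: write an autoexec.cfg that forces Crysis multi-GPU rendering on.
# write_multigpu_cfg is a hypothetical helper, not part of any official tool.
from pathlib import Path

def write_multigpu_cfg(crysis_dir):
    """Create (or overwrite) autoexec.cfg with the multi-GPU CVar set."""
    cfg = Path(crysis_dir) / "autoexec.cfg"
    # r_MultiGPU: 0 = off, 1 = force on (SLI and CrossFire), 2 = auto-detect
    cfg.write_text("r_MultiGPU=1\n")
    return cfg

# Example (the poster's install path; adjust for your own system):
# write_multigpu_cfg(r"C:\Program Files (x86)\Electronic Arts\Crytek\Crysis")
```

Delete the file (or set r_MultiGPU=0) on a single-GPU system, since the forced multi-GPU overhead costs performance there.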


----------



## alexp999 (Aug 13, 2008)

For anyone who wants it, I have made an autoexec file to enable Crossfire in Crysis (it does nothing else).

Simply extract the cfg file and put it in the main Crysis directory, i.e. for me it goes in:

C:\Program Files (x86)\Electronic Arts\Crytek\Crysis



NB

Do not use this if you do not have Crossfire or SLI, as it adds unneeded overhead and will reduce performance.


----------



## Ourasi (Aug 13, 2008)

W1zzard said:


> let me look into this
> 
> edit: seems the multi-gpu configuration setting in cod 4 was not set to enabled. if i didnt notice it, take a guess at how many end-users will forget about and see their card not performing at its best
> 
> thanks for bringing this up, i will have revised cod 4 scores later today



No problem, mate!

For some (a lot, actually), performance in this game will break or seal the deal on a new purchase. So getting this one right, with twice the performance of Nvidia's best, will certainly give a huge boost to the HD4870x2, since this game is so popular and good FPS and visuals are very important to its players..

This will complete an otherwise very good review, and will certainly boost the average performance lead of the HD4870x2 by quite a lot, actually..


----------



## btarunr (Aug 13, 2008)

If it's not enabled by default (Crysis Crossfire), why use it? The 100s of people who buy the R700 might not be versed in "changing a resource file to make it work like it should". Besides, it has to be mentioned in the test setup part of the review.


----------



## Tatty_One (Aug 13, 2008)

btarunr said:


> If it's not enabled by default (Crysis Crossfire), why use it? The 100s of people who buy the R700 might not be versed in "changing a resource file to make it work like it should". Besides, it has to be mentioned in the test setup part of the review.



Yes, that was my first thought: 90% of gamers/PC users are not going to change, know how to change, or even understand the need to change the config file. But then I thought, well, if only around 4% of consumers use XFire/SLI, they are likely to be enthusiasts anyway, and doing this will give a better reflection of what the X2 can achieve in the game, certainly for TPU members in any case.


----------



## W1zzard (Aug 13, 2008)

Ourasi said:


> So getting this one right, with twice the performance of Nvidias best



i'm definitely not seeing that. maybe its the timedemo i am using .. some people will now say nvidia paid me to record an nvidia biased timedemo.


----------



## alexp999 (Aug 13, 2008)

btarunr said:


> If it's not enabled by default (Crysis Crossfire), why use it? The 100s of people who buy R700 might not be verged with "changing a resource file to make it work like it should". Besides, it has to be mentioned in the test setup part of the review.



I know it's bad; games like GRAW have an option to turn it on as part of the game settings. Crysis should be the same. Mind you, they added a Vsync button in one of the patches; to remain unbiased and fair, they need to do the same with a Crossfire button.


----------



## Tatty_One (Aug 13, 2008)

W1zzard said:


> i'm definitely not seeing that. maybe its the timedemo i am using .. some people will now say nvidia paid me to record an nvidia biased timedemo.


----------



## W1zzard (Aug 13, 2008)

nvidia is not even sending samples to us by the way, so stop with those accusations.

looks like the x2 review driver gives some gains in cod 4 for all rv770 cards .. the new graphs include the 4870 non-x2 with this driver


----------



## W1zzard (Aug 13, 2008)

new cod4 graphs are up .. will be a few mins for them to go through the cache


----------



## alexp999 (Aug 13, 2008)

W1zzard said:


> new cod4 graphs are up .. will be a few mins for them to go through the cache



Will you be doing more Crysis runs too? Or are you sticking to the

"I shouldn't have to change CVars in a game"

stance? Kinda bad on Crysis' part; I might write to them and see if they can include it in the next patch.

A Crossfire button in an Nvidia-sponsored game!


----------



## W1zzard (Aug 13, 2008)

i'm looking into crysis right now, now that i'm done with cod 4.

edit: if crysis says "MGPU" in the screen overlay does that mean multigpu is enabled?


----------



## newconroer (Aug 13, 2008)

p_o_s_pc said:


> nice review and and a long one. looks to be a killer card but i won't be getting one i don't have a use for that much power




You and 95% of the rest of consumers. 


That's the situational crux of cards like the 280 and the X2. They're just not needed.

However, if the 280 isn't needed, then the X2 is doubly not needed.

At an extra hundred dollars, twice the heat, and a hundred more watts, the X2 and the phrase 'price/performance' should never EVER be in the same sentence.


It's a revolving door; we're going round and round. Cards like the 280, the X2, even the 4870/260 in some situations, are far more than we need, except for one game: Crysis. We can't keep chasing this elusive goal forever. And not everyone even likes Crysis, so...

What we're left with is overkill horsepower cards that don't live up to their expectations, whose prices are 'questionable', and so on.

The X2 doesn't 'dominate' anything; it doesn't even win 100% of the time.


So, despite all the conflicting reviews, let's say the X2 obliterated Crysis.

2560 res, 16af, 16aa, transparency aa, Very High, etc. 60+ fps solid!

WHO THE FUKK CARES anymore?

Are we seriously going to weigh the price, technical, architectural and efficiency value of a card on its performance in one 3D application? An application that many have stated is 'coded poorly', etc.?



All I know is, cards are getting huge, they're not changing their architecture and they're forcing us into one of two product choices.

A) A large plethora of products ranging nearly the full performance spectrum; with enough choices to make you confused. Yet at the end of the day, 80+% of them will run your 3d applications without a problem.

or

B) Self-proclaimed 'high end' products that do take you to the next level, so to speak, but are still tied down by the basic limitations of the architecture of both the GPUs and other PC components. Ultimately it's like running out once a year and buying a bigger motor for your car while only making slight adjustments here and there. The main thing is, you have a bulk increase in power. Yet the silly thing is that your only purpose in doing so is to try to break some old 1/4-mile record at your local dragstrip, even though, you know, it will be years before you do, and by that time you won't care anymore.


That's what GPUs at the high end feel like. They serve very little purpose for real-world applications, except one. 



I just can't wait until physics on GPUs becomes a full-fledged industry standard, and they can start shrinking GPUs to the point of 100% integration *cough* Larrabee *cough* 



Much praise to ATi for the accomplishment of the X2, but unfortunately, like the 3870 X2, it's far from impressive.


EDIT: Ignore me. I'm a silly heart..a dreamer..and I've been watching too much "Uncle Buck!"


----------



## alexp999 (Aug 13, 2008)

W1zzard said:


> if crysis says "MGPU" in the screen overlay does that mean multigpu is enabled?



Yes, look forward to seeing the new benches.


----------



## W1zzard (Aug 13, 2008)

alexp999 said:


> Yes, look forward to seeing the new benches.



then that's how the current numbers were obtained. i will rerun to be sure, but i dont expect any changes


----------



## Ourasi (Aug 13, 2008)

W1zzard said:


> new cod4 graphs are up .. will be a few mins for them to go through the cache



Your timedemo is actually faster than guru3d's, by 10 fps on the HD4870, and the difference between the GTX280 and HD4870 is also about the same. What you are not getting is Crossfire scaling on the same level as the other review sites: they are at ca. 1.8x, while you have 1.4x.

This might be down to your motherboard; it's most likely just a driver flaw, as it seems Crossfire needs to be written to support it, or it just doesn't scale as well on it.

I'm a GTX280 owner myself, and a reviewer too, and I have no thoughts at all about you being biased towards Nvidia, or anyone else for that matter. I'm just pointing out that you are not getting the average scaling; the reasons for that, as stated above, are most likely the combination of P35/CFX/drivers. I've tested 2x HD4870 myself and had scaling of above 1.8x on an X38 motherboard, so something is not right with P35 and this HD4870x2. I've also tested the HD3870x2 on a P35, and it also lacked scaling compared to X38, though not all games scaled badly, just some, like COD4.


----------



## alexp999 (Aug 13, 2008)

W1zzard said:


> then that's how the current numbers were obtained. i will rerun to be sure, but i dont expect any changes



Really? Weird; unless ATI have worked that out and there is a fix in their drivers for it...?

In which case, if the numbers are the same, Crysis has appalling scaling. Even though it is a "Works best on Nvidia" game, you would expect a bigger improvement than that when running both GPUs.


----------



## W1zzard (Aug 13, 2008)

i forced mgpu to off now, lets see if there is any difference

edit: no major difference between mgpu forced on and off


----------



## W1zzard (Aug 13, 2008)

Ourasi said:


> This might be down to your motherboard, and it's most likely just a driverflaw as it seems like Crossfire need to be written to support it, or that it just not scale as good on it.



then ati fails even more. they say the x2 cards are supposed to work best on any chipset, and ati themselves had no complaints about the p35 chipset i'm using. also i dont see any relation between chipset and rendering performance unless we are talking hypermemory or other things that actually use the pcie bus

tech-report doesnt see 1.8x scaling either in 1920x1200


----------



## alexp999 (Aug 13, 2008)

W1zzard said:


> i forced mgpu to off now, lets see if there is any difference
> 
> edit: no major difference between mgpu forced on and off



So basically Crysis sees no real benefit from a 4870X2 over a single 4870.

That's why I ended up getting a GTX260 over a 9800GX2. Dual GPU is great when it works, but a waste of money when it doesn't. That's why I want Nvidia's next graphics card to be an even better single GPU, not two PCBs slapped together again.

Thanks for the reviews W1zz!


----------



## Black Hades (Aug 13, 2008)

newconroer said:


> You and 95% of the rest of consumers.
> 
> 
> That's the situational crux of cards like the 280 and the X2. They're just not needed.
> ...



Well, I guess you got your 280, and once the euphoria dispersed you became jaded.

Great: one teraflop, then 2, then 5, and so on.. who really cares about that? I enjoy old games like Arcanum and Fallout 100 times more than I did playing Crysis. And those games run on an 800 MHz processor, with integrated graphics even...

Regarding your car analogy: we now have the 300 km/h cars, great, but we lack the highways to go that fast.


----------



## LiveOrDie (Aug 13, 2008)

Ourasi said:


> Your timedemo is actually faster then guru3d's, by 10 fps on HD4870, and the difference between GTX280 and HD4870 is also about the same. What you are not getting, is Crossfire scaling on the same level as the other review sites, they are at ca. 1.8x, while you have 1.4x.
> 
> This might be down to your motherboard, and it's most likely just a driverflaw as it seems like Crossfire need to be written to support it, or that it just not scale as good on it.
> 
> I'm a GTX280 owner myself, and as a reviewer to, I have no thoughts at all about you being biased towards Nvidia, or any other for that matter. I'm just pointing out that you are not getting the average scaling, the reasons for that, as stated above, is most likely the combination of P35/CFX/Drivers. I've tested 2xHD4870 myself, and had a scaling of above 1.8x on a X38 motherboard, so somthing is not right with P35 and this HD4870x2. I've also tested HD3870x2 on a P35, and it also lacked scaling compared to x38, but not all games scaled bad, just some like COD4.



I've looked at a few reviews and all are about the same. Look at this one, done with a Gigabyte X48-DQ6: http://www.overclockersclub.com/reviews/sapphire_4870x2/6.htm The HD 4870 X2 was just hype.


----------



## mdm-adph (Aug 13, 2008)

candle_86 said:


> As long as it remains at its current price its a horrid price/preformance ratio. Also the GTX280 55nm that is supposed to be clocked significantly higher will likly topple it, the GTX280 isn't to far behind in all honesty from the R700, so ATI has some gloating room for now, anyone rememeber what Nvidia does?
> 
> x1800XT -> 7800GTX 512 -> x1900XTX -> 7900GTX -> x1950XTX -> 7950GX2
> 
> ...



I just love how your explanation of why the 4870 X2 isn't a great card is to use a card that hasn't even been made yet (GTX280 55nm).  If we're going to play that game, then I'll go ahead and say the GTX280 55nm is crap because of the HD 5880 X4.

And seriously -- seeing as how the 55nm 9800GTX+ was a colossal *snore*, I don't see why a 55nm GTX280 would be that much different than a 65nm one. :shadedshu


----------



## Ourasi (Aug 13, 2008)

*Techpowerup 1920x1200 4xAA/16xAF*
74.0-98.4 COD4 33%
84.5-119.6 Call of Juarez 42%
262.6-337.8 Company Of Heroes 29%
37.9-34.2 Crysis -11%
104.4-137.7 Enemy Territory 32%
268.9-252.8 FarCry -6%
122.0-192.0 F.E.A.R. 58%
144.8-203.0 Prey 40%
54.1-161.6 Quake4 199%
143.3-239.5 Splinter Cell 67%
63.9-104.6 S.T.A.L.K.E.R. 64%
61.8-58.5 Supreme Commander -5%
73.1-74.1 Team Fortress 2 1%
132.6-151.9 UT3 15%
52.0-55.0 World In Conflict 6%

The HD4870x2 is on average 32% faster than the GTX280 at 1920x1200 with AA/AF, according to your own numbers. What was the basis for the 14% you have?

*Guru3d 1920x1200 4XAA/16XAF*
62.0-117.0 COD4 89%
50.0-54.0 Frontline: Fuel of war 8%
63.0-102.0 S.T.A.L.K.E.R. 63%
103.0-159.0 F.E.A.R. 62%
99.0-84.0 GRAW2 -18%

Guru3d rates the HD4870x2 as on average 37% faster than the GTX280.

*Anandtech 1920x1200 4xAA 16xAF*
60.2-94.2 racedriver 56%
33.2-56.3 AoC 70%
51.9-76.0 Oblivion 47%
99.0-139.8 ET:wars 41%
35.8-39.8  Crysis 11%

Anandtech rates the HD4870x2 as on average 45% faster than the GTX280 at 1920x1200 4xAA 16xAF.
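The per-game percentages in these lists are just (X2 fps ÷ GTX 280 fps − 1) × 100. Here's a small sketch of that math using four of the TPU 1920x1200 numbers quoted above; the plain arithmetic mean at the end is my assumption about how an overall figure might be blended, and it shows how one outlier (Quake 4) can skew such an average.

```python
# Per-game X2-vs-GTX280 speedup, using four of the TPU 1920x1200 results
# quoted above. The arithmetic-mean summary is an assumption about how an
# overall number might be computed, not TPU's actual method.

results = {                 # game: (GTX 280 fps, HD 4870 X2 fps)
    "COD4":     (74.0, 98.4),
    "Crysis":   (37.9, 34.2),
    "F.E.A.R.": (122.0, 192.0),
    "Quake 4":  (54.1, 161.6),
}

def speedup_pct(base, x2):
    """Percent by which the X2 beats the single-card baseline."""
    return (x2 / base - 1.0) * 100.0

for game, (gtx, x2) in results.items():
    print(f"{game}: {speedup_pct(gtx, x2):+.0f}%")

mean = sum(speedup_pct(b, x) for b, x in results.values()) / len(results)
print(f"arithmetic mean over these games: {mean:+.0f}%")
```

Whether you average the percentages or the raw fps ratios (or drop outliers) changes the headline number quite a bit, which is one reason different sites quote different overall leads.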


----------



## W1zzard (Aug 13, 2008)

Ourasi said:


> What was the basis for the 14% you have?



thats the average over all benchmarks


----------



## Ourasi (Aug 13, 2008)

W1zzard said:


> thats the average over all benchmarks



Wouldn't it be better to have a % number per resolution? Slapping it all together does not tell the whole story, IMHO. I imagine people would find it interesting to know that the higher the resolution, the bigger the difference between the HD4870x2 and the GTX280. After all, this is a high-end part, and running it at anything below 1680x1050 should be considered a crime........

The relative performance and performance per watt/$ charts would look very different if you applied them to each resolution tested. A lot of work, but worth it, methinks
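The per-resolution breakdown being asked for is easy to compute once you have per-game fps pairs grouped by resolution; a sketch with invented numbers (the resolutions and fps values below are illustrative only, not TPU's data):

```python
# Illustrative sketch of averaging X2-vs-280 speedup per resolution rather
# than as one blended number. All fps values below are invented examples.
from collections import defaultdict

bench = {   # (resolution, game): (GTX 280 fps, HD 4870 X2 fps)
    ("1280x1024", "COD4"):     (110.0, 120.0),
    ("1280x1024", "F.E.A.R."): (160.0, 180.0),
    ("1920x1200", "COD4"):     (74.0, 98.4),
    ("1920x1200", "F.E.A.R."): (122.0, 192.0),
}

per_res = defaultdict(list)
for (res, _game), (base, x2) in bench.items():
    per_res[res].append((x2 / base - 1.0) * 100.0)

for res in sorted(per_res):
    pcts = per_res[res]
    print(f"{res}: {sum(pcts) / len(pcts):+.0f}% average lead")
```

With real data this would print one average lead per resolution, making the "gap grows with resolution" claim directly visible instead of buried in one overall number.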


----------



## DarkMatter (Aug 13, 2008)

W1zzard said:


> nvidia is not even sending samples to us by the way, so stop with those accusations.



Hmm. I always wondered why you were slow reviewing some cards and fast with others, making me go out to find reviews elsewhere. Do you have to buy them, or wait until someone donates one? Come on Nvidia, give him the cards!!!



alexp999 said:


> Will you be doing more crysis ones too? Or are you sticking to the,
> 
> "I shouldnt have to change CLV's in a game"
> 
> ...



No more patches remember? Just Warhead.



Ourasi said:


> Your timedemo is actually faster then guru3d's, by 10 fps on HD4870, and the difference between GTX280 and HD4870 is also about the same. What you are not getting, is Crossfire scaling on the same level as the other review sites, they are at ca. 1.8x, while you have 1.4x.
> 
> This might be down to your motherboard, and it's most likely just a driverflaw as it seems like Crossfire need to be written to support it, or that it just not scale as good on it.
> 
> I'm a GTX280 owner myself, and as a reviewer to, I have no thoughts at all about you being biased towards Nvidia, or any other for that matter. I'm just pointing out that you are not getting the average scaling, the reasons for that, as stated above, is most likely the combination of P35/CFX/Drivers. I've tested 2xHD4870 myself, and had a scaling of above 1.8x on a X38 motherboard, so somthing is not right with P35 and this HD4870x2. I've also tested HD3870x2 on a P35, and it also lacked scaling compared to x38, but not all games scaled bad, just some like COD4.



Might be the CPU. I think the latest ATI cards require, or get more benefit from, a faster CPU or a quad than Nvidia's cards do. It's nothing I can confirm or have tested, just something I figured out looking at the latest reviews. 

It's something some reviewer could test (come on, W1zz). At least I'm very interested in the results. It would be interesting to find out whether the GPU architectures are so different that they have very different CPU requirements; that would definitely be useful for end users. 

It would also demonstrate that the reviews differ because of that, and not because of any kind of bias. Pretty much everyone on TPU knows the system used in the bench plays a big role, but until the HD4000/GTX cards I never had the impression that the influence of the system could differ much between architectures; it would just act like a constant multiplier for all cards. Knowing both architectures, it is logical for ATI cards to have a bigger driver overhead, but I never found it to make a difference in the past. Now I think there could be something to it. It's how I justify the differences between reviews, and I'd love to see it confirmed.


----------



## btarunr (Aug 13, 2008)

DarkMatter said:


> Hmm. I always wondered why you were slow reviewing some cards and fast with some others, making me go out to find reviews elsewhere. You have to buy them or wait until someone donates one? Come on Nvidia, give him the cards!!!



NVIDIA doesn't, Zotac does 

Just a friendly request, make your next NV card a Zotac.


----------



## Urbklr (Aug 13, 2008)

DarkMatter said:


> No more patches remember? Just Warhead.



Nah, there is a guy working on a new Crysis patch on his own.


----------



## W1zzard (Aug 13, 2008)

Ourasi said:


> Wouldn't it be better to have an % number pr. resolution? Slapping it all together does not tell the whole story IMHO. I imagine people would find it interesting to know that the higher the resolution, the bigger the difference between HD4870x2 vs. GTX280. After all this is a high-end part, and running this at anything below 1680x1050 should be considered a crime........
> 
> The Relative performance and performance per watt/$ would look very different if you applied it to each resolution tested. Alot of work, but worth it me thinks



I have been thinking about that, but A LOT of people want to look at one graph and know it all. Maybe I could make four graphs: one each for perf summary, perf/$ and perf/W, and one additional graph summarizing the four resolutions.

I wrote myself some nifty programs to do all the work; otherwise I'd spend all my life just processing the numbers. We are looking at over 1500 individual benchmark runs displayed in this review. Yep, that many! Would you have thought that? For example, if you were drawing our graphs by hand and it took you 10 seconds per bar, you would spend over 4 hours just making the graphs.
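The arithmetic checks out; a quick sketch using only the two figures quoted in the post (1,500 bars, 10 seconds each):

```python
# Rough check of the hand-drawn-graphs estimate from the post above.
bars = 1500           # individual benchmark runs shown in the review
seconds_per_bar = 10  # assumed time to draw one bar by hand

total_seconds = bars * seconds_per_bar
total_hours = total_seconds / 3600
print(f"{total_hours:.2f} hours")  # -> 4.17 hours, i.e. "over 4 hours"
```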


----------



## candle_86 (Aug 13, 2008)

mdm-adph said:


> I just love how your explanation of why the 4870 X2 isn't a great card is to use a card that hasn't even been made yet (GTX280 55nm).  If we're going to play that game, then I'll go ahead and say the GTX280 55nm is crap because of the HD 5880 X4.
> 
> And seriously -- seeing as how the 55nm 9800GTX+ was a colossal *snore*, I don't see why a 55nm GTX280 would be that much different than a 65nm one. :shadedshu



It already isn't a great card; the price/performance ratio blows. As for the 9800GTX+ being a snore, that's neither here nor there; the 55 nm G200 is supposed to be clocked higher, and the GTX280 isn't that far behind in the first place. All it takes is a factory overclock to claim the throne again. The HD4870X2 is too hot, consumes too much power, and doesn't live up to the hype. It's also not significantly faster than what's available that won't superheat your house. The fact is, in under a month you'd pay for this card twice over, simply because of the power draw and the AC bill to keep your house cool.


----------



## Zehnsucht (Aug 13, 2008)

W1zzard said:


> i have been thinking about that, but A LOT of people want to look at one graph and know it all. maybe i could make 4 graphs each for perf summary, perf/$, perf/w and one additional summarizing the 4 resolutions.
> 
> i wrote me some nifty programs to do all the work, otherwise i'd spend all my life just processing the numbers. we are looking at over 1500 individual benchmark runs displayed in this review. yep that many! would you have thought that? for example if you were drawing our graphs by hand and it takes you 10 seconds per bar you would spend over 4 hours just to make the graphs



I would certainly want a graph summarizing resolutions. After all, for most people it is not important that card X performs extremely badly at 2560xxxx resolutions if they only play at 1680.


----------



## Tatty_One (Aug 13, 2008)

Ourasi said:


> Wouldn't it be better to have an % number pr. resolution? Slapping it all together does not tell the whole story IMHO. I imagine people would find it interesting to know that the higher the resolution, the bigger the difference between HD4870x2 vs. GTX280. After all this is a high-end part, and running this at anything below 1680x1050 should be considered a crime........
> 
> The Relative performance and performance per watt/$ would look very different if you applied it to each resolution tested. Alot of work, but worth it me thinks



If that's W1zzard's average across something like 18 benchmarks, is that not more informative than a higher average across, say, just 6 benchmarks?

I agree with your comments about resolution; however, there are some in these forums who have ordered the card and have 19-inch screens, so I think the masses need to be catered for as well.


----------



## AddSub (Aug 13, 2008)

Zehnsucht, not everyone has a fixed-resolution LCD. My CRT Trinitrons scale just fine up to 2048x1536.

1500+ individual benchmarks, W1zzard? Now that's the professionalism I'm talking about, and the reason I rely on TPU reviews when it comes to my hardware purchases.


----------



## Ourasi (Aug 13, 2008)

W1zzard said:


> i have been thinking about that, but A LOT of people want to look at one graph and know it all. maybe i could make 4 graphs each for perf summary, perf/$, perf/w and one additional summarizing the 4 resolutions.
> 
> i wrote me some nifty programs to do all the work, otherwise i'd spend all my life just processing the numbers. we are looking at over 1500 individual benchmark runs displayed in this review. yep that many! would you have thought that? for example if you were drawing our graphs by hand and it takes you 10 seconds per bar you would spend over 4 hours just to make the graphs



Well, one graph would be sweet if it told the absolute truth, but it does not. One good example is the 9800GTX+ vs. the HD4870x2 in F.E.A.R. at 1024x768: here the HD4870x2 is only 50% faster at three times the price, but when you move to 1920x1200 it's close to 300% faster. It would completely change the perf per watt/$ picture if you excluded 1024x768/1280x1024, or made one for 1920x1200. The average performance advantage of the HD4870x2 over the GTX 280 more than doubles as you move up the resolution ladder.
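The point about per-resolution perf/$ can be illustrated with a toy calculation. All the prices and FPS numbers below are made up for illustration, loosely shaped by the F.E.A.R. example above (the X2 at roughly three times the price, ~1.5x faster at 1024x768 but ~3x faster at 1920x1200):

```python
# Hypothetical numbers only; not figures from the review.
prices = {"9800GTX+": 180, "HD4870X2": 550}  # assumed street prices in $

fps = {
    "1024x768":  {"9800GTX+": 200, "HD4870X2": 300},  # CPU-limited: small gap
    "1920x1200": {"9800GTX+": 40,  "HD4870X2": 120},  # GPU-limited: big gap
}

for res, results in fps.items():
    for card, frames in results.items():
        print(f"{res:>9} {card:>9}: {frames / prices[card]:.3f} fps per $")
```

At the CPU-limited resolution the cheaper card wins perf/$ by a wide margin; at 1920x1200 the gap nearly closes, which is exactly why a single all-resolution average can mislead.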


----------



## Ourasi (Aug 13, 2008)

Tatty_One said:


> If thats w1zzards average across something like 18 benchmarks is that not more informative than a higher average across say just 6 benchmarks?
> 
> I agree on your comments about resolution, however there are some in these forums who have ordered the card who have 19 inch screens so the masses need to be catered for also I think.



I don't think the masses with 19" monitors are lining up to buy the HD4870x2 or GTX280. Some might, but those should not count 

His average includes 1024x768 and 1280x1024, and that's half of his resolutions. And when half of the benches hardly show any difference between the cards at all, the numbers get meaningless.


----------



## Urbklr (Aug 13, 2008)

candle_86 said:


> It already isnt a great card, the price/preformance ratio blows. As for the 9800GTX+ being a snore thats neither here nor there, the G200 55nm is supposed to be clocked higher, and the GTX280 isn't that far away in the first place. All it takes is an overclock from the factory to claim the throne again. The HD4870X2 is to hot, consumes to much power, and doesn't live up to the hype. Its also not signifcantly faster than whats availble that won't super heat your house. The fact is in under a month you'd pay for this card twice simply because of the power draw and the AC bill to keep your house cool



The 9800GTX+ was clocked higher; how much higher do you think they will clock the "oh so mighty" 55 nm GTX280? It probably won't take the performance crown. People like me would not buy a GTX280 but would go with an HD4870X2 instead, because we like ATi and the card is faster... even if it's only by a small amount. You say the 4870X2 is hot and consumes a lot of power; what about the GTX series? They aren't exactly the coldest-running, lowest-power cards either. Stop trolling AMD threads... and open an Nvidia fan club.


----------



## candle_86 (Aug 13, 2008)

You're right, they are not the coolest, but they run cooler than even the 4850 does. You can buy the HD4870 X2, but most users will go with bang for buck. And by your own admission you wouldn't buy an Nvidia card, which proves my point: it's the ATI nutjobs that will get this card. I use a 3850 in my computer; it's a decent little card, and when I bought it, it was good price/performance.


----------



## cdawall (Aug 13, 2008)

Well, it's nice that this has become a fanboi load of nonsense.

Look at the scores: in most games the 4870X2 beats the GTX280. Done, end of story.

Wait, here, let me go through the NV fanboi complaints.

*uses more power*

response from me:
You're dropping $550 on a video card; I doubt you will be using a shitty PSU.

*doesn't scale at low res*

response from me:
Why on earth are you using a low-res monitor?

*GTX280 55nm beats it*

Umm, no, I bet it won't until it gets clocked higher. Funny thing about die shrinks: on their own they don't change performance, hence a 130 nm P4 @ 3 GHz with the same cache, FSB, etc. performs the same as a 65 nm one @ 3 GHz.


Did I miss any? If I did, just say so.


----------



## Ourasi (Aug 13, 2008)

candle_86 said:


> It already isnt a great card, the price/preformance ratio blows. As for the 9800GTX+ being a snore thats neither here nor there, the G200 55nm is supposed to be clocked higher, and the GTX280 isn't that far away in the first place. All it takes is an overclock from the factory to claim the throne again. The HD4870X2 is to hot, consumes to much power, and doesn't live up to the hype. Its also not signifcantly faster than whats availble that won't super heat your house. The fact is in under a month you'd pay for this card twice simply because of the power draw and the AC bill to keep your house cool



The HD4870x2 is 35-45% faster than a GTX280 at 1920x1200 4xAA 16xAF on average, and even faster at higher resolutions. To get the throne back, Nvidia would need a lot more than a clock bump from 55 nm; a lot more, and way beyond what a shrink can provide.

If you gamed 24 hours a day for a whole month, the HD4870x2 would cost you less than $15 extra on the power bill, so with normal gaming the extra cost is a third of that at most. And the heat it produces is about the same as five 60 W light bulbs.
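The power-bill claim is easy to sanity-check. The extra draw and the electricity price below are assumptions of mine, not figures from the review:

```python
# Sanity check of the "$15/month at 24h/day" claim above.
extra_watts = 150     # assumed extra draw vs. a GTX 280 while gaming
price_per_kwh = 0.12  # assumed electricity price in $/kWh
hours = 24 * 30       # gaming around the clock for a month

extra_cost = extra_watts / 1000 * hours * price_per_kwh
print(f"${extra_cost:.2f} extra on the power bill")  # -> $12.96
```

Under those assumptions the worst case lands under $15, consistent with the post.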


----------



## mdm-adph (Aug 13, 2008)

candle_86 said:


> The HD4870X2 is to hot, consumes to much power, and doesn't live up to the hype. Its also not signifcantly faster than whats availble that won't super heat your house. The fact is in under a month you'd pay for this card twice simply because of the power draw and the AC bill to keep your house cool



You know, I recall people saying the *exact* same thing about the GTX 280 when it came out, yet lots of people like you defended _it_ back then...


----------



## Polaris573 (Aug 13, 2008)

Stay on topic and leave personal attacks out of this please.


----------



## Megasty (Aug 13, 2008)

I do have to admit that when I look at reviews, I usually only look at the numbers for the res I use the most (1920x1200), then I go back to look at the others. A card like this is completely pointless at 1680x1050 & below, but it owns the world at 1920x1200 & above. That's what makes the difference to me. Some people just want the best, fastest, biggest, hottest, bloodsucki-est thing that's out there at any given time. I've come to categorize myself as one of those people


----------



## Tatty_One (Aug 13, 2008)

Ourasi said:


> I don't think the masses with 19" monitors are lining up to buy HD4870x2 or GTX280, some might, but those should not count
> 
> His average include 1024x768 and 1280x1024, and thats half of his resolutins. And when half of the benches hardly showes any difference between the cards at all, the numbers get meaningless.



I agree, but we disagree on the resolutions. I don't think a bench should be eliminated just because we guess that the majority of X2 owners will be using 16xx-and-above monitors, when it is a plain fact that the most common resolutions remain those of 17- and 19-inch owners. Now, WE know that it might be fairly foolish to pay this much for a card to game only at those lower resolutions, but not all the people out there who might buy this card are hardware enthusiasts; whilst I would agree a large proportion of X2 buyers will be enthusiasts, not all of them will be.
IMO reviews are not just for enthusiasts... if they were, they would only be reaching 5-10% of users.


----------



## Tatty_One (Aug 13, 2008)

cdawall said:


> well its nice that this has become a fanboi load of nonsense.
> 
> look at the scores in most games the 4870X2 beats the GTX280 done end of story.
> 
> ...



perhaps the x2 is a prettier colour?


----------



## mandelore (Aug 13, 2008)

Wow, that is one killer card. For God's sake, why can't I buy one?? Bills insist on eating away at my cash supplies.

Nice review!!!

Some of the bashing the 280 takes is quite insane


----------



## trt740 (Aug 13, 2008)

Well, I have a GTX 280, and at max settings at 1400x900 widescreen (not a super high resolution) on a 19-inch monitor, with all settings at max quality in 3DMark06 (8x sample, quality 2, texture set to anisotropic filtering, level 16x), none of my other cards, including my 3870x2, ran close to as well as this card does. Set your system to those settings and tell me a GTX 280 or 4870x2 won't benefit a user such as myself. I am going to get a higher-resolution monitor for Xmas, but games look great on this one.


----------



## Ourasi (Aug 13, 2008)

Tatty_One said:


> I agree, but we disagree on the resolutions, I dont think a bench should be eliminated because we guess that the majority of x2 owners will be using 16xx and above monitors when it is a plain fact that the most common resolution remains with 17 and 19inch owners, now WE know that it might be fairly foolish to pay this much for a card to game only at those lower resolutions but do all the people out there who might just buy this card,,,,,they are not all hardware enthusiasts and whilst I would agree a large proportion of x2 buyers will be enthusiasts.....not all of them will be.
> IMO reviews are not just for enthusiasts.....if they were they would only be reaching 5-10% of users.



This is a high-end product, which only stretches its legs at high resolutions. It is not meant for the masses, and will mostly end up in enthusiasts' PCs. If you want to accurately review such a high-end product, using 1024x768 0xAA 0xAF is pointless, as it becomes a CPU bench in 99.9% of games, and you have to overclock the CPU like a madman to even make the card break a sweat at 1280x1024 2xAA 8xAF. The HD4870x2 is only 50% faster than the 9800GTX+ in F.E.A.R. at 1024x768, while at 1920x1200 4xAA 16xAF it's 300% faster. 

Using such low resolutions in this review is like testing the Bugatti Veyron supercar over speed bumps while carpooling 2.5 kids to school, and saying it's only 50% faster on average than a hybrid while complaining that the Veyron uses a lot of gas...

Reviewing a product in the extreme high end such as the HD4870x2, using low, CPU-limited resolutions for half the benches, and concluding it's only 14% faster than the GTX280 based on those numbers, is inaccurate at best.


----------



## Ourasi (Aug 13, 2008)

trt740 said:


> Well I have a 280 gtx and at max setting at 1400x900 widescreen format (not a super high resolution) 19 inch monitor and with all setting at max quality, in 3dmark06, 8x sample, 2 quality and  texture set to aniso filtering, level 16x, all my other cards including my 3870x2 didn't run close to as well as this card does. Set your system to those setting and tell me a 280 gtx or 4870x2 won't benefit a user such as my self. I am going to get a little higher resolution monitor for Xmas but games look great on this one.



You are just underlining my points: these cards are not meant for the 1024x768 0xAA 0xAF / 1280x1024 2xAA 8xAF settings used in this review. For those resolutions to have any meaning at all on these high-end products, you'd have to slap on 16xAA 16xAF, but he did not. And thus it's a CPU bench, not a graphics bench....

If you played your games at 1400x900 0xAA 0xAF with a GTX280, that would be a horrific waste, borderline madness; don't you agree?


----------



## Tatty_One (Aug 14, 2008)

Ourasi said:


> This is a High-End product, witch only stretches it's legs on high resolutions. This is not meant for the masses, and will mostly end up in enthusiast's PC's. If you wan't to accurately review such a high-end product, using 1024x768 0xAA 0xAF is pointless, as it becomes a CPU bench in 99.9% of the games, and you have to overclock CPU like a mad man to even make the card break a sweat even at 1280x1024 2xAA 8xAF. The HD4870x2 is only 50% faster then 9800GTX+ in F.E.A.R. in 1024x768, while at 1920x1200 4xAA 16xAF it's 300% faster.
> 
> Using such low resolutions in this review is like testing the supercar Bugatti Veron over speedbumps while carpooling 2.5 kids to school, and saying it's only 50% faster on average compared to a hybrid while complaining that the Veron uses alot of cas....
> 
> When reviewing a product in the extreme high-end such as HD4870x2, and using low CPU limited resolutions on half the benches, and conclude it's only 14% faster then the GTX280 based on these numbers, is inaccurate at best.



Lol, I know all that, and I agree. My point was... reviews are supposed to cater for all buyers. You and I do not know how many non-enthusiasts will actually buy the card, so who are we to say which users should be included or excluded? Many reviews on many sites have included at least 1280 res, and therefore I believe it is right to show the lower resolutions. I am not suggesting for one minute that mid and high res should not be used; I just think that, where possible, as wide an angle as possible is preferable. Bottom line: there ARE members on this site with 19-inch monitors who have ordered the X2, so why shouldn't this site cater for them as well as using higher-res benches? We do agree that much of the power and cost of the card is wasted at these lower resolutions, though!


----------



## Animalpak (Aug 14, 2008)

O.T.

Sorry, I'd like to know: does anybody know whether a 1000 W power supply is enough for two HD 4870X2s in CrossFireX? 

Or do I need a 1200/1300 W power supply?


----------



## ShadowFold (Aug 14, 2008)

A good brand 1kw is fine


----------



## imperialreign (Aug 14, 2008)

ShadowFold said:


> A good brand 1kw is fine



Although you might want to ask your local power company for a firehose connection 


I gotta say, I'm surprised the card is running nearly twice as fast as a 4870. The only thing I would've liked to see in the review would've been average _min_ framerates as well. That was the pitfall of the 3870x2 versus crossfired 3870s. I'm kinda hoping to see that ATI has addressed that issue with these cards.

Either way, though... wow. It's like seeing the results of the X1900 XTX for the first time. A wallopin' and a whompin' the card has come crashing out of the gates. Trim the price up some more, and it can't be beat.


----------



## From_Nowhere (Aug 14, 2008)

Too bad ATI priced these cards at ~$550... 
Although, the 4870X2's performance is amazing


----------



## Mussels (Aug 14, 2008)

The performance per dollar graph is still my favourite part of TPU reviews.


Ourasi: you keep talking about how 1024x768 is a useless resolution. It is not. Many high-definition TVs these days come in 1360x768 resolution, which is just a widened 1024x768. 720p screens are next to impossible to find nowadays, so this is relevant to those users (myself included).

Also, just because it's high end doesn't mean squat. Assumptions are stupid; W1zzard did those tests to LEARN what resolutions these cards perform best at, and he PROVED it's not worth it at low resolution. Proof is far better than faulty logic: the Nvidia FX series of cards was 'high end' and they also ran like crap. Assuming at the time that they were for 'high res' only would have been ridiculous.


----------



## candle_86 (Aug 14, 2008)

Ourasi said:


> This is a High-End product, witch only stretches it's legs on high resolutions. This is not meant for the masses, and will mostly end up in enthusiast's PC's. If you wan't to accurately review such a high-end product, using 1024x768 0xAA 0xAF is pointless, as it becomes a CPU bench in 99.9% of the games, and you have to overclock CPU like a mad man to even make the card break a sweat even at 1280x1024 2xAA 8xAF. The HD4870x2 is only 50% faster then 9800GTX+ in F.E.A.R. in 1024x768, while at 1920x1200 4xAA 16xAF it's 300% faster.
> 
> Using such low resolutions in this review is like testing the supercar Bugatti Veron over speedbumps while carpooling 2.5 kids to school, and saying it's only 50% faster on average compared to a hybrid while complaining that the Veron uses alot of cas....
> 
> When reviewing a product in the extreme high-end such as HD4870x2, and using low CPU limited resolutions on half the benches, and conclude it's only 14% faster then the GTX280 based on these numbers, is inaccurate at best.



See, that's not true. I run a 17-inch monitor and ran Crossfire till I switched boards. My viewpoint is quite simple: buy high end at release and you won't need a new card for 12x10 for several years. My 7950GT 512 MB held up quite nicely till I got my 3870 and 3850. So people like me care about the 12x10 res. I buy cards like this for one reason: I upgrade every 1.5 years, or I like to as long as my finances are stable, and this last year they haven't been. So to me this res matters. High end isn't just for enthusiasts; it's also for the bang-for-buck guys. What works better, a $500 card every year and a half to two years, or a $300 card every year? You do the math.

My 6800GT stayed with me till I went to a 7950GT. I had to sell it for cash because I lost my job, and got a 7200GS simply to hold me over. My 6800GT replaced a 5950XT that was simply an RMA replacement for my Ti-4400. My Ti-4400 replaced my GeForce2 GTS. See where I'm going with this?


----------



## Chewy (Aug 14, 2008)

Animalpak said:


> O.T
> 
> Sorry i would know if anybody knows if a 1000 W powersupply is enough for 2 HD 4870X2 in crossfireX ?
> 
> Or i need 1200/1300 W powesupply ?




I think my PSU would handle these in CrossFire well enough; I'd guess 600 W peak? I have an Enermax 850 W DXX; it's big. Got it for about $225 with tax, so I can't complain 

Of course, if you do buy a new PSU and go with an Enermax Galaxy, you should get the 1 kW variant.


----------



## Darkrealms (Aug 14, 2008)

First *Ourasi*, thank you for keeping this on a professional level.  Second on the first page of comments W1zz already said this:


W1zzard said:


> don’t look at 1024x768 if you don’t like to see those results
> 
> no 2560x1600 because those huge displays are too expensive to buy, only 3 people each donated $10 so far


http://forums.techpowerup.com/showpost.php?p=925292&postcount=24
If he doesn't have the monitor and we (as a forum group) haven't helped him out with it then we can't really blame him for not testing all the possibilities.

If he had it then I'm sure he would be testing it.


----------



## LiveOrDie (Aug 14, 2008)

Are these Company of Heroes scores the max or the average? I'd say they're the max; too high to be the average.


----------



## W1zzard (Aug 14, 2008)

They are the average. Did you disable the FPS limit in your testing?


----------



## LiveOrDie (Aug 14, 2008)

Yes, and my average was only 191.3 FPS?


----------



## LiveOrDie (Aug 14, 2008)

This is in the older game; I'll try that. Did you use all-high settings or ultra settings? I'm guessing you had DX10 shaders off because it's on XP.


----------



## W1zzard (Aug 14, 2008)

Yeah, no DX10 shaders; I selected the highest that could be selected under XP.


----------



## LiveOrDie (Aug 14, 2008)

W1zzard said:


> yeah no dx10 shaders, i selected the highest that could be selected under xp



I set all my settings to high, not ultra, and still only got what's shown. Any ideas? Could it be Vista? Here are the settings I'm running.


----------



## W1zzard (Aug 14, 2008)

Ourasi said:


> Wouldn't it be better to have an % number pr. resolution? Slapping it all together does not tell the whole story IMHO. I imagine people would find it interesting to know that the higher the resolution, the bigger the difference between HD4870x2 vs. GTX280. After all this is a high-end part, and running this at anything below 1680x1050 should be considered a crime........
> 
> The Relative performance and performance per watt/$ would look very different if you applied it to each resolution tested. Alot of work, but worth it me thinks



There you go, sir: http://www.techpowerup.com/reviews/Powercolor/HD_4850_PCS/26.html. Thirty pages for a VGA review...

We also changed the algorithm that builds the summary percentages slightly, to be more intuitive. Please discuss the new per-resolution graphs in the PCS 4850 comments thread.


----------



## Culex (Aug 20, 2008)

What hardware was used for these tests? The TF2 and Crysis tests were absolute bogus, since I've seen the 4870x2 beat the GTX280 by 33-50% (in Crysis). And WTF with the TF2 tests, which, I might remind you, is an ATI-optimised game. THE 4870 BEATS the GTX280 in that game. And what was with that CoH test? 198 FPS? Bull! These, I must say, are the most bogus tests I have ever seen carried out. And what's with the 9800GTX beating the 4870 in 3DMark? THE 4850 beats it!!!!!! Anyway, those tests are completely illegitimate and should not be thought of as anything close to reality.


----------



## trt740 (Aug 20, 2008)

Culex said:


> What hardware was used for these tests? The TF2 and Crysis tests were absolute bogus, since I've seen the 4870x2 beat the gtx280 by 33-50% (in crysis). And wtf with the TF2 tests, which I might remind you is an ATI optiomised game. THE 4870 BEATS the GTX280 in that game, and what was with that CoH test? 198FPS? Bull! These I must say are the most bogus tests I have ever seen carried out, and what's with the 9800GTX beating the 4870 in 3dMark? THE 4850 beats it!!!!!! Anyway, those tests are completely illegitimate and should not be thought of as anything close to reality.



wow!!!!


----------



## Mussels (Aug 20, 2008)

Yeahhhh... Culex, if you read the thing, it states what hardware was used.


----------



## candle_86 (Aug 20, 2008)

Culex said:


> What hardware was used for these tests? The TF2 and Crysis tests were absolute bogus, since I've seen the 4870x2 beat the gtx280 by 33-50% (in crysis). And wtf with the TF2 tests, which I might remind you is an ATI optiomised game. THE 4870 BEATS the GTX280 in that game, and what was with that CoH test? 198FPS? Bull! These I must say are the most bogus tests I have ever seen carried out, and what's with the 9800GTX beating the 4870 in 3dMark? THE 4850 beats it!!!!!! Anyway, those tests are completely illegitimate and should not be thought of as anything close to reality.



Dude, W1zzard is one of the best hardware reviewers around. Go get a life, kid.


----------



## Megasty (Aug 20, 2008)

trt740 said:


> wow!!!!



Wow is right  I think someone just needs some attention. We need to give him some


----------



## jbunch07 (Aug 20, 2008)

candle_86 said:


> dude wizz is one of best hardware reviewers around, go get a life kid



seriously! show some respect to the Wizz!


----------



## Culex (Aug 20, 2008)

candle_86 said:


> dude wizz is one of best hardware reviewers around, go get a life kid



Um, yeah, that's the problem. I DO have a life, obviously with a little more experience than the one you live, kid. I find the results hard to stomach because every review I've seen shows that:
1. The 4850 beats the 9800GTX+ (the opposite was shown in the review).
2. The 4870 beats the GTX260 (the opposite was shown in the review), especially in HL2 and other Source games, for it outperforms the GTX280 in these as well (again, the opposite was shown in the review).
3. How can Opposing Fronts score 199 FPS when it's frame-capped at 60?!?
4. Regardless of whether the 4870x2 has 1 or 2 GB of GDDR5, it should cream not only a GTX280 but definitely a 4870 (it makes sense: two 4870 GPUs on one die would beat one 4870 GPU NO MATTER WHAT!).
5. Do some research before you speak, otherwise you might understate your intellect in failing to do so.


----------



## btarunr (Aug 20, 2008)

On a precautionary note, please leave W1zzard to respond to these comments by Culex (if he feels he should). Let's not spawn dozens of posts that basically mean nothing more than "how dare you?".


----------



## alexp999 (Aug 20, 2008)

Culex said:


> Um, yeah, that's the problem. I DO have a life, obviously with a little more experience than the one you live, kid. I find the results hard to stomach because every review I've seen shows that 1. The 4850 beats the 9800GTX+ (The opposite was shown on the review).
> 2. The 4870 beats the GTX260 (The opposite was shown in the review), especially in HL2 and other source games, for it out performs the gtx280 in these as well (again, the opposite was shown in the review).
> 3. How can Opposing fronts score 199 fps when it's frame capped at 60?!?
> 4. Regardless of whether the 4870x2 has 1 or 2GB of GDDR5, it should cream not only a GTX280, but definitely a 4870 (It makes sense, 2x4870 GPUs on one diode would beat 1x4870 GPU NO MATTER WHAT!).
> 5. Do some research before you speak, otherwise you might understate your intellect in failing to do so



You have to remember some games have terrible Crossfire scaling, same as SLI, and same as dual and quad CPUs. I can guarantee that across all games the 4870X2 will NEVER be twice a single 4870, due to the scaling. And as for Opposing Fronts, I have never heard of or seen a single game whose FPS limit cannot be disabled in some shape or form. Are you sure you are not just running with vsync on?
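A toy model of what "scaling" means here; the factors below are invented for illustration, not measured:

```python
# Effective FPS of a dual-GPU card modeled as single-GPU FPS times a
# per-game scaling factor (1.0 = no benefit, 2.0 = perfect doubling).
single_fps = 60.0
scaling = {"title that scales well": 1.8, "title that scales poorly": 1.1}

for title, factor in scaling.items():
    print(f"{title}: {single_fps * factor:.0f} fps effective")
```

Because the factor never reaches 2.0 in practice, an average over many titles always sits well below double the single-card result.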


----------



## Culex (Aug 20, 2008)

btarunr said:


> On a precautionary note, please leave W1zzard to respond to these comments by Culex (if he feels he should). Don't spawn dozens of posts that basically mean nothing more than "how dare you?".



Well, that's the first comment that hasn't been a mindless shouting-down of someone with a different opinion (one well based, I might add). Thanks for the refreshing knowledge that there are people with sufficient EQs, btarunr. I direct you all to this review, one of many which support my comments:

http://en.hardspell.com/doc/showcont.asp?news_id=3807&pageid=3116


----------



## Culex (Aug 20, 2008)

alexp999 said:


> You have to remember some games have terrible crossfire scaling, same as Sli. Same as dual's and quad CPU's. I can guarantee across all games, the 4870X2 will NEVER be twice that of a single 4870, due to the scaling. And as for opposing fronts, I have never heard or seen a single game, which cannot have an FPS limited disabled, in some shape or form. I you sure you are not just running with vsync on?



True, SLI and CrossFire most definitely do not scale to 100%. What I was stating is that a 4870X2 would not under any circumstances be beaten by a single 4870 GPU unless it were faulty, just like a 9800GTX would not outperform a 4850 (nowhere near), especially in HL2. The same goes for a 4870 vs a GTX 260. Also, I'm not aware that CoH:OP has vsync as an option.


----------



## W1zzard (Aug 20, 2008)

Dear culex,

1) look at the summary page
2) look at the summary page again
3) when the fps cap is removed and it's not opposing fronts
4) look up "single" in a dictionary, the test setup page might also be of assistance
5) right back to you


----------



## Culex (Aug 20, 2008)

W1zzard said:


> Dear culex,
> 
> 1) look at the summary page
> 2) look at the summary page again
> ...



Dear W1zzard

1) My mistake, I should have viewed the test setup.
2) I simply assumed Opposing Fronts was being used, as it is the standard for all good reviewers. My mistake.
3) Regardless of whether or not you disabled one of the cores, it should still have beaten the GTX 260, which it did not (especially in TF2). Quite bluntly, if the 4870 does not CREAM a GTX 260 in Source games, let alone a 280, the review instantly lacks credibility and should not be used as a reference. Check out this review: http://en.hardspell.com/doc/showcont.asp?news_id=3807&pageid=3116
4) Right back to you


----------



## Mussels (Aug 20, 2008)

What makes you think a 4870X2 with one GPU disabled is, in fact, exactly the same as a 4870? The GPUs share PCI-E bandwidth (less total for one GPU), and it's common for dual-GPU cards to use lower clocks or higher-latency RAM to keep heat and cost down.


----------



## btarunr (Aug 20, 2008)

Yup, even with CFX disabled, the second GPU is still 'active', just not participating in graphics processing. One GPU not participating in CrossFire doesn't mean it's disabled, so it's not as if all the PCI-E x16 bandwidth is at the disposal of the working GPU; the one doing the rendering ends up with only 2.5 GB/s of bandwidth. It's really not the same as a single HD 4870 sitting cozy in an x16 slot. System bandwidth matters more to performance than the amount of memory available to a GPU, so don't let that "1 GB / GPU" figure mislead you.


----------



## Exavier (Aug 20, 2008)

I'd be enjoying my X2 if it worked with this DFI x48 mobo.. 
still, RMAing the mobo, now waiting to use it  though £320 at release was an excellent price


----------



## Tatty_One (Aug 20, 2008)

W1zzard said:


> there you go sir: http://www.techpowerup.com/reviews/Powercolor/HD_4850_PCS/26.html 30 pages for a vga review ..
> 
> we also changed the algorithm which builds the summary percentages slightly to be more intuitive. discuss the new per-resolution-graphs in the pcs 4850 comments thread please.



That's interesting....thank you. So contrary to some popular belief, the higher the resolution, the smaller the gap between the 280 and the X2 becomes. That surprises me a little!


----------



## Mussels (Aug 20, 2008)

Tatty_One said:


> That's interesting....thank you. So contrary to some popular belief, the higher the resolution, the smaller the gap between the 280 and the X2 becomes. That surprises me a little!



It's sort of common knowledge that higher resolutions are more CPU bound than GPU limited. The gaps always close, and of course an X2 is going to need a monster of a CPU to feed it.


----------



## Chewy (Sep 2, 2008)

I'm curious as to how the 4870X2 is doing with driver updates? Anyone know? I may get this card someday, but not if it's still lagging behind the 280 overall.


----------



## ShadowFold (Sep 2, 2008)

Well, my 4850 was already getting max FPS in all my games with 8.7, and it did the same with 8.8, so go with a 4870X2


----------



## Chewy (Sep 2, 2008)

I plan on getting this card down the road sometime, maybe. I was planning on getting it at release, but then it launched at $600 plus taxes where I live, and I spent my money on something else anyway.

Time will tell; this is the best card out atm though, from what someone told me on MSN.


----------



## ShadowFold (Sep 2, 2008)

Wait for the non refs man, it will be worth it


----------



## Ourasi (Sep 19, 2008)

Tatty_One said:


> That's interesting....thank you. So contrary to some popular belief, the higher the resolution, the smaller the gap between the 280 and the X2 becomes. That surprises me a little!



The HD4870X2 widens the gap to the GTX 280 the higher the resolution gets. How you can read those graphs so wrong is odd. Look them over one more time and you'll see that "the popular belief" is in fact the truth in this case...

The bad results the HD4870X2 receives in this review seem to come down to the P35 board he uses. Other reviews with low scores are popping up, and P35 seems to be the common factor. Why this is the case I can only speculate, but PCIe 1.1 might be the culprit. Newer drivers have surfaced since this review, but they do not seem to eliminate it. A few reviews show that a CrossFire setup of 2x HD4870 scores lower on an 8x-8x PCIe 2.0 P45 motherboard than on a 16x-16x PCIe 2.0 X38/X48 board. 8x PCIe 2.0 has the same bandwidth as 16x PCIe 1.1, but 8x PCIe 2.0 is improved in some other areas compared to 16x PCIe 1.1, so if 8x PCIe 2.0 is a limiting factor in some cases it wouldn't be surprising if 16x PCIe 1.1 is as well, especially with a beast like the HD4870X2...

In W1zzard's review there are some awful scores in a few games, and it will benefit a potential buyer of an HD4870X2 to check out other reviews on full 16x PCIe 2.0 motherboards like X38/X48/P45 before drawing a conclusion...
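The bandwidth equivalence asserted above (8x PCIe 2.0 matching 16x PCIe 1.1) is easy to check with a quick calculation, assuming the 8b/10b line encoding that PCIe 1.x/2.0 use and ignoring protocol overhead, which lowers real-world throughput:

```python
# Back-of-the-envelope PCIe one-way bandwidth for gen 1.x/2.0 links.
# Assumes 8b/10b encoding (10 raw bits per 8 data bits on the wire).

GT_PER_S = {1: 2.5, 2: 5.0}  # raw line rate per lane, gigatransfers/s

def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Usable one-way bandwidth in GB/s for a PCIe gen-1/2 link."""
    return GT_PER_S[gen] * lanes * 8 / 10 / 8  # 8b/10b, then bits -> bytes

print(pcie_bandwidth_gbps(1, 16))  # x16 PCIe 1.1 -> 4.0 GB/s
print(pcie_bandwidth_gbps(2, 8))   # x8  PCIe 2.0 -> 4.0 GB/s (same)
print(pcie_bandwidth_gbps(2, 16))  # x16 PCIe 2.0 -> 8.0 GB/s
```

So a P35's 16x PCIe 1.1 slot offers half the bandwidth of the 16x PCIe 2.0 slots on X38/X48/P45 boards, which is the crux of the argument above.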


----------



## Ourasi (Sep 19, 2008)

Mussels said:


> It's sort of common knowledge that higher resolutions are more CPU bound than GPU limited. The gaps always close, and of course an X2 is going to need a monster of a CPU to feed it.



He is wrong; the X2 widens the gap the higher the resolution gets. And you are wrong as well: in modern games the CPU becomes less of a limiting factor the higher the resolution and AA/AF you use. If you game at a high enough resolution in most of today's games, you will not see any difference between a $200 CPU and a $1000 CPU....


----------



## Tatty_One (Sep 19, 2008)

Ourasi said:


> He is wrong; the X2 widens the gap the higher the resolution gets. And you are wrong as well: in modern games the CPU becomes less of a limiting factor the higher the resolution and AA/AF you use. If you game at a high enough resolution in most of today's games, you will not see any difference between a $200 CPU and a $1000 CPU....



You are right that the X2 widens the gap in most games....but certainly not all. There is no difference in FPS growth between the X2 and the 280 from 1024 to 1920 in both COD4 and Team Fortress 2, and the gap actually narrows in Call of Juarez and Supreme Commander. Add to that that in this review the 280 also wins two of the game benches; that's six already. So yes, the X2 does "widen the gap" in more games, but not, as you have intimated, in all. Although my comment in an earlier post was also misleading, I only looked at Call of Juarez lol!

Ohh by the way, just found a really sweeeetttt deal for a 4870x2 in the UK and my finger is hovering over the "buy it now" button


----------



## Kwod (Sep 19, 2008)

alexp999 said:


> I can guarantee across all games, the 4870X2 will NEVER be twice that of a single 4870, due to the scaling



In future (and in some games right now), the 4870X2 2 GB will be 500% faster than 4870 512 MB CF because of the 512 MB of RAM......obviously, 2x 4870 1 GB will solve that problem.


----------



## Kwod (Sep 19, 2008)

Ourasi said:


> A few reviews show that a CrossFire setup of 2x HD4870 scores lower on an 8x-8x PCIe 2.0 P45 motherboard than on a 16x-16x PCIe 2.0 X38/X48 board. 8x PCIe 2.0 has the same bandwidth as 16x PCIe 1.1, but 8x PCIe 2.0 is improved in some other areas compared to 16x PCIe 1.1, so if 8x PCIe 2.0 is a limiting factor in some cases it wouldn't be surprising if 16x PCIe 1.1 is as well, especially with a beast like the HD4870X2...
> ...



P35 and P45 are ineffective for maxing out CF (but P45 should be okay for any single X2 card).......the action starts with X38/X48 with full 16x-16x.


----------



## Ourasi (Sep 19, 2008)

Kwod said:


> P35 and P45 are ineffective for maxing CF{but P45 should be okay for any x2 single card}.......the action starts with 38/48 with full 16x2.



There is no difference between P45 and X48 when a single PCIe slot is used; both have 16x PCIe 2.0. P35 uses 16x PCIe 1.1, which is the same as or slightly slower than 8x PCIe 2.0, and might be the reason for the bad scores in this review.
P35 suffers badly in CrossFire with its 16x-4x PCIe 1.1 arrangement, and I would not recommend CrossFire on this motherboard; same goes for the HD4870X2 IMHO...


----------



## Kwod (Sep 19, 2008)

Ourasi said:


> There is no difference between P45 and X48 when a single PCIe slot is used; both have 16x PCIe 2.0. P35 uses 16x PCIe 1.1, which is the same as or slightly slower than 8x PCIe 2.0, and might be the reason for the bad scores in this review.
> P35 suffers badly in CrossFire with its 16x-4x PCIe 1.1 arrangement, and I would not recommend CrossFire on this motherboard; same goes for the HD4870X2 IMHO...



I've seen articles on P45 performance with CF and it sux........but as you say, there are no problems with P45 and X2 cards.

P35 = dinosaur.

X2 = one card with 2 GPUs.
CF = 2 separate cards.


----------



## Ourasi (Sep 19, 2008)

It's how fast the graphics card is that decides whether the PCIe port becomes a bottleneck or not. The HD4870X2 has twice the power of a single HD4870 and therefore demands a lot more of the PCIe port in use. I do in fact believe that the P35's 16x PCIe 1.1 port, with lesser specs and half the bandwidth of 16x PCIe 2.0, becomes a bottleneck when paired with an HD4870X2.

I cannot prove this, since I do not have the hardware to test it, but every review I have seen so far with an HD4870X2 on a P35 motherboard scores a lot worse compared to the reviews using X38/X48/P45. Some tests show little difference, but many show severely crippled performance......


----------



## Kwod (Sep 20, 2008)

Ourasi said:


> I cannot prove this, since I do not have the hardware to test it, but every review I have seen so far with an HD4870X2 on a P35 motherboard scores a lot worse compared to the reviews using X38/X48/P45. Some tests show little difference, but many show severely crippled performance......



Sounds reasonable to me.....I'd also like to see 4 gig included or become the standard.


----------



## 1dude1 (Oct 12, 2008)

I have 2 of these cards in CrossFire, and I'm having a problem with fullscreen in games. Other than that, I have run Crysis in Very High quality with full anti-aliasing and not a glitch. I'm very upset though, because I can't play games like Supreme Commander to the fullest, because you have to do extra work to get a different view.


----------



## elitewolverine (Oct 17, 2008)

Wizz, first of all, great review. I always like your reviews and have been a reader for some time now, but never registered. Figured it was about time, I suppose.

But at the same time I gotta ask about your cons for this product...

No native HDMI support... why is that considered a con? With DVI it's a digital-to-digital transfer, making native HDMI a non-issue and extremely irrelevant...

And I was wondering about the overclocking: was AMD OverDrive used?

That tool is simply the easiest thing to use on the planet... if not, could you post (if you're still interested / still have the card) how high you can clock the GPU stably using AMD OverDrive?

And do you think your test rig would gain any performance from something like a 'Spider' setup, i.e. AMD chipset, AMD CPU (2/3/4 core), ATI card? I've seen tests that get about a 5% increase from that type of setup...

Thanks for your review, it was great. I wonder how long it took you to do all those tests... keep up the great work...


----------



## 1dude1 (Oct 21, 2008)

> a true monster, very impressive results at high resolutions. I can't believe it, ati has made awesome work.
> 
> You need at minimum 24" lcd panel to use with.
> 
> THIS IS CROZZFAIAHHHHHH !!



Just wondering if this could be my problem: I'm using 2 of these cards and I can't use fullscreen. Could it be the fact that right now I only have a 22-inch monitor?


----------



## Mussels (Oct 21, 2008)

1dude1 said:


> could it be the fact that right now i only have a 22 inch monitor?



no. no way in hell.


----------

