# AMD Radeon HD 4890 CrossFire



## W1zzard (Apr 1, 2009)

Our HD 4890 CrossFire review investigates whether running two of these brand-new cards is a viable choice if you need that extra bit of performance or want to run with even more eye candy than what is possible with a single HD 4890 board.

*Show full review*


----------



## ShadowFold (Apr 2, 2009)

Thanks for the review. And the addition of CrossFire benchmarks is awesome! I can't believe it scales 92% at 1920; they've come a long way, haven't they!
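For reference, "scaling" here is just the relative FPS gain from adding the second card. A tiny sketch with made-up FPS numbers (illustration only, not the review's actual data):

```python
def crossfire_scaling(fps_single: float, fps_crossfire: float) -> float:
    """Percentage gain of the dual-card result over a single card."""
    return (fps_crossfire / fps_single - 1.0) * 100.0

# Hypothetical FPS values, for illustration only:
# 96 FPS in CrossFire vs 50 FPS on one card is ~92% scaling.
print(round(crossfire_scaling(50.0, 96.0)))
```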


----------



## DaMulta (Apr 2, 2009)

I think this is the first CF or SLI review I have EVER seen you post, Wiz.

Has there been another?

OMG, two is better than one is all you can really say after seeing that review...


----------



## HammerON (Apr 2, 2009)

Thanks for the review


----------



## mrw1986 (Apr 2, 2009)

> HD 4890 CrossFire is available on any AMD or Intel chipset motherboard that has two PCI-Express slots.



That's not true... is it?
I thought it was only available on CrossFire-compatible mobos...


----------



## ShadowFold (Apr 2, 2009)

mrw1986 said:


> That's not true... is it?
> I thought it was only available on CrossFire-compatible mobos...



CrossFire's been like that for a while, where have you been?


----------



## mrw1986 (Apr 2, 2009)

ShadowFold said:


> CrossFire's been like that for a while, where have you been?



Obviously not paying attention, haha!

ATI's site says it only works on specific motherboards, then they list the chipsets.

EDIT: Holy crap, I'm a tard...Sorry, it's late and I misread it...I kept thinking nVidia chipsets (650i,680i,750i,780i) lol...


----------



## Drizzt5 (Apr 2, 2009)

I like this review, it was a great comparison with some good results...

CrossFire/SLI isn't that bad, especially with the better ATI drivers now... For that 21% difference, if I was going for performance I'd skip the GTX 285 and go for 4890 CF.


----------



## v12dock (Apr 2, 2009)

It's spanking, ty for the review.


----------



## CStylen (Apr 2, 2009)

My 280 suddenly feels like a dated, crusty, craggly piece of hardware.

The 4890 looks sweet, thanks for the review!

random ADD prediction: ATI 5870X2 = 2 4890's


----------



## Drizzt5 (Apr 2, 2009)

CStylen said:


> My 280 suddenly feels like a dated, crusty, craggly piece of hardware.
> 
> The 4890 looks sweet, thanks for the review!
> 
> random ADD prediction: ATI 5870X2 = 2 4890's



I really doubt that. That would be a gay naming scheme on AMD's part. It would be logical to be 4890x2.

280 is still plenty powerful.... I'm sure your fps is high enough


----------



## CStylen (Apr 2, 2009)

It's just depressing watching your video card go further down the list lol...


----------



## ChaoticAtmosphere (Apr 2, 2009)

Thanks for the review Wizz!


----------



## TooFast (Apr 2, 2009)

by the looks of the benchies, a 4890x2 would beat the gtx295... and it would use less power than the 4870x2


----------



## Assassin48 (Apr 2, 2009)

TooFast said:


> by the looks of the benchies, a 4890x2 would beat the gtx295... and it would use less power than the 4870x2



If a 4890X2 came out I would sell my 4870X2 w/ Koolance water block for the price of a 4890X2,

maybe a little less with free shipping.


----------



## TooFast (Apr 2, 2009)

wow we are such pc junkies...sitting in front of our pcs waiting for the review...lol now we have the rv870 on the way


----------



## ShadowFold (Apr 2, 2009)

TooFast said:


> wow we are such pc junkies...sitting in front of our pcs waiting for the review...lol now we have the rv870 on the way



It's 12:28 where I am. I got school in 7 hours lol I stayed up for this review specifically.


----------



## TooFast (Apr 2, 2009)

so did I.


----------



## ChaoticAtmosphere (Apr 2, 2009)

ShadowFold said:


> It's 12:28 where I am. I got school in 7 hours lol I stayed up for this review specifically.



Haha! Me too...it's 1:30 here and I've got to be at work in 6 1/2 hours...


----------



## -1nf1n1ty- (Apr 2, 2009)

YAY, so this means I can get two 4890s instead of the 4850X2s, woooo


----------



## DaMulta (Apr 2, 2009)

-1nf1n1ty- said:


> YAY, so this means I can get two 4890s instead of the 4850X2s, woooo



Dual 4850X2s would outrun it, IMO.

4 GPUs OCed vs 2 GPUs that OC very little.

In some games it would be slower, in many more it would be faster.
Did you see how close the X2 was in many of those? It was behind in some as well, to be honest.

It would cost the same to go either way.


----------



## -1nf1n1ty- (Apr 2, 2009)

DaMulta said:


> Dual 4850X2s would outrun it, IMO.
> 
> 4 GPUs OCed vs 2 GPUs that OC very little.
> 
> ...



yeah thats true, we'll see what I will decide


----------



## VulkanBros (Apr 2, 2009)

Thanx for the review... but it unfortunately has a BIG downside...
Well... I am drooling... and my wallet is crying... hmmm, that also would mean I have to buy a bigger monitor...
Like CStylen I can see my GTX 285 dropping down the list...


----------



## eidairaman1 (Apr 2, 2009)

TooFast said:


> by the looks of the benchies, a 4890x2 would beat the gtx295... and it would use less power than the 4870x2



Ya, considering the 295 is practically 2x 260s in SLI, and the 4890 in CrossFire takes on the 295, it's a very good performance range AMD released, and AMD releases very little in the top-end bracket.


----------



## Animalpak (Apr 2, 2009)

eidairaman1 said:


> Ya, considering the 295 is practically 2x 260s in SLI, and the 4890 in CrossFire takes on the 295, it's a very good performance range AMD released, and AMD releases very little in the top-end bracket.



Are you sure? I mean the GTX 295 is 2x GTX 280, not 2x GTX 260...


----------



## eidairaman1 (Apr 2, 2009)

Bus Width is what tells it.


----------



## btarunr (Apr 2, 2009)

Animalpak said:


> Are you sure? I mean the GTX 295 is 2x GTX 280, not 2x GTX 260...



No, GTX 295 is 2x275GTX.


----------



## GandalfNYC (Apr 2, 2009)

Animalpak said:


> Are you sure? I mean the GTX 295 is 2x GTX 280, not 2x GTX 260...



It's neither, actually... my guess would be that *btarunr* is correct, though I have not researched the specs on the 275GTX.
I suppose my GTX 285 @ 747/1581/2664 will have to suffice for now.

It's nice to see that AMD continues to release high quality, reasonably priced graphics cards.  This can only be good for the consumer!

This is my first post here, and I'm sure it has not impressed anyone but I gotta start somewhere, right??  
And yes, if you recognize me from my fame on another non-related site, it's really me and not some numbnut impersonator.


----------



## BrooksyX (Apr 2, 2009)

Sweet review. Thanks.


----------



## btarunr (Apr 2, 2009)

Guys, each digg saves a kitten: http://digg.com/hardware/techPowerUp_AMD_Radeon_HD_4890_CrossFire_Review


----------



## GandalfNYC (Apr 2, 2009)

btarunr said:


> Guys, each digg saves a kitten: http://digg.com/hardware/techPowerUp_AMD_Radeon_HD_4890_CrossFire_Review



Done, I registered and clicked "digg" as you requested.
I would like to encourage others to do the same.

*Now, how about welcoming me to this site?*


----------



## RomeoX47 (Apr 2, 2009)

W1zzard
really, thanks for this amazing test of the 4890 CF


----------



## GandalfNYC (Apr 2, 2009)

btarunr said:


> Guys, each digg saves a kitten: http://digg.com/hardware/techPowerUp_AMD_Radeon_HD_4890_CrossFire_Review





GandalfNYC said:


> Done, I registered and clicked "digg" as you requested.
> I would like to encourage others to do the same.
> 
> *Now, how about welcoming me to this site?*



Looks like my encouragement was enough for 3 others to do the same... still waiting for a proper welcome!


----------



## Studabaker (Apr 2, 2009)

I Digged it.


----------



## RomeoX47 (Apr 2, 2009)

I digged it too, I'm the 7th digg, lol, lucky number. "I should be so lucky, lucky lucky lucky", lol, '80s Kylie Minogue song.


----------



## Troubled (Apr 2, 2009)

I would love to see AMD come out with a Quad Core GPU...that would be great....then be able to CF them....*DROOL*...Just think about having 16 Graphics Processing Cores (4 Cards with 4 Cores each)...


----------



## Frizz (Apr 2, 2009)

Wow, that's awesome, I knew ATI would make a comeback! From the benchmarks it looks like it competes with the GTX 285 and nothing lower. It actually beat the GTX 295 in CrossFire; can't wait to see the 4890X2. Lots of people were saying it wouldn't be much of an improvement over the 4870... Well, now I see why the 4870 dropped in price so much, because this looks like a possible new crown holder.


----------



## hooj (Apr 2, 2009)

Awesome review !


----------



## Scrizz (Apr 2, 2009)

must sell 4850s


----------



## mdm-adph (Apr 2, 2009)

Anybody know where 2xGTX285's would score on that list?  Probably about equal to the 2x4890's?


----------



## Salsoolo (Apr 2, 2009)

To TPU:
you need newer games for the bench.


----------



## HolyCow02 (Apr 2, 2009)

holy crossfire beatdown on the 295. Damn those things scale pretty well! Great review!


----------



## ChaoticAtmosphere (Apr 2, 2009)

btarunr said:


> Guys, each digg saves a kitten: http://digg.com/hardware/techPowerUp_AMD_Radeon_HD_4890_CrossFire_Review



I saved a kitten!


----------



## ChaoticAtmosphere (Apr 2, 2009)

GandalfNYC said:


> Looks like my encouragement was enough for 3 others to do the same... still waiting for a proper welcome!



Welcome to TPU Gandalf!! I'm sure it's because we're too busy drooling over the new 4890!!


----------



## DarkMatter (Apr 2, 2009)

Why doesn't power management work in CrossFire? Man, 308 W at idle is just way too much. So much that Wizz should have added at least 3-6 months of the 100 W electricity bill to the perf/price calculation.


----------



## Xphobe (Apr 2, 2009)

Great review, but I would LOVE to see a max-OCed 4890 vs an OCed GTX 275. I wonder who would really be the winner then?


----------



## erocker (Apr 2, 2009)

GandalfNYC said:


> Looks like my encouragement was enough for 3 others to do the same... still waiting for a proper welcome!



The white wizzard of NYC approaches!  Welcome to TPU!  I digg as much as possible and generally am the first to digg reviews here.  Not only does it save kittens but it keeps sheep fed too!


----------



## zithe (Apr 2, 2009)

WOW. It really showed an increase over the 295 and 4870x2. I wonder if the 5850 will be to the 4870x2 like the 4850 is to the 3870x2. That'd be a nice jump.


----------



## Rabidpeanut (Apr 2, 2009)

Man, this sucks soooooooooo hard. CrossFire for the HD 4870 in Crysis hardly scales at all; in fact I get a loss of 1 frame/s. ATI makes my heart heavy. I specifically bought the damn things in CrossFire so that I WOULD get a playable framerate specifically in CRYSIS. Gay. Very very very gay. I am even considering going to Vista 64 so I can use Catalyst 9.3 for the 9.2 CrossFire gain in Crysis Warhead.


----------



## Drizzt5 (Apr 2, 2009)

Did you try the tricks and tweaks to get CrossFire to scale well with Crysis? They're out there somewhere (console commands, maybe renaming the exe, stuff like that... I may look into it). And a loss of 1 frame? It scales decently enough for me; I get a few more FPS out of my 2nd 4850, sometimes 2-5 FPS more, sometimes even 10. But in other games I get much, much more.


----------



## eidairaman1 (Apr 2, 2009)

DarkMatter said:


> Why doesn't power management work in CrossFire? Man, 308 W at idle is just way too much. So much that Wizz should have added at least 3-6 months of the 100 W electricity bill to the perf/price calculation.



You always have something to gripe about when it's AMD that is kicking your card's butt in CrossFire.


----------



## DarkMatter (Apr 2, 2009)

eidairaman1 said:


> You always have something to gripe about when it's AMD that is kicking your card's butt in CrossFire.



Wohooo here comes the poor hurt fanboi. You feel bad? Poor boy...

I will always have something to say about such high and absurd power consumption. Are you telling me it isn't high? 100 W over a period of one year accounts for a 25-100 euro increase in the electricity bill here (in my case it exceeds that amount), depending on how much you use the PC; effectively something to take into account. And the Zotac GTX 275, FYI: http://forums.techpowerup.com/showpost.php?p=1294352&postcount=4 - you'll notice I have something to gripe about with that card too. What is that? Hmm, let's see. Power consumption? The difference? Other GTX 275s consume much less in other reviews...
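A quick back-of-the-envelope sketch of that arithmetic (the €0.15/kWh tariff and the usage hours below are my assumptions, not figures from the thread):

```python
def annual_cost_eur(extra_watts: float, hours_per_day: float,
                    eur_per_kwh: float = 0.15) -> float:
    """Yearly electricity cost of an extra power draw at a flat tariff."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

# Cost of an extra 100 W at different daily usage levels:
for hours in (4, 12, 24):
    print(f"{hours:>2} h/day: {annual_cost_eur(100, hours):6.2f} EUR/year")
```

Depending on usage and tariff, that lands roughly in the 20-130 euro per year range, which is the same ballpark as the figure quoted above.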


----------



## eidairaman1 (Apr 3, 2009)

DarkMatter said:


> Wohooo here comes the poor hurt fanboi. You feel bad? Poor boy...
> 
> I will always have something to say about such high and absurd power consumption. Are you telling me it isn't high? 100 W over a period of one year accounts for a 25-100 euro increase in the electricity bill here (in my case it exceeds that amount), depending on how much you use the PC; effectively something to take into account. And the Zotac GTX 275, FYI: http://forums.techpowerup.com/showpost.php?p=1294352&postcount=4 - you'll notice I have something to gripe about with that card too. What is that? Hmm, let's see. Power consumption? The difference? Other GTX 275s consume much less in other reviews...



So did you complain about the 8800 GTX/Ultra? I never heard you complain about them, and they were ridiculous.


----------



## zithe (Apr 3, 2009)

Rabidpeanut said:


> Man, this sucks soooooooooo hard, crossfire for the hd4870 in crysis hardly even scales at all, in fact i get a loss of 1 frame/s. ATI make my heart heavy. I specifically bought the damn things in crossfire so that i WOULD get a playable framerate specifically in CRYSIS. Gay. Very very very gay. I am even considering going vista 64 so i can use catalyst 9.3 for the 9.2  crossfire gain in crysis warhead.



..Did you look at reviews or read what people post every day? ATI hardly scales in Crysis.



DarkMatter said:


> Why doesn't power management work in CrossFire? Man, 308 W at idle is just way too much. So much that Wizz should have added at least 3-6 months of the 100 W electricity bill to the perf/price calculation.



Because it doesn't love you.


----------



## Animalpak (Apr 3, 2009)

Nice and deep review. Two of these beasts... Dude, this CrossFire setup is amazing, best results in (almost) every game...

Wow, ATI, just wow.


----------



## DarkMatter (Apr 3, 2009)

eidairaman1 said:


> So did you complain about the 8800 GTX/Ultra? I never heard you complain about them, and they were ridiculous.



Those are nowhere near that power consumption, lol. Nowhere near. And still, yes, I complained. Whether you saw me doing it or not is not my problem, lol.

But yes, in order to satisfy your desire to call me a fanboy again, I'll tell you that, effectively, I complained much more about the HD 2900 XT, but that was of course because I'm an Nvidia fanboy. It had nothing to do with the fact that it consumed much more than the 8800 Ultra while being 25% slower. Nothing to do with it, puuuuure fanboyism.


----------



## HammerON (Apr 3, 2009)

GandalfNYC said:


> Done, I registered and clicked "digg" as you requested.
> I would like to encourage others to do the same.
> 
> *Now, how about welcoming me to this site?*



Kitten saved - even though I don't care for cats
Welcome to TPU GandalfNYC


----------



## Wile E (Apr 3, 2009)

I couldn't care less about power consumption. 

Although performance is very impressive, this does tell me that 4890 + 4890X2 (when and if it releases) is not a worthy upgrade for me.


----------



## neon neophyte (Apr 3, 2009)

I'm faced with a problem.

I've been eyeing up a GTX 295 for a while now, and was just about to bite on getting one. I just saw the 4890 CF review and it's changed my perspective slightly. Both are phenomenal overclockers: the 295 can be clocked up to 684 MHz on many samples, which is a lot for a GTX, and the 4890s are hitting 1 GHz with the Asus volt mod. Power consumption is higher on the 4890s, especially at idle. The 4890s make a buttload of noise; the GTX 295 is pretty quiet. I can find a GTX 295 for 627 Canadian after tax and delivery; a 4890 is 269 before tax and delivery. I just don't know what to do!!!! GTX 295 or 4890 CF... HELP.

*edit* I'd like to assume there are more than just fanboys here.


----------



## Binge (Apr 3, 2009)

Get both.  Your indecisive nature can't be helped by the inevitable fanboy flame-war that might happen since you gave people the initiative to pick sides...


----------



## magibeg (Apr 3, 2009)

Can't we just like the fact that crossfire seems to scale well with the 4890? If you're buying 2 of them i don't think you're pinching pennies power wise.


----------



## eidairaman1 (Apr 3, 2009)

DarkMatter said:


> Those are nowhere near that power consumption, lol. Nowhere near. And still, yes, I complained. Whether you saw me doing it or not is not my problem, lol.
> 
> But yes, in order to satisfy your desire to call me a fanboy again, I'll tell you that, effectively, I complained much more about the HD 2900 XT, but that was of course because I'm an Nvidia fanboy. It had nothing to do with the fact that it consumed much more than the 8800 Ultra while being 25% slower. Nothing to do with it, puuuuure fanboyism.



At least you admit you are one, because I never called you a fanboy; you said it yourself. I just noticed you come into most ATI/AMD-related topics and bitch and whine.


----------



## TooFast (Apr 3, 2009)

neon neophyte said:


> I'm faced with a problem.
> 
> I've been eyeing up a GTX 295 for a while now, and was just about to bite on getting one. I just saw the 4890 CF review and it's changed my perspective slightly. Both are phenomenal overclockers: the 295 can be clocked up to 684 MHz on many samples, which is a lot for a GTX, and the 4890s are hitting 1 GHz with the Asus volt mod. Power consumption is higher on the 4890s, especially at idle. The 4890s make a buttload of noise; the GTX 295 is pretty quiet. I can find a GTX 295 for 627 Canadian after tax and delivery; a 4890 is 269 before tax and delivery. I just don't know what to do!!!! GTX 295 or 4890 CF... HELP.
> 
> *edit* I'd like to assume there are more than just fanboys here.




I would go for the 4890s. Either way you are using 2 GPUs, you might as well go for the faster setup!


----------



## DarkMatter (Apr 3, 2009)

eidairaman1 said:


> At least you admit you are one, because I never called you a fanboy; you said it yourself. I just noticed you come into most ATI/AMD-related topics and bitch and whine.



Haha, I admitted it? You know what sarcasm is? It was clear you were implying I am one, don't try to hide in a corner now.
YOU think I always ONLY bitch about AMD because it's your precious, and because you hardly ever enter Nvidia threads to begin with. You just assume. You aren't even able to see the cons of ATI cards; you can forgive anything when it's AMD, I don't. When I say something bad about Nvidia cards or technology it passes totally unnoticed, but when I remotely say something that is not good for AMD/ATI, hohohoho man, you are in trouble. I will repeat the important part you purposely missed: I will always bitch, complain, call it what you want, about things that simply are not correct, and twice the power consumption, at 308 W, is way too much. And yes, 308 vs 230 is twice as much on the GPU. The very first day that AMD releases something without a big drawback I will not say anything, but that hasn't happened since the X1900 series, the last ATI card I had.

Power consumption matters; I don't care if you don't care about power consumption. Power consumption affects the *true* price of owning a card, so you can't be shouting about better performance/price when the card will cost you $50 more per year in electricity than the other card you are comparing it to. It's that simple. Period.

EDIT: I will translate the sarcasm about the 8800 GTX/Ultra for you, BTW. How in hell could I bitch about the Ultra's power consumption when the much slower HD 2900 XT consumed more? Tell me. That's one more example of how you are unable to see the bad when it comes to ATI/AMD. The 8800 was ridiculous? Then what should we think of the ONLY competition at the time?? Not to mention that the 8800 GTX's power consumption was around the same as the X1900 XTX's. No, when that happens you just assume it's the trend, à la "this performance requires this consumption". That's *not* the case right now.


----------



## RomeoX47 (Apr 3, 2009)

It is possible to mix in CF a Shappire 4870 & Shappire 4890 ?
lol?


----------



## Wile E (Apr 3, 2009)

DarkMatter said:


> Haha, I admitted it? You know what sarcasm is? It was clear you were implying I am one, don't try to hide in a corner now.
> YOU think I always ONLY bitch about AMD because it's your precious, and because you hardly ever enter Nvidia threads to begin with. You just assume. You aren't even able to see the cons of ATI cards; you can forgive anything when it's AMD, I don't. When I say something bad about Nvidia cards or technology it passes totally unnoticed, but when I remotely say something that is not good for AMD/ATI, hohohoho man, you are in trouble. I will repeat the important part you purposely missed: I will always bitch, complain, call it what you want, about things that simply are not correct, and twice the power consumption, at 308 W, is way too much. And yes, 308 vs 230 is twice as much on the GPU. The very first day that AMD releases something without a big drawback I will not say anything, but that hasn't happened since the X1900 series, the last ATI card I had.
> 
> Power consumption matters; I don't care if you don't care about power consumption. Power consumption affects the *true* price of owning a card, so you can't be shouting about better performance/price when the card will cost you $50 more per year in electricity than the other card you are comparing it to. It's that simple. Period.
> ...



If somebody is even remotely considering something as graphically powerful as 4890 CrossFire, I'm willing to bet the majority of them aren't concerned about power consumption.

I know I'm sure as hell not. In fact, I flash all of my cards, be it my ATI's or my nVidias, to operate on 3d voltage and speeds 24/7. I completely eliminate any traces of a 2D profile.

I shut off all power saving options on every aspect of my rig. With a machine this powerful, what's the sense in trying to save power? Just buy a damn low-powered rig for 24/7 use, and only turn on the powerful rig for gaming or heavy tasks.


----------



## DarkMatter (Apr 3, 2009)

Wile E said:


> If somebody is even remotely considering something as graphically powerful as 4890 CrossFire, I'm willing to bet the majority of them aren't concerned about power consumption.
> 
> I know I'm sure as hell not. In fact, I flash all of my cards, be it my ATI's or my nVidias, to operate on 3d voltage and speeds 24/7. I completely eliminate any traces of a 2D profile.
> 
> I shut off all power saving options on every aspect of my rig. With a machine this powerful, what's the sense in trying to save power? Just buy a damn low-powered rig for 24/7 use, and only turn on the powerful rig for gaming or heavy tasks.



I don't have a problem with that; I'm not talking about that, in fact. But hell, don't say that CF is better on a performance-per-dollar basis then. It's that simple: how many times have I heard "buy two cards in SLI/CrossFire instead of a single GTX 280/285, it's cheaper and/or faster"? After 6-12 months of use the single-card solution becomes much cheaper. That wasn't truly the case when cards were sold for $500, as 2x $500 vs 2x $550 isn't much of a difference, but when recommending $200-250 cards, $50 becomes very relevant. You can't exclude power consumption when talking about money. Period.

You want the fastest no matter the price? I'm sure GTX 285 SLI is faster, and the price difference will be mitigated by the lower electricity bill.


----------



## Wile E (Apr 3, 2009)

DarkMatter said:


> I don't have a problem with that; I'm not talking about that, in fact. But hell, don't say that CF is better on a performance-per-dollar basis then. It's that simple: how many times have I heard "buy two cards in SLI/CrossFire instead of a single GTX 280/285, it's cheaper and/or faster"? After 6-12 months of use the single-card solution becomes much cheaper. That wasn't truly the case when cards were sold for $500, as 2x $500 vs 2x $550 isn't much of a difference, but when recommending $200-250 cards, $50 becomes very relevant. Period.
> 
> You want the fastest no matter the price? I'm sure GTX 285 SLI is faster, and the price difference will be mitigated by the lower electricity bill.


I wasn't commenting on price. That's a whole other ball game right there, lol. Although those that have a CrossFire board may want to consider this over an nVidia solution, so it still has its place in the market. These still do edge out the GTX 295, the most powerful nVidia solution they could run.


----------



## DarkMatter (Apr 3, 2009)

Wile E said:


> I wasn't commenting on price. That's a whole other ball game right there, lol. Although those that have a CrossFire board may want to consider this over an nVidia solution, so it still has its place in the market. These still do edge out the GTX 295, the most powerful nVidia solution they could run.



You won't get anything close to that performance on anything but Core i7, so the CrossFire-only-board case is irrelevant. Recommending this CrossFire over a GTX 295/HD 4870 X2, or even a single GTX 285, for a Core 2 system based on the performance numbers seen here is misleading, to say the least.

Anyway, all my comments were regarding perf/price, so what's the point of your reply if not just to argue?


----------



## Wile E (Apr 3, 2009)

DarkMatter said:


> You won't get anything close to that performance on anything but Core i7, so the CrossFire-only-board case is irrelevant. Recommending this CrossFire over a GTX 295/HD 4870 X2, or even a single GTX 285, for a Core 2 system based on the performance numbers seen here is misleading, to say the least.
> 
> Anyway, all my comments were regarding perf/price, so what's the point of your reply if not just to argue?



No, an OC'ed Yorkfield quad is more than enough to run those setups without bottlenecking, especially at high resolutions. I get no difference in framerates between 3.6Ghz and 4.4Ghz on my setup @ 1920x1200.

And, from what I read in your posts, the main basis of your argument against 4890 xfire was power consumption. At least that's the way it came across.


----------



## DarkMatter (Apr 3, 2009)

Wile E said:


> No, an OC'ed Yorkfield quad is more than enough to run those setups without bottlenecking, especially at high resolutions. I get no difference in framerates between 3.6Ghz and 4.4Ghz on my setup @ 1920x1200.



It's not. And it doesn't matter how much you overclock the CPU; Core i7, even the 920 at stock, means a massive increase in performance over the highest-clocked Core 2 in the world when it comes to multi-GPU. It's probably because of the platform and triple-channel memory, idk.



> And, from what I read in your posts, the main basis of your argument against 4890 xfire was power consumption. At least that's the way it came across.



Because power consumption = price, money. In fact, not only because of how much you pay for the electricity consumed by the card, but because in summer you have to pay a similar amount again to cool your house because of the added heat. Believe me, my brother has been away from home for a month many times, and I have done the same; we spent the same on electricity except that his PC was shut down, and that alone made a difference of 10 euros in the bill. It has happened enough times to see the pattern. We hardly spend electricity on anything except the PCs, so it's clear where the bill comes from.

EDIT: I read all my posts again just to be sure and there's hardly a single one where I don't mention the perf/price and electricity expenses, BTW.


----------



## neon neophyte (Apr 3, 2009)

I actually think the multi-GPU problem with Core 2s stems from the fact that they aren't really quad cores, but rather 2x2. It is increasingly difficult to write code that is optimized for the Core 2's 2x2 setup. I think this is done much more easily on a true quad, Phenom II or Core i7. Both are true quad cores.


----------



## mdm-adph (Apr 3, 2009)

magibeg said:


> Can't we just like the fact that crossfire seems to scale well with the 4890? If you're buying 2 of them i don't think you're pinching pennies power wise.



Thanks for pointing that out.  It's like someone complaining about the gas mileage on a Ferrari -- *nobody gives a shit.*  Yet, there's always people who'll complain about it, and not just from a technical aspect, either.  :shadedshu


----------



## neon neophyte (Apr 3, 2009)

some of us are in the market for a tesla roadster.

point is. energy efficiency matters. i never want my computer using 2000w. ever. 1000w is too high to be honest. i want performance increases without the extra cost in energy. thats true innovation. efficiency is innovation. just simply pumping more power into aging technology is... less innovative. the 4890 does a nice job of cleaning up its idle waste energy. kudos to the ati team on that.

thats what made core2s so great over p4s. they consumed far less electricity per core without sacrifice to performance. try to understand this matters to many. including myself.

*edit* @romeo i dont think you can crossfire a 4870 and a 4890. i think i read that somewhere. i think i also read that the 4890 does burst on the memory and the 4870 doesnt... or something. there are differences. even if it is the same gpu.


----------



## btarunr (Apr 3, 2009)

ahem? Why is everyone so worked up about the power consumption?

The review said:

> Speaking of power, you will be surprised to notice that a pair of HD 4890 accelerators consumes nearly the same (in fact less) amount of power as a single Radeon HD 4870 X2 accelerator, on peak and average scales.


----------



## Urlyin (Apr 3, 2009)

btarunr said:


> ahem? Why is everyone so worked up about the power consumption?



They also need to take the noise to the General Nonsense thread instead of spamming the review thread...


----------



## 3870x2 (Apr 3, 2009)

Drizzt5 said:


> I really doubt that. That would be a gay naming scheme on AMD's part. It would be logical to be 4890x2.
> 
> 280 is still plenty powerful.... I'm sure your fps is high enough



I don't know; due to newer drivers, my 9800GX2 is whooping the GTX 280 and 285. Talk about a good investment!


----------



## Apocolypse007 (Apr 3, 2009)

CStylen said:


> It's just depressing watching your video card go further down the list lol...



my 3870 isn't even on the list!


----------



## mdm-adph (Apr 3, 2009)

btarunr said:


> ahem? Why is everyone so worked up about the power consumption?



Never underestimate the wrath of the angry Internet fanboy.  If it's not speed, it's power consumption.  If it's not power consumption, it's HD audio quality.  

If it's not that, it's color, or shape, or size, or something equally moronic.


----------



## DarkMatter (Apr 3, 2009)

Urlyin said:


> They also need to take the noise to the General Nonsense thread instead of spamming the review thread...



TBH I don't understand why you have the "discuss this article in our forums" link if discussing such a critical thing as power consumption is considered so bad as to be called spam that belongs in the nonsense thread. In fact, this is not the first time a mod has suggested any discussion should stay out of this kind of thread. I'd suggest you save time (both yours and ours) and put a big "Thank Wizzard" button instead.

Anyway, it's funny how a comment that was supposed to be a joke has led to this, but like any joke it held a bit of truth, and I know I'm right on this point. The fact is that people still fail to understand that my comment is not about the power consumption itself, but about how power consumption can greatly change the true price of a card, rendering any perf/price chart irrelevant.

I'm intrigued at this point by how so many people on these forums can care so little about power consumption, but then are so concerned about the slightest increase in the price of the cards. Either they are a bunch of oblivious people or just some 12-year-old boys who don't pay the bills and obviously don't know how much it costs to earn the money to pay them. In either case they need to understand that more power = more money, so as long as you put money into the equation, multi-GPU solutions lose the battle.


P.S. I actually know people who bought a Porsche/BMW sports car and were later unable to maintain it: gas, tires, fixes, everything... and the car ended up in the garage or on eBay. You know, they won the lottery and went "Oh, I can buy a Porsche with this money", and they effectively could buy the car.

With graphics cards it's not exactly the same, but similar, just not that dramatic. What's the point of having a perf/price chart in that case? What's the point of saving $20 when buying the cards if it is going to cost you twice as much in just one year of use?


----------



## DarkMatter (Apr 3, 2009)

btarunr said:


> ahem? Why is everyone so worked up about the power consumptions?



Ahem? I see the charts and the X2 consumes 230w at idle, not 308. I think that's a significant difference. And maybe it's just me, but I think the cards are idling most of the time the PC is powered on...


----------



## ChaoticAtmosphere (Apr 3, 2009)

Urlyin said:


> They also need to take the noise to the General Nonsense thread instead of spamming the review thread...



That was a hint.

Man, this 4890 kicks ass!!!!


----------



## rangerone766 (Apr 3, 2009)

has anyone seen a xfire review of a 4890 xfire'd with a 4870? just curious of that combos performance.

i'm thinking of picking one of these new 4890's up and was curious what to expect.


----------



## btarunr (Apr 3, 2009)

DarkMatter said:


> Ahem? I see the charts and the X2 consumes 230w at idle, not 308. I think that's a significant difference. And maybe it's just me, but I think the cards are idling most of the time the PC is powered on...



The GTX 275 consumes more power than the HD 4890 in both average (load) and peak load, and has worse performance/watt (including the Zotac Amp!).  The difference between the two in the idle chart is dwarfed by the margins by which HD 4890 leads GTX 275 in three other factors.

Neither of us can come to conclusions on what people's PC usage patterns are, and hence it boils down to how many factors a product leads in, numerically.


----------



## DarkMatter (Apr 3, 2009)

btarunr said:


> The GTX 275 consumes more power than the HD 4890 in both average (load) and peak load, and has worse performance/watt (including the Zotac Amp!).  The difference between the two in the idle chart is dwarfed by the margins by which HD 4890 leads GTX 275 in three other factors. And maybe it's just me, but idle power consumption is less of a factor than average, load, and performance/watt when you're looking at high-end PC hardware.



The GTX275 doesn't consume more than the HD4890 on average. W1zzard's "average" is *not* really an average under normal usage.



> # Average: 3DMark03 Nature at 1280x1024, 6xAA, 16xAF. This results in the highest power consumption. Average of all readings (two per second) while the test was rendering (no title screen).



^^ I would call that consumption under load.

As for the Zotac, it's some issue with the card W1zzard got. In almost every other review out there the GTX275 is the lowest-consuming card at idle, e.g. http://www.anandtech.com/video/showdoc.aspx?i=3539&p=22

EDIT: How much time do you spend playing versus on the internet, watching videos, mailing, working...?


----------



## btarunr (Apr 3, 2009)

DarkMatter said:


> The GTX275 doesn't consume more than the HD4890 on average. W1zzard's "average" is *not* really an average under normal usage.
> 
> 
> 
> ...



Then you disapprove of our testing methods. Ends our discussion. Feel free to Google your way to the review that proves your point best.


----------



## DarkMatter (Apr 3, 2009)

btarunr said:


> Then you disapprove of our testing methods. Ends our discussion. Feel free to Google your way to the review that proves your point best.



LOL. I'm not saying that. But are you going to say that running 3DMark simulates the average usage of a card? Also, if you look at W1zzard's charts (http://www.techpowerup.com/reviews/Powercolor/HD_4890/27.html), the GTX285 is just above the HD4890 in power consumption in that "average": 322 vs 324. Do you really believe the true GTX275 consumes more than the GTX285? Come on...

And furthermore, I'll repeat this question, because I didn't like your comment TBH: if you don't want criticism of the reviews, why do you have a discussion thread for them?


----------



## Marineborn (Apr 3, 2009)

wow, this is the stupidest thing I've heard people argue about... power consumption!! Why are you buying 700+W power supplies if you're so damn worried about power consumption? Go green, shut your mouth and play some solitaire! Hahaha, damn... I can find a tree huggers' forum for you if you're interested. I have a 1300w PSU in my system for a reason: to suck that outlet dry, and I look forward to doing it! ...I'm just happy something finally gave that damn GTX295 some competition, and yes I'm an ATI fanboy, but that doesn't mean I won't give the Nvidia card credit for holding its place for a while without getting knocked down. It's time for big red to step in. GREAT review by the way, W1zzard, thank you very much.

this is good for both companies; ATI is the only thing that keeps Nvidia in line and its prices even halfway decent, and that is a good thing.

I'll wait for the X2 version of this card, buy two of them, crossfire them and giggle like a schoolgirl


----------



## W1zzard (Apr 3, 2009)

DarkMatter said:


> W1zzard's "average" is *not* really an average under normal usage.



so what do you propose as "average" ? 

i am using a large number of measurements taken while 3dmark is running and then calculate the average. 

the reason for 3dmark is that it is standard, easily obtainable, repeatable, supports multi gpu, is well supported, optimized for by all drivers and has a sufficiently high power consumption (not cpu bound).

there is neither a definition of "average" for gpu power consumption nor a definition for "normal usage". but again, bring forward your suggestions


----------



## W1zzard (Apr 3, 2009)

322 vs. 324 should be considered equal. 0.62% is hardly a difference.


----------



## DarkMatter (Apr 3, 2009)

W1zzard said:


> so what do you propose as "average" ?
> 
> i am using a large number of measurements taken while 3dmark is running and then calculate the average.
> 
> ...



I actually don't have a benchmark for that. What about having the testbed idle or play a video for 5 minutes after the 3DMark run? 

IDK man, sorry if that offended you. I think it's good that you measure all that info and I find it useful, but I also think that even you (king of benchmarker kings, no sarcasm, I think you are the best, that's why I'm here at TPU) can be honest and consider the possibility (I'm not even talking about probability) that it might not reflect a true average. I have read many surveys saying that PCs are idling 90% of the time, and I'm sure as hell close to that. I played more in the past, but enough to make it closer to 75% of the time playing? 50%? I don't think so.

EDIT: And now that I have you here (sorry hehe), don't you find the Zotac GTX275 results intriguing?


----------



## btarunr (Apr 3, 2009)

DarkMatter said:


> And furthermore, I'll repeat this question, because I didn't like your comment TBH: if you don't want criticism of the reviews, why do you have a discussion thread for them?



I personally have no problem with critics. I have a problem with people selectively judging the reviews to suit their contentions.


----------



## suraswami (Apr 3, 2009)

as usual nice review there.  thanks w1z.

420w peak power consumption; there's no way I can justify that if I am building something like that.


----------



## W1zzard (Apr 3, 2009)

DarkMatter said:


> What about having the testbed idling or playing a video for 5 mins after the 3Dmark run?



that's what the idle result is for. playing a video stresses the cpu and the integrated video decoders in the gpu but not any shading units. maybe we could have a fourth result, "video playback", but i dont think it is that important. i'll look into it when i get our new power measurement equipment set up (measuring vga power only; already blew 600 € on equipment and i'm nowhere near what i am looking for)


----------



## DarkMatter (Apr 3, 2009)

btarunr said:


> I personally have no problem with critics. I have a problem with people selectively judging the reviews to suit their contentions.



And I'm doing that? I don't know how, TBH. I don't know what you think my purposes are. I have a problem with high power consumption, that's all. I think my comments about power in every review of a highly demanding (unjustifiable in my eyes) card prove so. It's been mostly ATI cards? Well, take a look at the reviews of the past 2 years and maybe, just maybe, you can find a pattern and understand why.


----------



## DarkMatter (Apr 3, 2009)

W1zzard said:


> that's what the idle result is for. playing a video stresses cpu and the integrated video decoders in the gpu but not any shading units, maybe we could have a fourth result "video playback" but i dont think it is that important. i'll look into it when i get our new power measurements stuff setup (measuring vga power only, already blew 600 € on equipment and i'm nowhere near what i am looking for)



No, no. After or before the benchmark, with the results included. Like 15 minutes running 3DMark (I don't know how long it actually takes) and 5 minutes "not doing anything". Also, I'm talking about GPU-accelerated video playback; CPU usage is usually below 5% in that case.


----------



## btarunr (Apr 3, 2009)

DarkMatter said:


> It's been mostly ATI cards? Well, take a look at the reviews of the past 2 years and maybe, just maybe, you can find a pattern and understand why.



Now it's not even about HD 4890 vs. GTX 275 

therefore...



DarkMatter said:


> And I'm doing that? I don't know how, TBH. I don't know what you think my purposes are. I have a problem with high power consumption, that's all. I think my comments about power in every review of a highly demanding (unjustifiable in my eyes) card prove so.



You seem to be _very_ concerned about idle power consumption. Here's my advice: ditch that 8800 GT for a HD 4830. 







Notice how the margin looks similar (relatively) to that between HD 4890 and GTX 275. 

Good day. Back to topic.


----------



## DarkMatter (Apr 3, 2009)

btarunr said:


> You seem to be _very_ concerned about idle power consumption. Here's my advice: ditch that 8800 GT for a HD 4830.
> 
> http://img.techpowerup.org/090403/bta657.jpg
> 
> ...



Are you kidding? I could understand (up to a point) that all you mods are angry with me now, but still... I have said, I think, 5 times up to this point that I care about power consumption because it means money. Buying another card will hardly save me money. If anyone wants to send me an HD4830 for free, I will gladly exchange it for my 8800GT. I'll make a thread if there are a lot of people wanting to make the deal. Until then, if someone is interested, PM me.



> Now it's not even about HD 4890 vs. GTX 275
> 
> therefore...



I have not talked about the GTX275, so yeah, it's not about the GTX275. It's not even about a single HD4890, which IMO has good power consumption and perf/watt. (I challenge you to find a comment in this thread where I said it didn't.)

All this is about 308w at idle, which is just an obscene number.


----------



## [I.R.A]_FBi (Apr 3, 2009)

btarunr said:


> Guys, each digg saves a kitten: http://digg.com/hardware/techPowerUp_AMD_Radeon_HD_4890_CrossFire_Review



dugged


----------



## Paintface (Apr 3, 2009)

I think I just saw a fanboy's world bubble burst... I can understand people having a hard time dealing with it...

That said, the 4890 is available on Newegg for $220!

I welcome everyone who joins our family in the ATI subforums!


----------



## Marineborn (Apr 3, 2009)

btarunr said:


> Now it's not even about HD 4890 vs. GTX 275
> 
> therefore...
> 
> ...



HAHAHAHA!! I agree, the 8800 likes the juice and the 4830 outperforms it. Don't be a hypocrite; if you start an argument you'd better have some good evidence backing you. Now can we stop this whole thread about stupid stuff and just enjoy a new card on the market?
thanks


----------



## W1zzard (Apr 3, 2009)

DarkMatter said:


> I care about power consumption because it means money.



do the math on how much 20W more power consumption at idle will cost you.

20W * number of hours pc on per day * 30 / 1000 * price per kwh you pay to your power provider = price you pay for 20W extra for a month

for me: 20W * 16 * 30 / 1000 * € 0.1433 = € 1.38
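for anyone who wants to plug in their own numbers, the same formula as a tiny script (the function name is just an example, the inputs are the ones above):

```python
# Monthly cost of extra power draw:
# watts * hours/day * 30 days / 1000 * price per kWh
def extra_cost_per_month(extra_watts, hours_per_day, price_per_kwh):
    kwh_per_month = extra_watts * hours_per_day * 30 / 1000
    return kwh_per_month * price_per_kwh

# Example from above: 20 W extra, 16 h/day, EUR 0.1433 per kWh
print(round(extra_cost_per_month(20, 16, 0.1433), 2))  # -> 1.38
```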


----------



## DarkMatter (Apr 3, 2009)

Marineborn said:


> HAHAHAHA!! I agree, the 8800 likes the juice and the 4830 outperforms it. Don't be a hypocrite; if you start an argument you'd better have some good evidence backing you. Now can we stop this whole thread about stupid stuff and just enjoy a new card on the market?
> thanks



It's you guys who are acting stupid and like assholes, and at least in the case of btarunr I know he will regret those comments in the future when he calms down. 

I know for sure btarunr knows, because I have told him in more than one post (if he hasn't forgotten), that I bought this 8800GT for 203 euros when 8800GTs were selling for 250 minimum and other 8800GT OC cards with similar clocks were near 300. That was 1.5 weeks after it was released, so you will remember those prices if you think back. The HD3870 was selling for 225 at the cheapest and 240 at the most expensive. 

Now we get back to power consumption (http://www.techpowerup.com/reviews/Sapphire/HD_3870/23.html) and we can see that the card consumes 15 watts more. 15, not 100. I think there is a difference, but you know, they didn't teach me a lot of maths in the fanboy academy... (sarcasm, if you didn't catch it). So if a 100w difference would cost me 100 euros per year in electricity, guess what? 15w amounts to 15 euros, and the 8800GT is significantly faster than the HD3870, so end of story.


----------



## [I.R.A]_FBi (Apr 3, 2009)

and what can that buy wh1zz?


----------



## Marineborn (Apr 3, 2009)

DarkMatter said:


> It's you guys who are acting stupid and like assholes, and at least in the case of btarunr I know he will regret those comments in the future when he calms down.
> 
> I know for sure btarunr knows, because I have told him in more than one post (if he hasn't forgotten), that I bought this 8800GT for 203 euros when 8800GTs were selling for 250 minimum and other 8800GT OC cards with similar clocks were near 300. That was 1.5 weeks after it was released, so you will remember those prices if you think back. The HD3870 was selling for 225 at the cheapest and 240 at the most expensive.
> 
> Now we get back to power consumption (http://www.techpowerup.com/reviews/Sapphire/HD_3870/23.html) and we can see that the card consumes 15 watts more. 15, not 100. I think there is a difference, but you know, they didn't teach me a lot of maths in the fanboy academy... (sarcasm if you didn't catch it). So if a 100w difference would cost me 100 euros per year in electricity, guess what? 15w amounts to 15 euros, and the 8800GT is significantly faster than the HD3870, so end of story.



NO, we don't get back to power consumption; this thread is about benchmarking, not power consumption! The CF 4890s pretty much out-benchmark any dual GPU out there; that's what this is about. I'm gonna drop it right now, and I would hope you would do the same!


----------



## jaydeejohn (Apr 3, 2009)

I don't post here as often as I'd like to, but listening to this power consumption thing is a turn-off, and these cards definitely are not. I'm thankful this article wasn't just about power consumption and showed the cards' capabilities, as this is seen as not only one of the highest-performing solutions out there, but one of the best in price/perf as well. Good job.


----------



## HammerON (Apr 3, 2009)

Okay, I think we all understand that when DarkMatter goes to buy a GPU he/she will look at power consumption. That is well understood. Now let it go and realize there are others who are just worried about achieving the best gameplay and the highest benchmarks, and do not take into consideration how much juice they are sucking up. I sure don't, but that is my personal preference.


----------



## DarkMatter (Apr 3, 2009)

W1zzard said:


> do the math how much 20W more power consumption in idle will cost you.
> 
> 20W * number of hours pc on per day * 30 / 1000 * price per kwh you pay to your power provider = price you pay for 20W extra for a month
> 
> for me: 20W * 16 * 30 / 1000 * € 0.1433 = € 1.38



It turns out to be a little bit more in my case. Let's say 1.5. 

1.5 * 12 = 18 a year.

203 + 18 + 9 = 230. So more or less what I would have paid for an HD3870 back in the day I bought this card.

Now with the HD4890 CF versus HD4870 X2 (to leave Nvidia out of the question so that ATI fans don't get offended...):

308 - 230 = 78

78w * 16 * 30/1000 * 0.1433 = € 5.36 per month, or € 64.38 a year. Enough to make a difference IMO. The price changes from 400 vs 500 to 400 vs 564 if you keep the cards for a year. Is that worth the perf difference?
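Spelled out as a quick script (same monthly formula as earlier in the thread; the function name is just mine, and the 78w / 16h / € 0.1433 inputs are the numbers from this post):

```python
# Yearly cost of extra idle power draw, then the "effective" price of the
# HD 4890 CF pair after one year versus the HD 4870 X2.
def yearly_power_cost(extra_watts, hours_per_day=16, price_per_kwh=0.1433):
    # monthly kWh * price, times 12 months (30-day months, as in the thread)
    return extra_watts * hours_per_day * 30 / 1000 * price_per_kwh * 12

extra = 308 - 230              # idle draw difference in watts (CF vs X2)
cost = yearly_power_cost(extra)
print(round(cost, 2))          # -> 64.38
print(500 + round(cost))       # effective CF price vs 400 for the X2 -> 564
```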



HammerON said:


> Okay, I think we all understand that when DarkMatter goes to buy a GPU he/she will look at power consumption. That is well understood. Now let it go and realize there are others that are just worried about achieving the best game play, highest benchmark ~ and do not take into consideration how much juice they are sucking-up. I sure don't, but that is my personal preference.



You care about money? If not, then it doesn't matter. Of course you can buy and spend whatever you want, and I truly don't have anything against that. I mentioned perf/price in my very first post and I have said so like 10 times already in this thread. If money doesn't concern you, OK, but don't say multi-GPU is a cheaper alternative to higher-end cards.


----------



## jaydeejohn (Apr 3, 2009)

It all depends on your choices and what drives them. Again, I'm thankful this review wasn't only about power consumption, as that'd leave a very small number of viewers who consider such things. IMHO, while power is important, it's down the list compared to performance; or is this not an enthusiast site?


----------



## jaydeejohn (Apr 3, 2009)

I'll ask this. If power consumption is that important, why hasn't anyone anywhere done a review that ranks cards by performance per watt alone? And if they did, would the average enthusiast even be willing to get such a card? It may well end up being very low end; would that make people happy then? I'm trying to follow this to its end here. 
Or do we draw a line and say, this is where I sit on power vs. perf? And how many lines are there? Especially at the very top of the performance end? Does it even mean anything at this level? Just some things to be considered overall.


----------



## W1zzard (Apr 3, 2009)

jaydeejohn said:


> I'll ask this. If power consumption is that important, why hasn't anyone anywhere done a review that ranks cards by performance per watt alone?



did you see our performance per watt graphs?


----------



## DarkMatter (Apr 3, 2009)

jaydeejohn said:


> I'll ask this. If power consumption is that important, why hasn't anyone anywhere done a review that ranks cards by performance per watt alone? And if they did, would the average enthusiast even be willing to get such a card? It may well end up being very low end; would that make people happy then? I'm trying to follow this to its end here.
> Or do we draw a line and say, this is where I sit on power vs. perf? And how many lines are there? Especially at the very top of the performance end? Does it even mean anything at this level? Just some things to be considered overall.



I don't know why they don't do it, but I think they should*. I think it's very clear from the above calculations that even 20w can make a difference in the money spent on a card over its lifespan. 20 euros/dollars a year is significant. So in order to get the best bang for the buck, you can pay $20 more for a card if it consumes 20w less. Of course it depends on how much you use the card and how much you pay, but take into account that the average price per kilowatt-hour is around 10 US cents worldwide and 15 cents in Europe. I saw some stats a few months ago.

http://michaelbluejay.com/electricity/cost.html - not what I saw, but enough to get an idea...
EU: http://www.energy.eu/#domestic - Look at the highest one, Denmark at € 0.30. With the above calculations that comes to € 1.7/year for every extra watt; for 78 watts, 132 euros.

*I actually know why they don't do it; "I don't know why" was just an expression. It's impossible to make such a chart because electricity prices are different everywhere.
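As a rough sketch of how the local kWh price scales the per-watt figure (16 h/day as in the earlier calculations; the prices are just the examples quoted above, and 12 × 30-day months are used, so the figures round slightly differently):

```python
# Yearly cost of ONE extra watt of draw, for a few example electricity prices.
def per_watt_year(price_per_kwh, hours_per_day=16):
    # 12 months of 30 days each, matching the monthly formula used earlier
    return hours_per_day * 30 * 12 / 1000 * price_per_kwh

for label, price in [("~world avg", 0.10), ("~EU avg", 0.15), ("Denmark", 0.30)]:
    print(label, round(per_watt_year(price), 2))
```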


----------



## jaydeejohn (Apr 3, 2009)

Yes. What I'm saying is, no one's done an overall power-vs-performance review just for its own sake. I'm not advocating one either; I think it's lower than price/perf in the overall view of things, like most here do. 
I'm just trying to point out that even though there's concern over power usage, we all have lines we draw. They change not only with where you live, as power prices can be a concern or not, but with having the money to purchase the card of your choice. Going high end puts power consumption way down the list of purchase decisions, and it's less of a deal breaker at this performance level.
I really don't think an article on GPU power consumption regarding purchasing decisions would help. It's not like CPUs, where things like servers and farms need lower power.


----------



## jaydeejohn (Apr 3, 2009)

Anyways, thanks W1zzard. You managed to scoop the rest by doing this review. Very nice indeed.


----------



## aetneerg (Apr 3, 2009)

I have a question: in Far Cry 2 the benchmark is 1920x1200 4xAA and a Radeon 4870 512MB gets 20.1 FPS. In Hexus.net's review, a Sapphire 4870 1GB gets 50.1 FPS. Shouldn't a 4870 512MB still get at least remotely close to what a 4870 1GB would get? The only difference between them is the memory size available to handle the different resolutions, but I still find it strange.

A single 4890 1GB gets 49.6 FPS at 1920x1200 4xAA in Far Cry 2. So if a 4870 1GB gets 50.1 FPS (according to Hexus) and a 4890 1GB gets 49.6, why does a 4870 512MB get 20.1? Isn't that a bit low? I am thinking it's because of the resolution and memory? I just find it kind of strange. I am looking at this from a different perspective to see if it's worth upgrading, and how memory and scaling work going from 4870 512MB to 4870 1GB to 4890 1GB to running in crossfire modes.


----------



## Paintface (Apr 3, 2009)

aetneerg said:


> I have a question, in Farcry 2 the benchmark is 1920x1200 4xAA and a Radeon 4870 512MB gets 20.1 FPS. In Hexus.net review, Sapphire 4870 1GB gets 50.1 fps, shouldn't a 4870 512MB still get at least remotely close to what a 4870 1GB would get? The only difference between them are the memory sizes to handle the different resolutions but I still find it strange.
> 
> A single 4890 1GB gets 49.6 FPS at 1920x1200 4xAA in FarCry 2. So if a 4870 1GB gets 50.1fps (according to Hexus) and a 4890 1GB gets 49.6, why does a 4870 512MB get 20.1? Isn't that a bit low? I am thinking it's because of the resolution and memory?? I just find it kind of strange. I am just looking at this from a different perspective to see if it's worth upgrading and how memory and scaling works from 4870 512MB to 4870 1GB to 4890 1GB to running in crossfire modes.



When building computers I always go for 1GB of video RAM when the card will be used on a 1920x1200 LCD; at that rez with AA/AF it can, depending on the game, make a big difference, while the price difference is really small.

I wouldn't buy 512MB cards anymore unless you're going for cards that perform below a 4830/4850.


----------



## DarkMatter (Apr 3, 2009)

FC2 is very texture heavy and uses loads of memory. Even the GTS250 1GB outperforms the HD4870 512 at high resolutions. As you can see, all the 512MB cards take a hit. Surprisingly, the 9800 GTX and GT do better than the HD4870, but who cares; none of them is able to provide playable framerates.


----------



## eidairaman1 (Apr 4, 2009)

Paintface said:


> When building computers i always go for 1GB videoram when its used on a 1920*1200 resolution LCD, at that rez with AA/AF it can depending on game make a big difference, while the price difference is really small.
> 
> I wouldnt buy 512mb cards anymore unless you go for cards that perform less than a 4830/4850.



i would assume you run a 24" Plus Monitor


----------



## GandalfNYC (Apr 4, 2009)

erocker said:


> The white wizzard of NYC approaches!  Welcome to TPU!  I digg as much as possible and generally am the first to digg reviews here.  Not only does it save kittens but it keeps sheep fed too!



Thanks, I was considering turning *btarunr* into something unnatural for being so rude.
Has the courtesy of your hall somewhat lessened of late, Erocker son of Erouckely?


----------



## btarunr (Apr 4, 2009)

I know what NYC means, unfortunately, not GandalfNYC, my bad. Your welcome message was sent the moment you activated your account.


----------



## GandalfNYC (Apr 4, 2009)

btarunr said:


> I know what NYC means, unfortunately, not GandalfNYC, my bad. Your welcome message was sent the moment you activated your account.



Finally I am acknowledged, better late than never I suppose?  
This only makes you look even worse, since you saw my simple
request the moment I typed it but did nothing.  I am disappointed.  

This, after I did as you asked IMMEDIATELY, _without knowing what_ *btarunr* _means-_
and I even encouraged many others to boot!  

What say you?


----------



## Wile E (Apr 4, 2009)

DarkMatter said:


> I don't know why they don't do it, but I think they should*. I think it's very clear from the above calculations that even 20w can make a difference in the money spent in a card on it's lifespan. 20 euros/$ a year is significative. So in order to get the best bang for buck, you can pay $20 more for a card if it consumes 20w less. Of course depends on how much you use the card and how much you pay, but take into account that the average price per kilowatt for the world is $10 cents, $15 cents in Europe. I saw some stats a few months ago.
> 
> http://michaelbluejay.com/electricity/cost.html - not what I saw, but as to make an idea...
> EU http://www.energy.eu/#domestic - Look at the highest one. Denmark with € 0.3. With the above calculations that goes for €1.7/year for every extra watt. 78 watts 132 euros.
> ...


Yeah, you chose between a 3870 and an 8800GT, hardly considered high-end performance even when they were released. They have always been mid-range cards. I can fully understand power consumption being a major concern at that level, as we are already talking about a more modest computer build.

However, modesty goes completely out the window when you are building a rig with the graphical power that was shown in this review. When building a machine of this caliber, most only take performance, entry cost, and expandability into account. Power consumption is very low on the list of concerns, usually only thought about enough to figure out the best PSU for the system, not how much it will cost to run per month. This is the point I think you are failing to realize here. Most people building these kinds of machines (at least the ones that I know) do not worry about power consumption. That's not the point of the machine.

I, for example, have a lower powered rig for casual usage. The rig in my specs is just a toy, and not always used, unless I have some encoding, benching or gaming I want to do. I usually surf on my Core2 laptop, or my Core2 iMac, both with all the power saving features enabled.


----------



## GandalfNYC (Apr 4, 2009)

Wile E said:


> Yeah, you chose between a 3870 and an 8800GT, hardly considered high-end performance, even when they released. They have always been mid-range cards.



Actually, the 8800GT was indeed considered a high end performance card when it was released, second only to the 8800GTX among Nvidia's offerings at the time.
This is why the 8800GT commanded a price of $300 or more!  Albeit, Nvidia delved too deeply and greedily into the pockets of consumers, much like the dwarves into the mountain - but that is the nature of both.

Also, it is in a different class than the 3870.  These two cards should _hardly_ be lumped together like that.
The AMD 4830 would currently be a better choice to compare to the 8800GT.  Both offer excellent price/performance and low cost crossfire and SLI, respectively.

Very nice system, by the way... just curious, how would 3 4870's (4870x2 + 4870) compare to 2 or 3 gtx 285's?  If you know of a good comparison chart please post a link!


----------



## Wile E (Apr 4, 2009)

GandalfNYC said:


> Actually, the 8800GT was indeed considered a high end performance card when it was released, second only to the 8800GTX among Nvidia's offerings at the time.
> This is why the 8800GT commanded a price of $300 or more!  Albeit, Nvidia delved too deeply and greedily into the pockets of consumers, much like the dwarves into the mountain - but that is the nature of both.
> 
> Also, it is in a different class than the 3870.  These two cards should _hardly_ be lumped together like that.
> ...



He referred to buying his 8800 during the time that both it and the 3870 released. That's what I was referring to.

And I think 2 GTX 285s and 4870 tri-fire should be roughly equal, with SLI being slightly ahead in most cases. Tri-SLI would stomp it, lol. If my board did SLI, I'd probably have 2 GTXs myself.


----------



## GandalfNYC (Apr 4, 2009)

Wile E said:


> And I think 2 GTX's and 4870 trifire should be roughly equal, with SLI being slightly ahead in most cases. Tri SLI would stomp it. lol. If my board did SLI, I'd actually probably have 2 GTX's, personally.



When I need more power, that's exactly what I intend to do... I'll just buy the motherboard that is best for dual or tri-SLI'd GTX 285s.  
Depending on when the need arises, it might even be another socket 775 motherboard.
My warranty-replaced abit IP35-E can't seem to run my E8400 past 3.6 GHz out of the box, and I do not want to bother with updating the BIOS for now.
Perhaps I will eventually come across a "friendly helpful supernerd" in the NYC area who will assist me...

As far as 2 or 3 SLI'd GTX 285's vs 2 or 3 crossfire'd 4870 GPU's performance-wise - I would think it depends on which game you want to play, to some extent.


----------



## jaydeejohn (Apr 4, 2009)

I found this interesting http://translate.google.com/translate?sourceid=navclient&hl=en&u=http://pclab.pl/art36063-17.html
It appears the high power usage at idle can mostly be blamed on the GDDR5 rather than ATI's core. So, unless this changes, Nvidia will also suffer high idle power usage with their next arch.


----------



## DarkMatter (Apr 4, 2009)

jaydeejohn said:


> I found this interesting http://translate.google.com/translate?sourceid=navclient&hl=en&u=http://pclab.pl/art36063-17.html
> It appears the high power usage at idle can mostly be blamed on the GDDR5 and not ATIs core. So, unless this changes, nVidia also will suffer high idle power usage with their next arch



GDDR5 was supposed to consume less at the same clocks, and currently GDDR3 speeds are higher than GDDR5 speeds: GDDR5 is in the 800-1000 range (not OC) and GDDR3 in the 900-1100 range (again not OC). At least that's how it was marketed, and I think it was true.


----------



## r1rhyder (Apr 4, 2009)

Someone needs a 30" monitor.


----------



## Josh81 (Apr 4, 2009)

I like the performance per dollar chart, a lot.


----------



## DarkMatter (Apr 4, 2009)

Wile E said:


> Yeah, you chose between a 3870 and an 8800GT, hardly considered high-end performance, even when they released. They have always been mid-range cards. I can fully understand power consumption being a major concern at that level, as we are already talking about a more modest computer build.



I didn't say that was the only reason, but if I hadn't had the opportunity to get the 8800GT that cheap, I don't know what I would have done in the end. But that is a completely different thing: we are talking about a 15W difference on a card that is 20% faster (actually 30% at their max OC), which is far from the 95W and 9% difference in the case of GTX295 vs 4890 CF (the two fastest things in Wizzard's charts, and they cost the same BTW). OC ability doesn't matter, as the GTX295 OCs wonderfully. Watercooling or aftermarket cooling? If you are going to spend more, you might as well spend a little more and get GTX285 SLI.



> However, modesty goes completely out the window when you are building a rig with the graphical power that was shown in this review. When building a machine of this caliber, most only take performance, entry cost, and expandability into account. Power consumption is very low on the list of concerns. Usually only thought about enough to figure out the best psu for the system, not how much it will cost to run per month. This is the point I think you are failing to realize here. Most people building these kind of machines (at least the ones that I know) do not worry about power consumption. That's not the point of the machine.
> 
> I, for example, have a lower powered rig for casual usage. The rig in my specs is just a toy, and not always used, unless I have some encoding, benching or gaming I want to do. I usually surf on my Core2 laptop, or my Core2 iMac, both with all the power saving features enabled.



Buying a second rig won't save any money; you pay for the extra hardware.

Your point doesn't hold anyway. If money didn't matter, those people wouldn't be considering the HD4890; they would simply go for Quad CF with the X2, Quad GTX295 or Tri SLI with GTX285.

I don't care too much about what people do, because I'm just trying to help those unaware (in the sense that they never thought about this) people, and indeed to teach them something. They should learn to manage their money and think about these things. You say only the money at hand (entry cost) at the time of purchase matters; well, with a less power-hungry setup you'll have that money much earlier. It's inconsistent to say you can wait until you have $500 but you can't wait until you have $600, when at the same time you are spending much less in that timeframe. Look at the extreme example I gave above: in Denmark (Italy, the Netherlands), after 1 year of use the HD4890 CF can cost 132 € more than the HD4870 X2, 161 € more than the GTX295 and 107 € more than GTX285 SLI. (Tri SLI'ing the GTX285 will consume a bit less than HD4890 CF at idle!!) (Notice how the numbers for the single GTX285 are the same as in Wizzard's review.) For average European prices, slash those numbers in half; they are still significant enough to make GTX275/GTX285 SLI more appealing.

Again, using a different PC for web browsing, videos and all that won't save any money because the PC itself costs money. You only really save money if you buy something instead of another thing, not if you buy something on top of the other thing. I doubt any enthusiast will have a PC older than 2-3 years anyway, even if it is for watching videos.

The bottom line is that you can't talk about money arbitrarily. Either you care or you don't. And with the current situation in the GPU arena, if you don't care you'd go GTX285 SLI (at least), because it's simply faster; and if you do care, you should go for a setup that won't cost you more over time, at least if you are conscious of how much it will cost you. Now that I have demonstrated how much it can cost, I hope that people in the EU take it into consideration, as it would be the smarter choice, and also people in California or New York, where energy is expensive, for example.

Examples:

- Right now, if you want to break records you won't get HD4890 CF, you'll get GTX285 Tri or Quad CF/SLI or GTX285 SLI at least.

- If you just want one of the fastest setups, you can buy HD4890 CF *or* buy the 9% slower GTX295 for the same price; then one year later, instead of buying another $500 card(s), buy a $600 card because you saved $100 more in bills, or instead of waiting one year, buy in just 9 months, because you already saved the $500 in that time.

That's what a smart money conscious person would do. Again, if you have money to burn this doesn't apply to you, do with it whatever you want and know that I would do the same and that I'm jealous.

EDIT: Oh! And BTW, it just takes Ati fixing the problem with power management not working in CrossFire to make all my points (the practical ones, not the theoretical: the examples, the cards) null and me happy, because something was done well. Even if Ati fans (a lot here) are unable to see it, I have no problem with Ati at all, nor do I have a problem with the HD4890, except for the fact that the GTX275 is a bit better. The only problem here is 308W at idle; fix that, end of problem.


----------



## Wile E (Apr 5, 2009)

DarkMatter said:


> I didn't say that was the only reason, but if I hadn't had the opportunity to get the 8800GT that cheap, I don't know what I would have done in the end. But that is a completely different thing: we are talking about a 15W difference on a card that is 20% faster (actually 30% at their max OC), which is far from the 95W and 9% difference in the case of GTX295 vs 4890 CF (the two fastest things in Wizzard's charts, and they cost the same BTW). OC ability doesn't matter, as the GTX295 OCs wonderfully. Watercooling or aftermarket cooling? If you are going to spend more, you might as well spend a little more and get GTX285 SLI.
> 
> 
> 
> ...


I understand the point you are making, and have understood it all along. I'm just telling you that the people I know who are looking to spend this kind of money on a graphics card setup do not care that much about power consumption, so it doesn't really play into their decision. It's at the bottom of the list in importance.

And I'm not just defending ATI, nor am I an ATI fanboy. If my board did SLI, I'd have two 280's or 285's. I'm just pointing out that your observations on power consumption are most likely going to fall on deaf ears at this price and performance level.


----------



## hybrid1989 (Apr 6, 2009)

Hey guys, just joined the forum as I'm planning on making a build log for my new i7 build next month.

Just want to say, I've read through this whole thread, and spending 20 dollars per year on 600 dollars' worth of graphics cards is a cost easily incurred...

Spending 3.5% of the price of your cards per year is totally insignificant. If 20 dollars is such a major concern, I would highly recommend a) leaving this page, or b) closing your account, because this is a performance PC enthusiast website.

That's like saying you shouldn't buy a Ferrari because it doesn't get the same mileage as a Kia Rondo.



The 4890... looks excellent, but I'll probably wait for the RV870. I know I'd regret buying it when the 870 comes out with 1600 SPUs.


----------



## DarkMatter (Apr 7, 2009)

hybrid1989 said:


> Hey guys, just joined the forum as I'm planning on making a build log for my new i7 build next month.
> 
> Just want to say, I've read through this whole forum and spending 20 dollars per year on 600 dollars worth of graphics cards is a cost easily incurred...
> 
> ...



You clearly didn't read the thread. It's not $20, unless you only use your computer 2 hours a day, in which case I find it hard to believe you are an enthusiast. We're talking about anything from $50 (in very few places) to $200 per year.

Also, which card is the Kia? The GTX295, the HD4870 X2 or GTX285/275 SLI? Man, I didn't know the small Kia cars were almost as fast as Ferraris, or even faster. I'll consider Kia for my next car...


----------



## hybrid1989 (Apr 7, 2009)

W1zzard said:


> do the math how much 20W more power consumption in idle will cost you.
> 
> 20W * number of hours pc on per day * 30 / 1000 * price per kwh you pay to your power provider = price you pay for 20W extra for a month
> 
> for me: 20W * 16 * 30 / 1000 * € 0.1433 = € 1.38




This is what I was referring to.

Also, I really wish Kias were that fast... although with some modding, who knows.
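For anyone who wants to plug in their own numbers, W1zzard's formula from the quote above is easy to sanity-check with a few lines of Python. This is just a sketch; the function name and the annual extrapolation are mine, while the example inputs (20W extra, 16 h/day, € 0.1433 per kWh) are the ones from his post:

```python
def extra_cost_per_month(extra_watts, hours_per_day, price_per_kwh):
    """W1zzard's formula: watts * hours/day * 30 days,
    divided by 1000 to convert Wh to kWh, times the price per kWh."""
    return extra_watts * hours_per_day * 30 / 1000 * price_per_kwh

# His numbers: 20 W extra draw, PC on 16 h/day, EUR 0.1433 per kWh
monthly = extra_cost_per_month(20, 16, 0.1433)
print(round(monthly, 2))       # 1.38 EUR per month, matching his result
print(round(monthly * 12, 2))  # 16.51 EUR per year
```

Scale the wattage delta up to the ~100W idle gap being argued about in this thread and the yearly figure grows proportionally, which is where the disagreement over "insignificant" comes from.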


----------



## Culex (Apr 7, 2009)

God dammit, these benchmarks are full of it. They have no consistency with any other benchmarks I've seen on paper, or with gameplay I've experienced. Just look at the Left 4 Dead benchmark. Broken is an understatement. Yeah, a GTS250 would really beat a 4890 and a 285. I've played Left 4 Dead on maximum settings on my 4870X2 (16xAA & 16xAF @ 1920x1200), and it runs consistently well over 160fps, and rightfully so (since the Source engine does favor ATI cards). On my previous card, a 512MB 4870, Call of Duty 4 ran at an average of 95 to 125 fps (mostly maxing out @ 125fps) in multiplayer, even higher in single player (4xAA, 16xAF, 1920x1200). Sorry W1zzard, but either your hardware is completely screwed beyond anything I've seen before, or you're biased to the extreme.


----------



## ChaoticAtmosphere (Apr 7, 2009)

Power consumption doesn't cost me anything; it's included in my rent. I even run two air conditioners in the summertime, again included in the rent.

Wattage is the bomb when it's included in your rent... I recommend renting!!!


----------



## Urlyin (Apr 7, 2009)

DarkMatter said:


> TBH I don't understand why you have the "discuss this article in our forums" link if discussion about such a critical thing as power consumption is considered so bad as to call it spam that should be in the nonsense thread. In fact, this is not the first time a mod has suggested any discussion should be kept out of this kind of thread. I'd suggest you save time (both yours and ours) and put a big "Thank Wizzard" button instead.






			
darkmatter said:

> Wohooo here comes the poor hurt fanboi. You feel bad? Poor boy...



That my good man is trolling and has nothing to do with the discussion.. next time I'll just hand out infractions...


----------



## [I.R.A]_FBi (Apr 7, 2009)

DarkMatter said:


> You clearly didn't read the thread. It's not $20, unless you only use your computer 2 hours a day, in which case I find it hard to believe you are an enthusiast. We're talking about anything from $50 (in very few places) to $200 per year.
>
> Also, which card is the Kia? The GTX295, the HD4870 X2 or GTX285/275 SLI? Man, I didn't know the small Kia cars were almost as fast as Ferraris, or even faster. I'll consider Kia for my next car...



Do you use all fluorescent bulbs at home?


----------



## ChaoticAtmosphere (Apr 7, 2009)

[I.R.A]_FBi said:


> Do you use all fluorescent bulbs at home?


----------



## W1zzard (Apr 7, 2009)

Culex said:


> God dammit, these benchmarks are full of it. They have no consistency with any other benchmarks I've seen on paper, or with gameplay I've experienced. Just look at the Left 4 Dead benchmark. Broken is an understatement. Yeah, a GTS250 would really beat a 4890 and a 285. I've played Left 4 Dead on maximum settings on my 4870X2 (16xAA & 16xAF @ 1920x1200), and it runs consistently well over 160fps, and rightfully so (since the Source engine does favor ATI cards). On my previous card, a 512MB 4870, Call of Duty 4 ran at an average of 95 to 125 fps (mostly maxing out @ 125fps) in multiplayer, even higher in single player (4xAA, 16xAF, 1920x1200). Sorry W1zzard, but either your hardware is completely screwed beyond anything I've seen before, or you're biased to the extreme.



we're using our own timedemos, if you are interested i can mail them to you or you can get them via the tpubench plugins. l4d clearly has a bug where every gpu it doesn't "know" gets a performance boost


----------



## HammerON (Apr 7, 2009)

ChaoticAtmosphere said:


> Power consumption doesn't cost me anything; it's included in my rent. I even run two air conditioners in the summertime, again included in the rent.
> 
> Wattage is the bomb when it's included in your rent...I recommend renting!!!


Now that is what my apartment living butt is all about. Suck it up 
Although I am buying a house soon


----------



## DarkMatter (Apr 7, 2009)

[I.R.A]_FBi said:


> Do you use all florescent bulbs at home?



Nice try at a joke. *Big FAIL!* The short answer is: *yes!* The long one: fluorescent ones, energy-saving ones, LED ones, adjustable switches... I would suggest you think before you try to make fun of someone next time. Try harder, seriously, you can.



Urlyin said:


> That my good man is trolling and has nothing to do with the discussion.. next time I'll just hand out infractions...



Trolling? Ha! Maybe if mods responded the way they should instead of attacking members, people would start respecting the rules better. If talking about power consumption is trolling and off topic, please don't include power consumption tests. Sorry Wizzard, but you should keep your dogs on a tighter leash. Now you can ban me forever; I don't care anymore about a site with such moderators. Goodbye to all the good people at TPU, I enjoyed talking with all of you.


----------



## Urlyin (Apr 7, 2009)

knock it off and get back on topic


----------



## hybrid1989 (Apr 7, 2009)

Buddy of mine just got a 4890 this morning. Sick card, but it is LOUD. Somewhere around 50 dBA at load. Not a big deal if you plan to liquid cool, but that's a noise most people will find aggravating for sure. Any ideas on when the 58xx cards will be out?


----------



## W1zzard (Apr 7, 2009)

hybrid1989 said:


> Buddy of mine just got a 4890 this morning. Sick card, but it is LOUD. Somewhere around 50 dBA at load. Not a big deal if you plan to liquid cool, but that's a noise most people will find aggravating for sure. Any ideas on when the 58xx cards will be out?



soon(tm)


----------



## neon neophyte (Apr 8, 2009)

I'm planning on getting two 4890s when Sapphire releases the Vapor-X cooler on the 4890, hopefully with some voltage tweaks, so I can tune over 1000MHz.


----------



## Culex (Apr 8, 2009)

W1zzard said:


> we're using our own timedemos, if you are interested i can mail them to you or you can get them via the tpubench plugins. l4d clearly has a bug where every gpu it doesn't "know" get a performance boost



Righto, but in relation to competing current-gen GPUs (i.e. 4870X2 and GTX295), if one is recognized, they both should be. That's why I think there's something seriously wrong with your Left 4 Dead benchmark and CoD4 test. The X2 in reality outperforms the 295 in that game, as it does in CoD4. The biggest shock, however, is the 4870 getting only 75 fps in CoD4. But anyway, you probably shouldn't use those timedemos or benchmark apps, since they are obviously flawed. I'd recommend running the actual games performing specific tasks (i.e. in CoD4, running through smoke, averaging a run through a heavily populated map, etc., and recording the results). HardOCP does this, which is why it appears to give far more believable results. Anyway, no hard feelings. Just use a different method in future, and you'll find you get far more legitimate readings. For the time being, however, I'll check out the site you recommended. Good luck with your future tests.


----------



## eidairaman1 (Apr 8, 2009)

neon neophyte said:


> I'm planning on getting two 4890s when Sapphire releases the Vapor-X cooler on the 4890, hopefully with some voltage tweaks, so I can tune over 1000MHz.



Yeah, the one that's on the 4870 entices me to pick one up, but I think the 4890 may be my choice when the 2GB edition hits shelves.


----------



## Culex (Apr 8, 2009)

2GB of RAM on a single 256-bit bus? That's stretching it a bit, methinks. Also, it probably wouldn't be used.


----------



## eidairaman1 (Apr 8, 2009)

It's funny, people said that about the X1950's 512MB of RAM not being utilized, and now it is. Going from a 9700 Pro to an X1950 Pro, the RAM quadrupled. Now I'm going from a 512MB to a 2GB card; the RAM has quadrupled again.


----------



## W1zzard (Apr 8, 2009)

i recorded both timedemos myself during gameplay. the manual playback + fraps method introduces a lot of variation, and don't get me started on the way hardocp presents their benchmarks.

i hope you're not looking at 1024x768 which is quite cpu limited in most games


----------



## eidairaman1 (Apr 8, 2009)

I try to run my games at native resolution if possible; otherwise I go for resolutions that match my aspect ratio. My monitor is 5:4 (1280x1024) and there aren't many aspects like that. I will upgrade to a 24" monitor to support 1920x1200.


----------



## ShogoXT (Apr 8, 2009)

Looks nice, but I think I can hold out for the next gen. Thanks for the review. 
I'm still trying to figure out what's causing my computer to lose power... Maybe the video cards?


----------



## eidairaman1 (Apr 8, 2009)

Drop back your overclock.


----------



## ShogoXT (Apr 10, 2009)

Hey W1zzard do you have that Core i7 hyperthreading on or off? 
Also when can we donate for that 30inch display?


----------



## W1zzard (Apr 10, 2009)

ShogoXT said:


> Hey W1zzard do you have that Core i7 hyperthreading on or off?
> Also when can we donate for that 30inch display?



ht is on iirc. you can donate any time, donate button at the bottom of the contact page


----------



## SonDa5 (Apr 11, 2009)

Great review.


----------



## Adonay (Apr 12, 2009)

Awesome review. Makes me wonder if I should get rid of one or both of my X2s.


----------



## Frizz (Apr 24, 2009)

lol, refunding my GTX 295 was very heartbreaking, especially when I heard the 4890's fans. 

Reading this thread gives me hope for the future when I go CrossFire.


----------



## AllHopeIsGone1 (May 8, 2009)

I got two ASUS 4890s in the post yesterday.


----------

