
AMD Radeon HD 4890 CrossFire

Now it's not even about HD 4890 vs. GTX 275 :(

therefore...



You seem to be very concerned about idle power consumption. Here's my advice: ditch that 8800 GT for an HD 4830.

http://img.techpowerup.org/090403/bta657.jpg

Notice how the margin looks similar (relatively) to that between HD 4890 and GTX 275.

Good day. Back to topic.

HAHAHAHAH!! I agree the 8800 likes the juice, and the 4830 outperforms it. Don't be a hypocrite; if you start an argument you'd better have some good evidence backing you. Now can we stop this whole thread about stupid stuff and just enjoy a new card on the market?
thanks
 
I care about power consumption because it means money.

Do the math on how much 20 W of extra idle power consumption will cost you.

20 W * hours the PC is on per day * 30 / 1000 * price per kWh you pay your power provider = price you pay for the extra 20 W per month

For me: 20 W * 16 * 30 / 1000 * €0.1433 = €1.38
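If you want to plug in your own numbers, here's a quick Python sketch of that formula (purely illustrative, not from the review):

# Monthly cost of extra power draw, per the formula above:
# watts * hours per day * 30 days / 1000 * price per kWh
def extra_cost_per_month(extra_watts, hours_per_day, price_per_kwh):
    return extra_watts * hours_per_day * 30 / 1000 * price_per_kwh

# The example above: 20 W extra, 16 h/day, EUR 0.1433 per kWh
print(round(extra_cost_per_month(20, 16, 0.1433), 2))  # -> 1.38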
 
HAHAHAHAH!! I agree the 8800 likes the juice, and the 4830 outperforms it. Don't be a hypocrite; if you start an argument you'd better have some good evidence backing you. Now can we stop this whole thread about stupid stuff and just enjoy a new card on the market?
thanks

It's you guys who are acting stupid and like assholes, and at least in the case of btarunr I know he will regret those comments in the future when he calms down.

I know for sure btarunr knows, because I have said it to him in more than one post (if he hasn't forgotten), that I bought this 8800 GT for 203 euros at a time when 8800 GTs were selling for 250 minimum and other 8800 GT OC cards with similar clocks were near 300. That was 1.5 weeks after it was released, so you will remember those prices if you think back. The HD 3870 was selling for 225 at the cheapest and 240 at the most expensive.

Now we get back to power consumption (http://www.techpowerup.com/reviews/Sapphire/HD_3870/23.html) and we can see that the card consumes 15 watts more. 15, not 100. I think there is a difference, but you know, they didn't teach me a lot of maths in the fanboy academy... (sarcasm, if you didn't catch it). So if a 100 W difference would mean 100 euros per year in electricity in my case, guess what? 15 W means 15 euros, and the 8800 GT is significantly faster than the HD 3870, so end of story.
 
And what can that €1.38 buy, wh1zz?
 
It's you guys who are acting stupid and like assholes, and at least in the case of btarunr I know he will regret those comments in the future when he calms down.

I know for sure btarunr knows, because I have said it to him in more than one post (if he hasn't forgotten), that I bought this 8800 GT for 203 euros at a time when 8800 GTs were selling for 250 minimum and other 8800 GT OC cards with similar clocks were near 300. That was 1.5 weeks after it was released, so you will remember those prices if you think back. The HD 3870 was selling for 225 at the cheapest and 240 at the most expensive.

Now we get back to power consumption (http://www.techpowerup.com/reviews/Sapphire/HD_3870/23.html) and we can see that the card consumes 15 watts more. 15, not 100. I think there is a difference, but you know, they didn't teach me a lot of maths in the fanboy academy... (sarcasm, if you didn't catch it). So if a 100 W difference would mean 100 euros per year in electricity in my case, guess what? 15 W means 15 euros, and the 8800 GT is significantly faster than the HD 3870, so end of story.

NO, we don't get back to power consumption. This thread is about benchmarking, not power consumption! The CF 4890s pretty much out-benchmark any dual-GPU card out there; that's what this is about. I'm gonna drop it right now, and I would hope you do the same!
 
I don't post here as often as I'd like to, but listening to this power consumption stuff is a turn-off, and these cards definitely are not. I'm thankful this article wasn't just about power consumption and showed the cards' capabilities, since this setup is seen as not only one of the highest-performing solutions out there, but one of the best in price/perf as well. Good job.
 
Okay, I think we all understand that when DarkMatter goes to buy a GPU he/she will look at power consumption. That is well understood. Now let it go and realize there are others who are just worried about achieving the best gameplay and the highest benchmarks, and do not take into consideration how much juice they are sucking up. I sure don't, but that is my personal preference.
 
Do the math on how much 20 W of extra idle power consumption will cost you.

20 W * hours the PC is on per day * 30 / 1000 * price per kWh you pay your power provider = price you pay for the extra 20 W per month

For me: 20 W * 16 * 30 / 1000 * €0.1433 = €1.38

It turns out to be a little bit more in my case. Let's say 1.5.

1.5 * 12 = 18 a year.

203 + 18 + 9 = 230. So more or less what I would have paid for an HD 3870 back when I bought this card.

Now with the HD 4890 CF versus the HD 4870 X2 (to leave Nvidia out of the question so that ATI fans don't get offended...):

308 W - 230 W = 78 W

78 W * 16 * 30 / 1000 * €0.1433 = €5.36 per month, €64.38 a year. Enough to make a difference IMO. The price comparison changes from 400 vs. 500 to 400 vs. 564 if you keep the cards one year. Is that worth the perf difference?
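For what it's worth, the same illustrative sketch from earlier reproduces these numbers:

def extra_cost_per_month(extra_watts, hours_per_day, price_per_kwh):
    return extra_watts * hours_per_day * 30 / 1000 * price_per_kwh

# 78 W consumption difference from the comparison above (308 - 230),
# 16 h/day, EUR 0.1433 per kWh
monthly = extra_cost_per_month(78, 16, 0.1433)
print(round(monthly, 2))       # -> 5.37 (the post rounds to 5.36)
print(round(monthly * 12, 2))  # -> 64.38 per year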

Okay, I think we all understand that when DarkMatter goes to buy a GPU he/she will look at power consumption. That is well understood. Now let it go and realize there are others who are just worried about achieving the best gameplay and the highest benchmarks, and do not take into consideration how much juice they are sucking up. I sure don't, but that is my personal preference.

You care about money? If not, then it doesn't matter. Of course you can buy and spend whatever you want, and I truly don't have anything against that. I stated perf/price in my very first post and I have said so like 10 times already in this thread. If money doesn't concern you, OK, but don't say multi-GPU is a cheaper alternative to higher-end cards.
 
It all depends on your choices, and what drives them. Again, I'm thankful this review wasn't only about power consumption, as that'd leave a very small number of readers who consider such things. IMHO, while power is important, it's down the list compared to performance. Or is this not an enthusiast site?
 
I'll ask this. If power consumption is that important, why hasn't anyone anywhere done a review that ranks cards on performance per watt only? And if they did, would the average enthusiast even be willing to get such a card? It may well end up being the very low end; would that make people happy then? I'm trying to follow this to its end here.
Or do we draw a line and say, this is where I sit with power vs. perf? And how many lines are there? Especially at the very top of the performance end? Does it even mean anything at this level? Just some things to be considered overall.
 
I'll ask this. If power consumption is that important, why hasn't anyone anywhere done a review that ranks cards on performance per watt only?

did you see our performance per watt graphs?
 
I'll ask this. If power consumption is that important, why hasn't anyone anywhere done a review that ranks cards on performance per watt only? And if they did, would the average enthusiast even be willing to get such a card? It may well end up being the very low end; would that make people happy then? I'm trying to follow this to its end here.
Or do we draw a line and say, this is where I sit with power vs. perf? And how many lines are there? Especially at the very top of the performance end? Does it even mean anything at this level? Just some things to be considered overall.

I don't know why they don't do it, but I think they should*. I think it's very clear from the above calculations that even 20 W can make a difference in the money spent on a card over its lifespan. 20 euros/dollars a year is significant. So in order to get the best bang for the buck, you can pay $20 more for a card if it consumes 20 W less. Of course it depends on how much you use the card and how much you pay, but take into account that the average price per kilowatt-hour worldwide is about 10 US cents, and about 15 cents in Europe. I saw some stats a few months ago.

http://michaelbluejay.com/electricity/cost.html - not what I saw, but it gives an idea...
EU: http://www.energy.eu/#domestic - Look at the highest one, Denmark at €0.30. With the above calculations that comes to about €1.70/year for every extra watt; for 78 watts, around 132 euros.

*I actually know why they don't do it. That was a figure of speech. It's impossible to make such reviews because electricity prices are different everywhere.
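To make the "it depends where you live" point concrete, here's a small illustrative sketch using the rough kWh prices quoted above (the labels and figures are this post's approximations, nothing official):

# Yearly cost of ONE extra watt, assuming 16 h/day and 30-day months as above
def cost_per_watt_year(price_per_kwh, hours_per_day=16):
    return 1 * hours_per_day * 30 * 12 / 1000 * price_per_kwh

for label, price in [("world avg ~$0.10/kWh", 0.10),
                     ("EU avg ~EUR 0.15/kWh", 0.15),
                     ("Denmark ~EUR 0.30/kWh", 0.30)]:
    # Denmark: ~1.73/year per extra watt, so ~135 for 78 W
    # (the post rounds to 1.7, giving ~132)
    print(label, round(cost_per_watt_year(price), 2))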
 
Yes. What I'm saying is, no one's done an overall power-vs-performance review just for its own sake. I'm not advocating one either. I don't think it's as important as price/perf; it sits lower in the overall view of things, as it does for most here.
I'm just trying to point out that even though there's concern about power usage, we all have our own lines we draw. It changes not only with where you live, as power prices can be a concern or not, but with having the money to purchase the card of your choice. Going high end puts power consumption way down the list of purchase decisions, and it's less of a deal breaker at this perf level.
I really don't think an article on GPU power consumption as it relates to purchasing decisions would help. It's not like CPUs, where things like servers, farms, etc. need lower power.
 
Anyway, thanks W1zzard. You managed to scoop the rest by doing this review. Very nice indeed.
 
I have a question. In the Far Cry 2 benchmark at 1920x1200 4xAA, a Radeon 4870 512MB gets 20.1 FPS. In Hexus.net's review, a Sapphire 4870 1GB gets 50.1 FPS. Shouldn't a 4870 512MB still get at least remotely close to what a 4870 1GB gets? The only difference between them is the memory size to handle the different resolutions, but I still find it strange.

A single 4890 1GB gets 49.6 FPS at 1920x1200 4xAA in Far Cry 2. So if a 4870 1GB gets 50.1 FPS (according to Hexus) and a 4890 1GB gets 49.6, why does a 4870 512MB get 20.1? Isn't that a bit low? I am thinking it's because of the resolution and memory?? I just find it kind of strange. I am looking at this from a different perspective to see if it's worth upgrading, and how memory and scaling work going from a 4870 512MB to a 4870 1GB to a 4890 1GB to running in CrossFire modes.
 
I have a question. In the Far Cry 2 benchmark at 1920x1200 4xAA, a Radeon 4870 512MB gets 20.1 FPS. In Hexus.net's review, a Sapphire 4870 1GB gets 50.1 FPS. Shouldn't a 4870 512MB still get at least remotely close to what a 4870 1GB gets? The only difference between them is the memory size to handle the different resolutions, but I still find it strange.

A single 4890 1GB gets 49.6 FPS at 1920x1200 4xAA in Far Cry 2. So if a 4870 1GB gets 50.1 FPS (according to Hexus) and a 4890 1GB gets 49.6, why does a 4870 512MB get 20.1? Isn't that a bit low? I am thinking it's because of the resolution and memory?? I just find it kind of strange. I am looking at this from a different perspective to see if it's worth upgrading, and how memory and scaling work going from a 4870 512MB to a 4870 1GB to a 4890 1GB to running in CrossFire modes.

When building computers I always go for 1GB of video RAM when the card will drive a 1920x1200 LCD. At that res with AA/AF it can, depending on the game, make a big difference, while the price difference is really small.

I wouldn't buy 512MB cards anymore unless you're going for cards that perform below a 4830/4850.
 
FC2 is very texture heavy and it uses loads of memory. Even the GTS 250 1GB loosely outperforms the HD 4870 512MB at high resolutions. As you can see, all the 512 MB cards take a hit. Surprisingly, the 9800 GTX and GT do better than the HD 4870, but who cares; none of them is able to provide playable framerates.
 
When building computers I always go for 1GB of video RAM when the card will drive a 1920x1200 LCD. At that res with AA/AF it can, depending on the game, make a big difference, while the price difference is really small.

I wouldn't buy 512MB cards anymore unless you're going for cards that perform below a 4830/4850.

I would assume you run a 24-inch-plus monitor.
 
The white wizzard of NYC approaches! Welcome to TPU! :toast: I digg as much as possible and generally am the first to digg reviews here. Not only does it save kittens but it keeps sheep fed too!

Thanks, I was considering turning btarunr into something unnatural for being so rude.
Has the courtesy of your hall somewhat lessened of late, Erocker son of Erouckely? :)
 
I know what NYC means; unfortunately, I didn't know what GandalfNYC meant, my bad. Your welcome message was sent the moment you activated your account.
 
I know what NYC means; unfortunately, I didn't know what GandalfNYC meant, my bad. Your welcome message was sent the moment you activated your account.

Finally I am acknowledged; better late than never, I suppose? :rolleyes:
This only makes you look even worse, since you saw my simple request the moment I typed it but did nothing. I am disappointed. :wtf:

This, after I did as you asked IMMEDIATELY, without knowing what btarunr means, and I even encouraged many others, to boot!

What say you?
 
I don't know why they don't do it, but I think they should*. I think it's very clear from the above calculations that even 20 W can make a difference in the money spent on a card over its lifespan. 20 euros/dollars a year is significant. So in order to get the best bang for the buck, you can pay $20 more for a card if it consumes 20 W less. Of course it depends on how much you use the card and how much you pay, but take into account that the average price per kilowatt-hour worldwide is about 10 US cents, and about 15 cents in Europe. I saw some stats a few months ago.

http://michaelbluejay.com/electricity/cost.html - not what I saw, but it gives an idea...
EU: http://www.energy.eu/#domestic - Look at the highest one, Denmark at €0.30. With the above calculations that comes to about €1.70/year for every extra watt; for 78 watts, around 132 euros.

*I actually know why they don't do it. That was a figure of speech. It's impossible to make such reviews because electricity prices are different everywhere.
Yeah, you chose between a 3870 and an 8800 GT, hardly considered high-end performance even when they released. They have always been mid-range cards. I can fully understand power consumption being a major concern at that level, as we are already talking about a more modest computer build.

However, modesty goes completely out the window when you are building a rig with the graphical power shown in this review. When building a machine of this caliber, most only take performance, entry cost, and expandability into account. Power consumption is very low on the list of concerns, usually only thought about enough to figure out the best PSU for the system, not how much it will cost to run per month. This is the point I think you are failing to realize here. Most people building these kinds of machines (at least the ones I know) do not worry about power consumption. That's not the point of the machine.

I, for example, have a lower-powered rig for casual usage. The rig in my specs is just a toy, and not always used, unless I have some encoding, benching, or gaming I want to do. I usually surf on my Core 2 laptop or my Core 2 iMac, both with all the power-saving features enabled.
 
Yeah, you chose between a 3870 and an 8800 GT, hardly considered high-end performance even when they released. They have always been mid-range cards.

Actually, the 8800 GT was indeed considered a high-end performance card when it was released, second only to the 8800 GTX among Nvidia's offerings at the time.
That is why the 8800 GT commanded a price of $300 or more! Granted, Nvidia delved too deeply and greedily into the pockets of consumers, much like the dwarves into the mountain, but that is the nature of both.

Also, it is in a different class than the 3870. These two cards should hardly be lumped together like that.
The AMD 4830 would currently be a better choice to compare to the 8800 GT. Both offer excellent price/performance and low-cost CrossFire and SLI, respectively.

Very nice system, by the way... just curious, how would three 4870s (4870 X2 + 4870) compare to two or three GTX 285s? If you know of a good comparison chart, please post a link! :)
 
Actually, the 8800 GT was indeed considered a high-end performance card when it was released, second only to the 8800 GTX among Nvidia's offerings at the time.
That is why the 8800 GT commanded a price of $300 or more! Granted, Nvidia delved too deeply and greedily into the pockets of consumers, much like the dwarves into the mountain, but that is the nature of both.

Also, it is in a different class than the 3870. These two cards should hardly be lumped together like that.
The AMD 4830 would be a better choice to compare to the 8800 GT. Both offer excellent price/performance and low-cost CrossFire and SLI, respectively.

Very nice system, by the way... just curious, how would three 4870s (4870 X2 + 4870) compare to two or three GTX 285s? If you know of a good comparison chart, please post a link! :)

He was referring to buying his 8800 during the time that both it and the 3870 released. That's what I was referring to.

And I think two GTXs and 4870 tri-fire should be roughly equal, with SLI being slightly ahead in most cases. Tri-SLI would stomp it, lol. If my board did SLI, I'd probably have two GTXs, personally.
 
And I think two GTXs and 4870 tri-fire should be roughly equal, with SLI being slightly ahead in most cases. Tri-SLI would stomp it, lol. If my board did SLI, I'd probably have two GTXs, personally.

When I need more power, that's exactly what I intend to do... I'll just buy the motherboard that is best for dual or tri-SLI'd GTX 285s.
Depending on when the need arises, it might even be another Socket 775 motherboard.
My warranty-replaced Abit IP35-E can't seem to run my E8400 past 3.6 GHz out of the box, and I don't want to bother with updating the BIOS for now.
Perhaps I will eventually come across a "friendly, helpful supernerd" in the NYC area who will assist me...

As far as two or three SLI'd GTX 285s vs. two or three CrossFired 4870 GPUs performance-wise, I would think it depends on which game you want to play, to some extent.
 