
ATI Radeon HD 4870 X2 1GB GDDR5 Nude Shots

Seems like it's going to be quite hot too. With all the heat problems others have been having with one of these GPUs, I can't imagine what two will get to. 1600 SPs and 1 GB of GDDR5 RAM just sounds tasty, though; hopefully it will be.
 
I'm going to buy this thing when it comes out (assuming benchmarks are up to par)

I love my 3870X2 and am looking forward to its big brother :-D

I'm really happy they are going with GDDR5 on it... I'm hoping it's no more than $500 ($450 would be a steal).
 
Seems like it's going to be quite hot too. With all the heat problems others have been having with one of these GPUs, I can't imagine what two will get to. 1600 SPs and 1 GB of GDDR5 RAM just sounds tasty, though; hopefully it will be.

Once RivaTuner gets updated, just bump up the stock fan speed. The 3870X2 comes stock at 20% fan speed; I run it at 40% (the sound doesn't bother me) and the thing never gets hotter than 57C in gaming.
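For anyone wondering what raising the fan floor from 20% to 40% looks like across the whole temperature range, here's a purely illustrative sketch in Python. The function, ramp points, and numbers are assumptions for the sake of the example, not RivaTuner's actual behavior or API:

```python
# Illustrative only: a simple linear fan curve with a raised minimum duty cycle.
# None of this reflects RivaTuner's real API; it just shows the trade-off
# between a higher fan floor (more noise at idle) and lower load temps.
def fan_duty(temp_c, floor_pct=40, max_pct=100, ramp_start=50, ramp_end=90):
    """Return a fan duty cycle (%) that never drops below floor_pct."""
    if temp_c <= ramp_start:
        return floor_pct                      # idle: sit at the raised floor
    if temp_c >= ramp_end:
        return max_pct                        # very hot: full speed
    frac = (temp_c - ramp_start) / (ramp_end - ramp_start)
    return floor_pct + frac * (max_pct - floor_pct)   # linear ramp in between

print(fan_duty(45))   # 40   -> idle stays at the raised floor
print(fan_duty(57))   # 50.5 -> around the gaming load mentioned above
```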
 
Very juicy and tasty indeed! Glad to see the new PLX chip is in place... this will probably greatly increase the scalability.
 
I'll just be ripping the cooler off this one too. I couldn't stand the heat from the 3870X2; one core was 5-10C hotter than the other. I want 30C idle temps and 50C load temps. If 3 slots are the only way to attain this besides water, then so be it :D
 
This card is going to suck up a lot of power.

- Christine

DUH! :shadedshu

Two chips, double the RAM; basically two cards in CrossFire. Not worse than a 9800GX2.
 
When is the release date for the 4870X2?
 
Looks nice.

The image does not contain the cooler because they haven't imagined one that will keep it cool enough yet! :laugh:
 
Looks nice.

The image does not contain the cooler because they haven't imagined one that will keep it cool enough yet! :laugh:



Because I'm so nice :D
 

Attachments

  • 4870x2[1].jpg (105 KB)
RM, see any potential volt mods yet??? :D
 
This card is going to suck up a lot of power.

- Christine

I'm hoping no more than an overclocked 2900XT @ 820/900. Mine takes 80W+ more than a non-overclocked 4870.
 
I would love to see Thermalright make a heatsink for this, something like the HR-03 but with, say, 3-4 heatpipes going to each GPU that lead to one big heatsink.
 
Probably won't be; they didn't cool the HSI either, remember. Something about ATI and cooling: parts that get hot, but not so hot they'd die without cooling, so they just decide, why bother?
 
It looks so long, you've got to ask: will it fit in a mid-tower without disturbing the HDDs?
 
It looks so long, you've got to ask: will it fit in a mid-tower without disturbing the HDDs?

Not any mid-tower I know of. It looks just as large as the 3870X2, and we all know how much of a beast that thing is when it comes to space.


yogurt_21 said:
hmm, I can't seem to see a spot for the PLX chip on that cooler...

All I can make out from the fuzzy pic is a few lines where the PLX will sit, and that's after blowing up the pic several times over. But that giant black metal heatsink doesn't exactly cool anything, not to mention that the RAM on the back end of the card has no cooling at all. It doesn't even need that thing; I ran my X2 for the longest time without it. It weighed the card down too much for my taste.
 
It looks so long, you've got to ask: will it fit in a mid-tower without disturbing the HDDs?


Yeah, in a cheapo generic chassis like the one I have, if you have two HDDs you have a problem with dual-slot cooling solutions. I cannot fit a 9800GTX in my case, which is why I opted for the 4850.

If the mid-tower has one of those hot-swappable racks, or you're only using one HDD, then you're fine.
 
Did this version come only with Qimonda RAM? I prefer Samsung RAM.
 
hayder.master,
That's Qimonda GDDR5 right there on the card pictured in the 1st post.
:\

Luckily, GDDR5 is so fast as it is that there should be only minimal gains from OC'ing the VRAM. The HD4Ks aren't exactly bandwidth limited.
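A rough back-of-the-envelope number to illustrate that point, assuming a 4870-style memory setup of GDDR5 at a 900 MHz base clock on a 256-bit bus per GPU (those clocks are an assumption based on the rumored 4870 specs, not confirmed X2 numbers):

```python
# Back-of-the-envelope GDDR5 bandwidth estimate (per GPU).
# Assumes a 900 MHz memory clock and a 256-bit bus, as rumored for the HD 4870;
# the X2's final clocks could differ.
base_clock_hz = 900e6     # GDDR5 memory clock
data_rate = 4             # GDDR5 moves 4 bits per pin per clock (3.6 Gbps effective)
bus_width_bits = 256      # per GPU

bandwidth_gb_s = base_clock_hz * data_rate * bus_width_bits / 8 / 1e9
print(f"~{bandwidth_gb_s:.1f} GB/s per GPU")   # ~115.2 GB/s
```

That's already more bandwidth than a single 4870 seems to need, which is why memory overclocks barely move the frame rates.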
yogurt_21 said:
hmm, I can't seem to see a spot for the PLX chip on that cooler...
Sure there is.

HD4870X2_HSF_PLX.jpg
 
This will blow the GTX280 and 9800GX2 out of the water... hopefully.


Maybe, but it won't drive the 280 price down.

Nvidia can easily turn around and bitch slap ATi with another GX2 card, sadly enough.

No matter how far ATi has come in the last generation, they're still not going to top Nvidia until they change their architecture, simple as that.
 
Maybe, but it won't drive the 280 price down.

Nvidia can easily turn around and bitch slap ATi with another GX2 card, sadly enough.

No matter how far ATi has come in the last generation, they're still not going to top Nvidia until they change their architecture, simple as that.

They are topping Nvidia in price vs. performance, and that's all I give a damn about. Who cares who gets the biggest, most expensive card out if very few people want to spend that much money?
 
Who cares about people who sit around bitching about 'price/performance' as an excuse because they cannot afford the more expensive cards?


I stick by what I said. Cheap, expensive, whatever. Until ATi's architecture changes, they'll always be one step behind Nvidia.
 
They are topping Nvidia in price vs. performance, and that's all I give a damn about. Who cares who gets the biggest, most expensive card out if very few people want to spend that much money?

Why is that so hard to understand? I don't give a rat's ass about the 'big picture'. Technology is ever-changing. This time the biggest, most expensive isn't the best; something is wrong with that, but I'm still going with the best, paradox or not.
 
Why put so much money into the GTX280 when it is going to be replaced anyway by a better card in the near future?
Unless you are filthy rich, this is madness IMO. The GTX280 costs exactly $895 here in Israel, while minimum wage is, what, $1,200?
BTW, Radeons are $410 for the 4870 and $275 for the 4850. That's a BIG gap from the GeForces.

So we can't afford it, and we don't want to. And this time around, we don't need to :)
 