# AMD Cypress "Radeon HD 5870" Stripped



## btarunr (Sep 11, 2009)

Here are the first pictures of the obverse side of Cypress' PCB, and the first pictures of the centre of attraction: the AMD Cypress GPU. CzechGamer disassembled two Cypress "Radeon HD 5870" cards for a quick blurrycam photo session. The PCB shot reveals quite a bit about Cypress, particularly about the GPU. 

To begin with, the GPU represents a massive jump in transistor count for AMD, and a bold piece of engineering on the 40 nm manufacturing process, given the kinds of problems foundry partners initially had with it. They seem to have recovered from most of them, as AMD's AIB partners are rolling out new products based on the 40 nm RV740 GPU on a weekly basis. The package holds a die rotated into a "diamond" orientation, similar to the RV740, RV730, or, more historically, the R600. The die measures 338 mm², which is huge for 40 nm, and that size is vindicated by a transistor count of roughly 2.1 billion. In contrast, AMD's previous flagship GPU, the RV790, holds 959 million transistors, and NVIDIA's GT200 holds 1.4 billion. 
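A quick back-of-the-envelope check on those die figures (a sketch using only the numbers quoted above; real transistor density varies with the logic/cache mix of a design):

```python
# Average transistor density implied by the reported figures.
# The inputs (2.1 billion transistors, 338 mm^2) come from the article.
def density_millions_per_mm2(transistors_millions: float, area_mm2: float) -> float:
    """Average transistor density in millions of transistors per mm^2."""
    return transistors_millions / area_mm2

cypress = density_millions_per_mm2(2100, 338)
print(f"Cypress: ~{cypress:.1f} M transistors/mm^2")  # roughly 6.2
```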






The PCB has three distinct areas: connectivity, processing, and VRM. Fuelling the GPU is a high-grade 4-phase digital PWM power circuit, and the PCB has placeholders for an additional vGPU phase. The 8 memory chips (or 16 on the 2 GB model) are powered by a 2-phase circuit. Power is drawn from two 6-pin PCI-Express power connectors, though there seems to be a placeholder for two more pins, i.e., to replace one of the 6-pin connectors with an 8-pin one. Bordering the GPU on two sides are the 8 GDDR5 memory chips, which AMD says are a generation ahead of present GDDR5, and support reference frequencies as high as 1300 MHz (2600 MHz DDR, 5.20 GHz effective). On the 2 GB variant, 8 more chips sit on the other side of the PCB; these are perhaps what the backplate is intended to cool. The connectivity portion holds the two CrossFire connectors, DisplayPort, HDMI, and a cluster of two DVI-D connectors. There has been a raging debate about how adversely the small air vent would affect the card, but AMD is promising some energy-efficiency breakthroughs, and given how roomy the card is, the vent seems sufficient. 
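The quoted memory frequencies follow standard GDDR5 arithmetic (a sketch; the 256-bit bus width is taken from the SKU discussion later in the article):

```python
# GDDR5 transfers 4 bits per pin per command-clock cycle, so the
# "effective" data rate is 4x the command clock (and 2x the DDR figure).
def gddr5_effective_mhz(command_clock_mhz: float) -> float:
    return command_clock_mhz * 4

def bandwidth_gb_s(effective_mhz: float, bus_width_bits: int) -> float:
    # MT/s * bytes transferred per cycle -> MB/s, then -> GB/s
    return effective_mhz * (bus_width_bits / 8) / 1000

eff = gddr5_effective_mhz(1300)   # 5200 MHz "effective", as quoted
bw = bandwidth_gb_s(eff, 256)     # 166.4 GB/s on a 256-bit bus
print(eff, bw)
```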



 



Finally, information from ArabHardware.net suggests pricing for three of the first SKUs based on Cypress: HD 5870 2 GB, HD 5870 1 GB, and HD 5850 1 GB. All three use the same GPU and memory standard (GDDR5), but differ in clock speeds and GPU configurations. While the HD 5870 sports 1600 stream processors, 80 TMUs, and 32 ROPs, the HD 5850 has 1440 stream processors, 72 TMUs, and 32 ROPs. Although 32 ROPs puzzles us for a 256-bit wide memory interface, we suspect low-level design changes that make "32 ROPs" more of an effective count than an absolute count. While the HD 5870 features over 800 MHz core clock and 5.20 GHz memory, its little sibling has over 700 MHz core clock and 4.40 GHz memory. Expected price points are US $449 for the Radeon HD 5870 2 GB, $399 for the HD 5870 1 GB, and $299 for the HD 5850. AMD is expected to announce all three models on the coming 23rd. You'll be able to find them at your favourite computer store a little later; availability is a certainty by the time you're ready to buy Windows 7. AMD's newest products will be more than ready to squat under X-mas trees all over.

*View at TechPowerUp Main Site*


----------



## Lionheart (Sep 11, 2009)

Hmmmm I think it looks pretty damn awesome, seriously cant wait to see some benchmarks of these cards, Whooo!!!! yaayy 1600 stream processes !!SWEET!!, i was hoping it would be more than 1200


----------



## rampage (Sep 11, 2009)

YAY -      (DOES A LIL HAPPY DANCE)

It's finally nice to get some decent information on the cards.

Now I'm looking at about $500 Aussie for the 5870 1 GB. Although it's a bit higher than I thought, it's still nowhere near the $800 I paid when the GTX 280 first came out.


----------



## newtekie1 (Sep 11, 2009)

I have a feeling this is still going to be a one loud ass card, the fan is going to have to work at full blast to push all the hot air out of that little openning...  And that die size is huge for 40nm!


----------



## phanbuey (Sep 11, 2009)

5870x2 4gb is gonna be my next card.  Just have to wait for h NV cards to come out so that prices can drop to a sane level.


----------



## Cheeseball (Sep 11, 2009)

Awesome cards, but too damn long.


----------



## MoonPig (Sep 11, 2009)

That's some dodgy basement... lol.

These look good, looking forward to more info


----------



## The Witcher (Sep 11, 2009)

ArabHardware.net ???

btarunr, do you speak Arabic ? 

Anyway, from my experience, the hardwares are always extremely expensive here in the Middle East so dunno if the prices will be the same in the USA.


----------



## zOaib (Sep 11, 2009)

These cards will support DX 11 correct ? i may have missed reading it on the above article so just confirming.


----------



## zOaib (Sep 11, 2009)

The Witcher said:


> ArabHardware.net ???
> 
> btarunr, do you speak Arabic ?
> 
> Anyway, from my experience, the hardwares are always extremely expensive here in the Middle East so dunno if the prices will be the same in the USA.



salaam wa alikum,


----------



## DarkOCean (Sep 11, 2009)

Cheeseball said:


> Awesome cards, but too damn long.


its the same size as a gtx 260


----------



## MRCL (Sep 11, 2009)

Wow, it (the card) looks less impressive than I thought.


----------



## pantherx12 (Sep 11, 2009)

newtekie1 said:


> I have a feeling this is still going to be a one loud ass card, the fan is going to have to work at full blast to push all the hot air out of that little openning...  And that die size is huge for 40nm!



I doubt it, air pressure will build up in the card and it will come out just fine with a normal fan.


----------



## btarunr (Sep 11, 2009)

Firstly this card is as long as a GTX 260. Secondly I'm hearing it's whisper-quiet.


----------



## lemonadesoda (Sep 11, 2009)

to the czech way of doing things! 

Goodness knows where/how they managed to make those cards disappear for the 007 spy shots.


----------



## [I.R.A]_FBi (Sep 11, 2009)

may consider a 5870 when they touch 250 dollar


----------



## wolf (Sep 11, 2009)

Looks like I might have to get myself a 2GB model and see what all the fuss is about


----------



## human_error (Sep 11, 2009)

That is one huge die size! Now enough of the blurrycam pictures already - i want benchies, and i want them *now*...


----------



## mdm-adph (Sep 11, 2009)

newtekie1 said:


> I have a feeling this is still going to be a one loud ass card, the fan is going to have to work at full blast to push all the hot air out of that little openning...  And that die size is huge for 40nm!



Yap yap yap -- how'd I know you were going to say something negative?


----------



## pantherx12 (Sep 11, 2009)

btarunr said:


> Firstly this card is as long as a GTX 260. Secondly I'm hearing it's whisper-quiet.



So awesome performance and quiet?

... it might be worth slapping down 240 pounds on one then ...

Heh I'll buy the rest of my own system later and just put on of these in my parents rig for now he he.


----------



## Scrizz (Sep 11, 2009)

reminds me of the 9800GTX+


----------



## Delta6326 (Sep 11, 2009)

So far this looks good, but I see one big flaw: where you plug in the power. If you have a smaller case (I mean width-wise), you can't plug in the power because the connectors are on the side, not the back, so the plugs will mash up against the side of the case if you look at the pics. Time will tell if it will fit.


----------



## Silverel (Sep 11, 2009)

Someone needs to facepalm the dude taking blurry pictures. It's called a tripod (or less cocaine). Use it... bleh.


Good to see AMD is still rolling around. How many predictions of death have we heard over the years anyway?
Looks like next summer is a new system refresh for me.


----------



## lemonadesoda (Sep 11, 2009)

99% of cases will fit. For the 1% where it is too narrow... it is TOO SHORT ALSO.

It's a good idea they turned the connector around to the top... a long card is always butting against the HDD container/peripherals etc.



Silverel said:


> Someone needs to facepalm the dude taking blurry pictures. It's called a tripod (or less cocaine). Use it... bleh.


Probably a no-cameras event, so these are unofficial, clandestine spy shots. Probably the card was pinched, taken to the toilet or some back room, photographed, then returned. Just a guess. They must have had secret screwdrivers and all. LOL.


----------



## IINexusII (Sep 11, 2009)

this guy must have stolen it from one of the rigs


----------



## Scrizz (Sep 11, 2009)

I have the other 2


----------



## Valdez (Sep 11, 2009)

newtekie1 said:


> I have a feeling this is still going to be a one loud ass card, the fan is going to have to work at full blast to push all the hot air out of that little openning...  And that die size is huge for 40nm!



Well... g300 will be much larger... (it's just a guess).


----------



## newtekie1 (Sep 11, 2009)

pantherx12 said:


> I doubt it, air pressure will build up in the card and it will come out just fine with a normal fan.



Just like the 9800GX2 and GTX295...Even nVidia couldn't keep those things quiet with tiny opennings like this card's...ATi doesn't stand a chance.



mdm-adph said:


> Yap yap yap -- how'd I know you were going to say something negative?



And how did I know you would troll my post, and not add anything even remotely relevant to the discussion?

And if you noticed, I said something positive also...but you only tend to notice when people say negative things about ATi...but you are an ATi fanboy so it is expected I guess...:shadedshu



Valdez said:


> Well... g300 will be much larger... (it's just a guess).



I don't care about G300, this isn't a topic about G300.  

The large die size is a good thing.  It means ATi packed a lot of shit in there, hopefully that means awesome performance.  And it means 40nm has matured to the point that it can handle large die sizes like this.


----------



## KainXS (Sep 11, 2009)

. . . . . 

anyway

these should be close

HD5870
GPU:RV870XT
Core Clock:850 Mhz
Shader Clock:850 Mhz
Memory Clock:1300 Mhz
Pixel Fill Rate:27200 MPixels/sec
Texture Fill Rate:68000 MTexels/sec

HD5850
GPU:RV870
Core Clock:750 Mhz
Shader Clock:750 Mhz
Memory Clock:1100 Mhz
Pixel Fill Rate:24000 MPixels/sec
Texture Fill Rate:54000 MTexels/sec


HD4870X2
GPU:RV770X2
Core Clock:750 Mhz
Shader Clock:750 Mhz
Memory Clock:1800 Mhz
Pixel Fill Rate:24000 MPixels/sec
Texture Fill Rate:60000 MTexels/sec


HD4890
GPU:RV770XT
Core Clock:850 Mhz
Shader Clock:850 Mhz
Memory Clock:975 Mhz
Pixel Fill Rate:13600 MPixels/sec
Texture Fill Rate:34000 MTexels/sec
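The fill-rate figures in the list above follow directly from clock × unit count (a sketch; the ROP/TMU counts for the 5800 series are from the news post, and the older cards use their well-known configurations):

```python
# Pixel fill rate = core clock (MHz) x ROPs -> Mpixels/s
# Texel fill rate = core clock (MHz) x TMUs -> Mtexels/s
def fill_rates(core_mhz: int, rops: int, tmus: int) -> tuple[int, int]:
    return core_mhz * rops, core_mhz * tmus

print(fill_rates(850, 32, 80))  # HD 5870: (27200, 68000)
print(fill_rates(750, 32, 72))  # HD 5850: (24000, 54000)
print(fill_rates(850, 16, 40))  # HD 4890: (13600, 34000)
```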


----------



## pantherx12 (Sep 11, 2009)

newtekie1 said:


> Just like the 9800GX2 and GTX295...Even nVidia couldn't keep those things quiet with tiny opennings like this card's...ATi doesn't stand a chance.



Erm... both are dual-GPU cards; it's hardly comparable to a single-GPU card that also runs more efficiently. The NVIDIA cards you mentioned would require high fan speeds to cool two GPUs (already twice as much heat as a 5870), let alone them being less efficient in general.


----------



## h3llb3nd4 (Sep 11, 2009)

awesomeness


----------



## SteelSix (Sep 11, 2009)

phanbuey said:


> 5870x2 4gb is gonna be my next card.  Just have to wait for h NV cards to come out so that prices can drop to a sane level.



Hell yes, though I'll be grabbing one on launch day. Thanks for the news TPU!!


----------



## lemode (Sep 11, 2009)

Love the look of the Radeon HD 5870!

I typically build a new rig every summer/fall for myself alternating between Intel and AMD systems. I was actually excited to invest in i7 860/p55/single 295 this time around now I am considering waiting to see if an HD 5870x2 comes out! If it does I will just wait for the Leo platform to be released in May. I haven’t bought a Radeon card since the x800 so this is kind of exciting!


----------



## newtekie1 (Sep 11, 2009)

pantherx12 said:


> Erm... both are dual GPU cards, its hardly comparable to a a single GPU card that also runs more effciantly, the Nvidia cards you meantioned would require high fan speeds to cool two gpu's ( already twice as much heat as a 5870) let alone them being less effciant in general.



With this die size, even with the 40nm transition, I'm betting this card puts out just as much heat as two G92s.


----------



## pantherx12 (Sep 11, 2009)

A smaller fab process, coupled with a bigger die, means greater surface area, and that means greater heat dissipation.

Should be fine and dandy.


----------



## The Witcher (Sep 11, 2009)

If am gonna buy this graphic card, am not gonna buy it because it support DX11, I'll just buy it because it's stronger than the current generation.

We all know that the 8000 series were the first G Cards to support DX10 and they didn't preform well in these games to be honest so I don't think that these new DX11 cards will do better than their predecessors , not to mention that we will probably see the first DX11 game after 2 or 3 years.


----------



## dir_d (Sep 11, 2009)

The Witcher said:


> If am gonna buy this graphic card, am not gonna buy it because it support DX11, I'll just buy it because it's stronger than the current generation.
> 
> We all know that the 8000 series were the first G Cards to support DX10 and they didn't preform well in these games to be honest so I don't think that these new DX11 cards will do better than their predecessors , not to mention that we will probably see the first DX11 game after 2 or 3 years.



Have you not read the news on this site for the past week or not heard about DiRT 2 being the 1st DX11 title named that will be out not too long from now.


----------



## air_ii (Sep 11, 2009)

newtekie1 said:


> With this die size, even with the 40nm transition, I'm betting this card puts out just as much heat as two G92s.



From what I've heard, it's 27W (!) while idling and 185-ish at load. Btw, I don't know if you've seen the cooler pictures, but the card seems to have exhausts on both sides. Some think that the front one is intake, but given the construction of the fan, it has to be exhaust.

I'm still gonna WC it and see what NV has to say later on .


----------



## air_ii (Sep 11, 2009)

dir_d said:


> Have you not read the news on this site for the past week or not heard about DiRT 2 being the 1st DX11 title named that will be out not too long from now.



Dirt 2 on 3 displays



			
The Witcher said:

> If am gonna buy this graphic card, am not gonna buy it because it support DX11, I'll just buy it because it's stronger than the current generation.
> 
> We all know that the 8000 series were the first G Cards to support DX10 and they didn't preform well in these games to be honest so I don't think that these new DX11 cards will do better than their predecessors , not to mention that we will probably see the first DX11 game after 2 or 3 years.



The gameplay (link above) looks quite smooth on 3 panels!


----------



## HossHuge (Sep 11, 2009)

While I think this card is going to be huge,  I don't expect the type of increase that we got from a 3870 to a 4870.


----------



## TheMailMan78 (Sep 11, 2009)

Those pictures look like they were taken from some kinda sick twisted GPU smut film. I half way expect some kinda tentacle rape to be posted from the same source.


----------



## btarunr (Sep 11, 2009)

If you're into uh..tentacles, check out the latest news post.


----------



## newtekie1 (Sep 11, 2009)

dir_d said:


> Have you not read the news on this site for the past week or not heard about DiRT 2 being the 1st DX11 title named that will be out not too long from now.



DiRT 2 DX11 about as much as Bioshock was DX10...

Most of the early "DX11" games will be DX10 games with a few generally unnoticable additions.  What DX version the game uses, or even what version the card supports, isn't really important to me.  Good gameplay is what is important, and the horsepower of the card is what I'm worried about.



air_ii said:


> From what I've heard, it's 27W (!) while idling and 185-ish at load. Btw, I don't know if you've seen the cooler pictures, but the card seems to have exhausts on both sides. Some think that the front one is intake, but given the construction of the fan, it has to be exhaust.
> 
> I'm still gonna WC it and see what NV has to say later on .



With power saving features, I would believe 27w at idle, but I don't care about idle, even the current ATi cards are quiet at idle.  And at 185w at load, you are looking at more than a GTX295.

And I have seen the cooler design, and the two openings at the front of the card seem to be decoration only. There is no way they are intakes, as they face the exhaust side of the fan. And it looks like they are actually blocked off from the fan entirely. If they aren't, then the cool air being sucked in by the fan is just being exhausted right out the holes, not cooling anything, making them useless. Either way, they don't help the situation with the extremely small exhaust at the back of the card; this will limit airflow, causing higher temps, and force the fan to work harder and louder.

We will have to see when the card is actually released, but that is just my opinion after seeing the pictures so far.  And yes, I'm well aware that this might not be the final design, and they might reduce or re-arrange the connectors on the card.


----------



## wahdangun (Sep 11, 2009)

WTF, it looks like a GTX 280, but with an ATI chip in it (damn, ATI is going insane, they doubled all the specs).

I want to see benches right now, please.


----------



## air_ii (Sep 11, 2009)

newtekie1 said:


> With power saving features, I would believe 27w at idle, but I don't care about idle, even the current ATi cards are quiet at idle.  And at 185w at load, you are looking at more than a GTX295.



I think it's closer to 285 for GTX295...


----------



## erocker (Sep 11, 2009)

newtekie1 said:


> DiRT 2 DX11 about as much as Bioshock was DX10...
> 
> Most of the early "DX11" games will be DX10 games with a few generally unnoticable additions.



Do you have any links or solid information about this?

Also, those with the card are claiming it's 50% faster than GTX285.


----------



## The Witcher (Sep 11, 2009)

dir_d said:


> Have you not read the news on this site for the past week or not heard about DiRT 2 being the 1st DX11 title named that will be out not too long from now.



Ok I admit that you beat me on that....

It's been a while since I checked the website (I'm shitting you not). I don't see much graphical difference between the DX10 games and Dirt 2, so in my opinion it's not DX11 unless it has a big graphical improvement, not minor changes which you won't be able to notice. 

Back to topic, when do you think we will see the first accurate benchmarks of these cards ?


----------



## air_ii (Sep 11, 2009)

If guys over at CHIPHELL are to be believed, 5870 scores min 30, avg 43 and max 54 fps in Crysis 1900x1200, 4AA+16AF DX10 Very High. That's on PhII 955BE.


----------



## jagd (Sep 11, 2009)

Yes, of course they are DX11.


zOaib said:


> These cards will support DX 11 correct ?


----------



## newtekie1 (Sep 11, 2009)

air_ii said:


> I think it's closer to 285 for GTX295...



If it is 185w under load, then it is closer to a GTX295, which according to W1z's reviews is at about 181-182w on average under load.



erocker said:


> Do you have any links or solid information about this?
> 
> Also, those with the card are claiming it's 50% faster than GTX285.



Nope, until we see what DiRT 2 looks like on a DX11 card vs. a DX10 card, it is all speculation, just like 90% of this thread.  But I'm guessing, since the game still runs on DX10 and DX9, I'm just guessing that DX11 isn't going to add a whole lot to the game...

And isn't a GTX295 about 50% faster than a GTX285?


----------



## erocker (Sep 11, 2009)

newtekie1 said:


> If it is 185w under load, then it is closer to a GTX295, which according to W1z's reviews is at about 181-182w on average under load.
> 
> 
> 
> Nope, until we see what DiRT 2 looks like on a DX11 card vs. a DX10 card, it is all speculation, just like 90% of this thread.  But I'm guessing, since the game still runs on DX10 and DX9, I'm just guessing that DX11 isn't going to add a whole lot to the game...



I believe all DirectX is based off the previous version. Most likely there is nothing about DX10 in DX11. It's more likely DX11 features on top of DX9. Looking at the features of DX11, regardless if it's based on DX10 or DX9, they will make a significant difference in visual quality. Just look at the wire frames between DX9 and DX11. The DX11 features are significant as DX10 features were not. Either way, it's the horsepower increase I'm more concerned with. DX11 will come either way.


----------



## air_ii (Sep 11, 2009)

newtekie1 said:


> If it is 185w under load, then it is closer to a GTX295, which according to W1z's reviews is at about 181-182w on average under load.



I think peak load is much higher. And the info was 185-190 max power consumption for 5870.

Edit: Just found it, TDP of GTX295 is 289W.


----------



## toyo (Sep 11, 2009)

Helllllooo ATI??? We thought (and wanted) the 5870 at 299$...


----------



## KainXS (Sep 11, 2009)

it will be between 350 and 400 dollars more than likely


----------



## a_ump (Sep 11, 2009)

toyo said:


> Helllllooo ATI??? We thought (and wanted) the 5870 at 299$...



haha, that was indeed my hope. But hey, if these cards have the horsepower that has been speculated, then I personally think it's worth $399. I think $349 would be ideal and a sweet spot, but they need to try and make some cash while NVIDIA is out of the picture (and yes, they will be, come the 23rd). I too am wary of how efficient that tiny exhaust will be, but no worries here  got an AC unit 3 feet from my comp.


----------



## Kitkat (Sep 11, 2009)

i want 2!


----------



## KainXS (Sep 11, 2009)

Yeah, it's definitely worth it. Besides the memory, it's double the specs of the HD4890; it's like a 4870X2 with no CrossFire performance loss, so it could even outperform a GTX295. We'll have to wait for benches, but it's possible.


----------



## mdm-adph (Sep 11, 2009)

newtekie1 said:


> And how did I know you would troll my post, and not add anything even remotely relavant to the dicussion?
> 
> And if you noticed, I said something positive also...but you do only tend to notice when people say negative things about ATi...but you are an ATi fanboy so it is expected I guess...:shadedshu



You're either a liar or you're getting old:



newtekie1 said:


> I have a feeling this is still going to be a one loud ass card, the fan is going to have to work at full blast to push all the hot air out of that little openning...  And that die size is huge for 40nm!



What part of this was in any way "positive?"  Funny, I don't know anyone who likes "loud ass cards" or ones that have to "push all the hot air out of that little openning."  

Or do you like the chip for being "huge?"  Beats me.

Not to mention:

*Rule #33 of the Internet:  He who doth quoteth the taunt of "fanboy" first is himself the biggest fanboy.*


----------



## wahdangun (Sep 11, 2009)

look at this:


----------



## KainXS (Sep 11, 2009)

linkies


----------



## newtekie1 (Sep 11, 2009)

erocker said:


> I believe all DirectX is based off the previous version. Most likely there is nothing about DX10 in DX11. It's more likely DX11 features on top of DX9. Looking at the features of DX11, regardless if it's based on DX10 or DX9, they will make a significant difference in visual quality. Just look at the wire frames between DX9 and DX11. The DX11 features are significant as DX10 features were not. Either way, it's the horsepower increase I'm more concerned with. DX11 will come either way.



I highly doubt there will be a significant difference in visual quality, certainly not in the first DX11 game, which is still based on DX9...



air_ii said:


> I think peak load is much higher. And the info was 185-190 max power consumption for 5870.
> 
> Edit: Just found it, TDP of GTX295 is 289W.



298w is way over the actual TDP of the GTX295.  Real world is closer to 180w average.



mdm-adph said:


> You're either a liar or you're getting old:
> 
> 
> 
> ...



The huge die size is a positive, as I've already explained, and I'm sure you didn't notice since you only pay attention when someone is speaking negatively of ATi.

And you did everything but come right out and say it directly, so you cast the first stone there buddy...


----------



## The Witcher (Sep 11, 2009)

Hey, could someone remind me how much the GTX260 was when it was released? 

I bought it a few months after its release for like $230 as I remember. I won't buy these new graphic cards until they reach a reasonable price point, which for me is $200~$280.


----------



## btarunr (Sep 11, 2009)

Ok, calm down people.


----------



## lemonadesoda (Sep 11, 2009)

btarunr said:


> ... Secondly I'm hearing it's whisper-quiet.



188W of badness. I wonder how they keep _that_ quiet?

Ummm. 3x 2560x1600 sounds good to me. Time to upgrade my desktop arrangement.


----------



## wahdangun (Sep 11, 2009)

KainXS said:


> linkies



here the link :
http://bbs.chiphell.com/viewthread.p...extra=page=1


i think people are just eager to see the benches and can't wait that long; it will start a riot in here if ATI doesn't release the benches


----------



## KainXS (Sep 11, 2009)

404 not found, but I think that bench isn't far off, I personally think it will match and probably outperform the GTX295


----------



## pantherx12 (Sep 11, 2009)

newtekie1 said:


> 298w is way over what the actual TDP of the GTX295.  Real world is closer to 180w average.



Aye but bare in mind the 188 is the max on the new atis, real world average is bound to be less aswell right : ]

I'm willing to bet 20 pence that that little opening is enough, how ever ultimately its irrelevent, how many people own cards direct from ATI/Nvidia?

I'm pretty sure most people get cards from the likes of powercolor/xfx/sparkle bollocks like that.


----------



## btarunr (Sep 11, 2009)

lemonadesoda said:


> 188W of badness. I wonder how they keep _that_ quiet?



I'm hearing that not only is it whisper-quiet, but also surprisingly cool (surprising for its ~180W load power consumption). Wait till the 23rd.


----------



## pantherx12 (Sep 11, 2009)

Wiz got some of these cards right? I know he can't publish a review just yet, but can he tell us what the red bits at the end do?


----------



## a_ump (Sep 11, 2009)

wahdangun said:


> look at this:



well, the only problem with that is it says HD 5870 *OC*. For all we know they could have it under water with insane clocks. I wish they'd done that at stock instead. However, it is impressive nonetheless.


----------



## btarunr (Sep 11, 2009)

pantherx12 said:


> Wiz got some of these cards right? I know he can't publish a review just yet, but can he tell us what the red bits at the end do?



Just decorative.


----------



## newtekie1 (Sep 11, 2009)

pantherx12 said:


> Aye but bare in mind the 188 is the max on the new atis, real world average is bound to be less aswell right : ]
> 
> I'm willing to bet 20 pence that that little opening is enough, how ever ultimately its irrelevent, how many people own cards direct from ATI/Nvidia?
> 
> I'm pretty sure most people get cards from the likes of powercolor/xfx/sparkle bollocks like that.



But most of those companies use the reference design.  I fully expect the design to change by the time the retail design is finalized though.



btarunr said:


> I'm hearing that not only is it whisper-quiet, but also surprisingly cool (surprising for its ~180W load power consumption). Wait till the 23rd.



I hope this is true, because I love my HD4890, but the only thing I dislike is how hot it runs and how loud the fan is.


----------



## lemonadesoda (Sep 11, 2009)

I hope the 188W is not actually true... but is a misdirection, just like the number of shaders on the 4xxx series that ATi pulled last time prior to launch.

188W might be whisper quiet and cool at the desktop in 2D (and running 27W) but there is no way 188W is going to be cool'n'quiet while gaming, stock or OC.


----------



## btarunr (Sep 11, 2009)

My claim includes 'load'. Again, Max ≠ Load. There's a clear difference between the two which you can look up in our latest reviews. Here's a sample: http://www.techpowerup.com/reviews/Mushkin/GTX_295_Single_PCB/28.html

Trivia includes: GTX 295's max consumption is 320W, and that of GTX 285 is 218W. So Cypress beats them both at max power.

So this is where Cypress will land (red mark), going by AMD's value of 188W max power:


----------



## lemonadesoda (Sep 11, 2009)

^ OK, gotya. 

Just that newtekie was complaining about the noise from his card that is 190W... so unless something very clever is going on, I would assume similar power, heat, and cooling issues to the 4890. Obviously performance per watt has increased significantly... but 188W is still a high figure. Being less than a GTX295 is nothing to brag about, since the GTX295 should *be ashamed of itself*, running at an output enough to start up a commercial jet engine and warm an Olympic swimming pool.


----------



## sapetto (Sep 11, 2009)

So if it's better than a GTX295 when OCed, it's almost the same when it's not OCed?


----------



## rpsgc (Sep 11, 2009)

*sigh* I feared this would happen... the card won't fit inside my case (Antec Solo), it's too long. Maybe the 5850 will maintain a sane size like the current gen.


----------



## Hitman.1stGame (Sep 11, 2009)

HD 5870 X2 will kill GT300 before it launch


----------



## a_ump (Sep 11, 2009)

Hitman.1stGame said:


> HD 5870 X2 will kill GT300 before it launch



lol, that's quite a statement. It won't surprise me if the GT300 is 30-50% faster than the GTX 295. The rumored specs are unreal, more than double the GTX 285's, as well as supposedly going with a 512-bit GDDR5 memory bus, and 512 shaders compared to the GTX 285's current 240. GT300 is going to be a monster, there is no denying it. However, if the HD 5870 is even 10% faster than HD 4890s in CrossFire, then the HD 5870x2 should still be able to outperform the GT300 by a decent margin. But my last 2 sentences are all speculation and best guess.


----------



## wahdangun (Sep 11, 2009)

^
^  but ati is already doubleling it's spec why nvdia can't do the same (with a result of bad yield, power hungry, big die, and hot like hell)


----------



## TheMailMan78 (Sep 11, 2009)

wahdangun said:


> ^
> ^  but ati is already doubleling it's spec why nvdia can't do the same (with a result of bad yield, power hungry, big die, and hot like hell)



My spec has doubled. Damn Burger King. :shadedshu


----------



## a_ump (Sep 11, 2009)

wahdangun said:


> ^
> ^  but ati is already doubleling it's spec why nvdia can't do the same (with a result of bad yield, power hungry, big die, and hot like hell)



Why can't NVIDIA do the same? I said they were more than doubling, lol. And with a much better memory system than the HD 5xxx series, it's most definitely going to outperform the HD 5870. They probably are having bad yields, which is my suspicion for why they can't release till Q1 2010; by then they hope to have enough stock built up and improvements in yields to keep up with supply.


----------



## pantherx12 (Sep 11, 2009)

btarunr said:


> Just decorative.
> 
> http://img.techpowerup.org/090911/bta1q2w.jpg



They look like openings to me, not solid bits of plastic:

http://www.techpowerup.com/img/09-09-11/132b.jpg

In the picture you posted it's just a bad angle, and the bottom bit of plastic looks as if it's part of the top piece.

They may still be decorative, but they're definitely holes.


----------



## btarunr (Sep 11, 2009)

pantherx12 said:


> Looks like they're openings to me not solid bits of plastic
> 
> http://www.techpowerup.com/img/09-09-11/132b.jpg
> 
> ...



Yeah, decorative holes that don't participate in the card's internal air-flow.


----------



## Nick89 (Sep 11, 2009)

I'll be getting one when NVIDIA rolls out their set of cards; that will lower prices.


----------



## a_ump (Sep 11, 2009)

Yea... "for decorative purposes" is the most retarded reason to put those there. I'd rather they were part of the airflow, and I don't see how it'd be that difficult to include them in it.


----------



## TheMailMan78 (Sep 11, 2009)

pantherx12 said:


> Looks like they're openings to me not solid bits of plastic
> 
> http://www.techpowerup.com/img/09-09-11/132b.jpg
> 
> ...



Mussels mouth is a decorative air hole.


----------



## [I.R.A]_FBi (Sep 11, 2009)

TheMailMan78 said:


> Mussels mouth is a decorative air hole.



Lool, the only one?


----------



## Nick89 (Sep 11, 2009)

TheMailMan78 said:


> Mussels mouth is a decorative air hole.




Oh man, I laughed my ass off! 

Any way looks like I'm going to need to sell my pair of 4870's before nvidia's cards come out.


----------



## Benetanegia (Sep 11, 2009)

btarunr said:


> My claim includes 'load'. Again, Max ≠ Load. There's a clear difference between the two which you can look up in our latest reviews. Here's a sample: http://www.techpowerup.com/reviews/Mushkin/GTX_295_Single_PCB/28.html
> 
> Trivia includes: GTX 295's max consumption is 320W, and that of GTX 285 is 218W. So Cypress beats them both at max power.
> 
> ...



Cards never actually consume what the vendor says they will. And ALL of them use TDP, or TDP-like numbers, and call them max power consumption.

Anyway, if history should be taken into account, here's RV790's slide:

(slide image)

Actual consumption:

(measured-consumption chart)
So there's no way we can know the actual maximum consumption, but I think a good estimate would be around the RV790's. If I'm correct and 188 W is the TDP, the absolute maximum could be anywhere between 225 and 275 W, depending on how much time they assumed the card would spend fully loaded versus idling when they calculated the TDP. TDP can be tricky when the variance between idle and load is huge, and with RV870 it seems to be. The GTX 295 is a good example of how much it can vary, too.


----------



## TheMailMan78 (Sep 11, 2009)

Benetanegia said:


> Cards never actually consume what the vendor says they will. And ALL of them use TDP, or TDP-like numbers, and call them max power consumption.
> 
> Anyway, if history should be taken into account... RV790' slide:
> 
> ...



Hmmmmm, methinks you're not new here.


----------



## btarunr (Sep 11, 2009)

Benetanegia said:


> Cards never actually consume what the vendor says they will. And ALL of them use TDP, or TDP-like numbers, and call them max power consumption.
> 
> Anyway, if history should be taken into account... RV790' slide:
> 
> ...



Right, so even with that kind of deviation, it will fall between GTX 275 and GTX 285 at worst, or between HD 4870 and GTX 285 at best.


----------



## pantherx12 (Sep 11, 2009)

btarunr said:


> Yeah, decorative holes that don't participate in the card's internal air-flow.



That's so rubbish!

I'm so going to find a way to utilise those holes for something!


----------



## Benetanegia (Sep 11, 2009)

btarunr said:


> Right, so even with that kind of deviation, it will fall between GTX 275 and GTX 285 at worst, or between HD 4870 and GTX 285 at best.



Yeah, yeah. I wasn't claiming it would consume as much as a GTX 295; I just wanted to point out that it won't be right where you put it either. 250 W is already good!!

And is it true those things are purely decorative? I know I don't like them...


----------



## btarunr (Sep 11, 2009)

Benetanegia said:


> Yeah, yeah. I wasn't claiming it would consume as much as a GTX 295; I just wanted to point out that it won't be right where you put it either. 250 W is already good!!



I had already acknowledged that I'm putting AMD's value into a graph full of TPU's measurements, so the deviation is obvious. What's clear is that it's not going to lose to the GTX 285 or GTX 295 at max power.


----------



## Benetanegia (Sep 11, 2009)

btarunr said:


> I had already acknowledged that I'm putting AMD's value into a graph full of TPU's measurements, so the deviation is obvious. What's clear is that it's not going to lose to the GTX 285 or GTX 295 at max power.



Well, I think this conversation is unfair. I've seen all your posts and it's pretty clear you have "insider" info, haha. I'm guessing it won't lose to those cards. But by the numbers the rest of us have, it could reach even 300 W, who knows (always speaking of W1zzard's graph). I mean, look at this:

http://img.donanimhaber.com/resimler/R700_4870-x2-slide.jpg

285 W according to the slide, 381 W according to W1zzard's measurements. That's nearly 100 W of difference. In the end it doesn't matter, because a card will never reach those levels during actual use; that's just the consumption it reaches in the odd clock cycle in which all of its units are working at once, and that happens, what, once every complete 3DMark run?
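
The slide-versus-measured gap above can be put into numbers with a quick sketch (mine, not anything AMD publishes; the inputs are the HD 4870 X2 figures quoted in this post and the 188 W TDP discussed earlier in the thread):

```python
# Rough sketch: how far a vendor "max power" slide deviated from the
# measured figure, using the HD 4870 X2 numbers quoted above
# (285 W on AMD's slide vs. 381 W measured by W1zzard).

def overshoot(rated_w: float, measured_w: float) -> float:
    """Return measured consumption as a percentage over the rated figure."""
    return (measured_w - rated_w) / rated_w * 100

hd4870x2 = overshoot(285, 381)
print(f"HD 4870 X2 measured {hd4870x2:.0f}% above its slide rating")

# If the HD 5870's 188 W figure is a TDP with a similar overshoot,
# true maximum draw would land inside the 225-275 W band guessed earlier.
print(f"Scaled guess for a 188 W TDP: {188 * (1 + hd4870x2 / 100):.0f} W")
```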


----------



## Meizuman (Sep 11, 2009)

Bene, Pic No Work


----------



## Benetanegia (Sep 11, 2009)

Meizuman said:


> Bene, Pic No Work



Hmm, works for me.

Just google "HD 4870 X2 slide" or something... You might even find one with a different number, I don't know. I'm not trying to say "hey guys, this is how it is!!", just that unless we know something (ahem, btarunr) everything is possible... Or am I missing something else?


----------



## btarunr (Sep 11, 2009)

Not much insider info, but I talk to luckier people, in this case people who made Cypress a substitute for their better half/teddy for a few days. Anyway, mark your calendar/PIM/BlackBerry for the 23rd.


----------



## imperialreign (Sep 11, 2009)

Looks like I might have to pack some loot away for when the 5870 X2s roll out . . .


----------



## Benetanegia (Sep 11, 2009)

btarunr said:


> Not much insider info, but I talk to luckier people, in this case, people who made Cypress a substitute for their better-half/teddy for a few days. Anyway, mark your calendar/PIM/Blackberry for 23rd.



Yep, in other words: Wiz has it, but the watchdog called NDA is watching. Heh, exactly what I meant.


----------



## Imsochobo (Sep 11, 2009)

Consumption is more like 160 W.
It does draw 180 watts at times, just like the 4890, which basically uses 163 watts on average across 10 different games.

That card was rated 190 W.

Let's all stop the speculation on power consumption and see what REVIEWS say. You guys missed what's more interesting than load power consumption:

Idle! 23 W is a low figure, and should cut the power bill more than lower load consumption paired with higher idle (HD 4xxx) would.

Many say:
HD 5xxx is gonna be perfect if it's near 4870-4890 load power consumption with GTX 260 idle consumption!

I really don't care if I could hear the damn card with the sound off while I'm playing; it's when it's idling that I really don't want to hear it. I guess people don't play without sound.


----------



## lemonadesoda (Sep 11, 2009)




----------



## [I.R.A]_FBi (Sep 12, 2009)

pantherx12 said:


> That's so rubbish!
> 
> I'm so going to find a way to utilise those holes for something!




Thats nasty


----------



## jagd (Sep 12, 2009)

We can't know actual numbers yet, but theoretical numbers are easy: 150 W from two 6-pin PCIe plugs plus 75 W from the PCIe x16 slot = 225 W max, give or take 3-4 W. A 50 W difference would be unacceptable standards-wise.


Benetanegia said:


> So there's no way we can know the actual maximum consumption, but I think a good estimate would be around the RV790's. If I'm correct and 188 W is the TDP, the absolute maximum could be anywhere between 225 and 275 W
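
jagd's connector arithmetic can be sketched in a few lines (a hedged illustration; the per-connector budgets are the PCIe spec figures used in the post, and the function name is mine):

```python
# PCIe spec power budgets: 75 W from the x16 slot, 75 W per 6-pin
# auxiliary plug, 150 W per 8-pin auxiliary plug.

SLOT_W = 75
PLUG_W = {"6-pin": 75, "8-pin": 150}

def board_power_limit(plugs: list[str]) -> int:
    """Theoretical in-spec maximum for a card with the given aux plugs."""
    return SLOT_W + sum(PLUG_W[p] for p in plugs)

# Two 6-pin plugs, as on the pictured card: jagd's 225 W figure.
print(board_power_limit(["6-pin", "6-pin"]))
# If the 8-pin placeholder on the PCB were populated instead: 300 W.
print(board_power_limit(["6-pin", "8-pin"]))
```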


----------



## pantherx12 (Sep 12, 2009)

[I.R.A]_FBi said:


> Thats nasty



Ha ha  genius


----------



## Benetanegia (Sep 12, 2009)

Imsochobo said:


> Consumption is more like 160 W.
> It does draw 180 watts at times, just like the 4890, which basically uses 163 watts on average across 10 different games.
> 
> That card was rated 190 W.
> ...



Well, the fact is some people do care about noise while playing. Consider that in Crysis, for example, you're alone most of the time and birdsong is about all you hear, yet the card would be almost 100% loaded. It is annoying. 

I do care about the noise myself while I'm playing, but I don't think ANY card will ever be louder than my power supply anyway. I've never used one of those cards that do 50+ dB in the reviews, though; I usually choose the silent ones. Every card is silent compared to my power supply. It *was* silent, but not anymore; sadly, I bent its fan a bit while cleaning the dust off. That's why I use headphones instead of my 5.1 setup most of the time until I replace it. Don't think the PSU makes a lot of noise, though. It's just that I paid a lot for a noise-free (clean-sound) setup only to let something spoil it.



lemonadesoda said:


> http://img.techpowerup.org/090911/Capture521.jpg



Ey! You got the idea!


----------



## Benetanegia (Sep 12, 2009)

jagd said:


> We cant know actual numbers yet but theorical numbers are easy , 150w from 2 pci-e 6pin  plugs and 75w from pci-e 16 slot =225w max give or take 3-4w , 50w difference will be unacceptable  for standard wise



Cards do draw more than the PCIe specs allow at specific moments (a.k.a. maximum power consumption). The GTX 295 and HD 4870 X2 are the living examples.


----------



## pantherx12 (Sep 12, 2009)

Benetanegia said:


> Well, the fact is some people do care about noise while playing. Consider that in Crysis, for example, you're alone most of the time and birdsong is about all you hear, yet the card would be almost 100% loaded. It is annoying.
> 
> I do care about the noise myself while I'm playing, but I don't think ANY card will ever be louder than my power supply anyway. I've never used one of those cards that do 50+ dB in the reviews, though; I usually choose the silent ones. Every card is silent compared to my power supply. It *was* silent, but not anymore; sadly, I bent its fan a bit while cleaning the dust off. That's why I use headphones instead of my 5.1 setup most of the time until I replace it. Don't think the PSU makes a lot of noise, though. It's just that I paid a lot for a noise-free (clean-sound) setup only to let something spoil it.
> 
> ...




If it's out of warranty, just replace the fan, it's pretty easy : ]


----------



## Benetanegia (Sep 12, 2009)

pantherx12 said:


> If it's out of warranty, just replace the fan, it's pretty easy : ]



I have the Antec P180 and it's a pain to take the PSU out. I'm going to buy a quad-PCIe-2.0-capable PSU soon, so I won't bother yet. It really doesn't make much noise, but I'm picky; I want clean sound. But I have limits to that too, and all the work it takes to get that thing out of the case is the limit.


----------



## air_ii (Sep 12, 2009)

CryEngine3 on Cypress

Hmm, runs quite well at 3x 1920x1200. I think it's only one Cypress board, as Eyefinity can't do CF atm...


----------



## toyo (Sep 12, 2009)

Some thoughts:
- It says it's the same GPU, although the 5850 has fewer shaders etc.??? Maybe ATI GPUs are going the way AMD CPUs did: some fiddling in the BIOS and there you are, the full 1600 shaders, a la the Radeon 9800 series.
- What CPU, RAM and PSU does it need? I wonder if a quad is needed to fully unleash this...
- Umm... I'm still fixated on the matter of turning a 5850 into a 5870


----------



## btarunr (Sep 12, 2009)

toyo said:


> Some thoughts:
> - It says it's the same GPU; although the 5850 has less shaders etc.??? Maybe ATI GPUs are going the way ~~AMD CPUs~~ NVIDIA GPUs are... some fiddling in the BIOS and there you are, full 1600 shaders  a la Radeon 9800 series.



Teach me how to unlock all 240 shaders on my GTX 260. I'd be very grateful.


----------



## toyo (Sep 12, 2009)

That's an NVIDIA GPU. AMD seems to be in the business of letting deactivated stuff come back online lately. Even NVIDIA cards unlocked some pipelines back in the day (the 6800 series or something), but you know that.
It remains to be seen whether the cores are laser-cut or just inactive... and I expect you guys to let us know first hand, since it will be some time 'til I get my hands on a DirectX 11 card.


----------



## btarunr (Sep 12, 2009)

toyo said:


> That's an Nvidia GPU. AMD seems to be in the business of allowing de-activated stuff back online lately



Very well, then how do I unlock all 800 SPs on an HD 4830 with 640 SPs? 

AMD needs to poach Intel's market share, hence it didn't bother fixing the faulty ACC microcode. It holds a healthy market position with GPUs, and won't give up $100 to shoddy harvesting methods.


----------



## toyo (Sep 12, 2009)

Hmm, I did a search on the 4830... I must admit I had no idea this was possible. It must have been a very small lot of cards. See this.
I'm pretty damn sure the 5800 series will be locked at the advertised specs... but I'm hoping for the best...
EDIT: It seems some 4850 GPUs somehow got misplaced onto 4830 boards or something; it's not related to unlocking cores. The GPU-Z info is the same as the 4850's, even the clocks. So the guys who bought them, I think, just found a better GPU in their computer. Oh well, who doesn't dream of such things...


----------



## Imsochobo (Sep 12, 2009)

I've got a watercooled PhII 940 running at 4 GHz (really silent stuff, 600 rpm fans on a dual rad),
dual PC Power & Cooling 750 W,
I HAD three 4870s but sold two,
and a Silverstone TJ07, which isn't very sound-insulated, to be honest.

The 4870s are rather quiet; you do hear them a little when I play Crysis, yes. 
Nothing that ruins the game experience. Hope the 5870 is in the same league.


----------



## buggalugs (Sep 12, 2009)

Yay for ATI, boo for NVIDIA. But it's good NVIDIA is actually using new tech in their new cards this time. I shudder at the price tag though. If they charge like they do for old tech, their new cards could be $3000 apiece.


----------



## MoonPig (Sep 12, 2009)

Looks like i've got my money ready for a 5850/70 (depending on benchmarks).

Can't wait


----------



## pantherx12 (Sep 12, 2009)

MoonPig said:


> Looks like i've got my money ready for a 5850/70 (depending on benchmarks).
> 
> Can't wait




Aye me too 

Sacrificing having my own computer for a while and just going to whack it into my parents system XD


----------



## TheMailMan78 (Sep 12, 2009)

I wonder if a 5850 would be faster than two 4850's or about the same just with better scaling? What do you guys think?


----------



## Deleted member 24505 (Sep 12, 2009)

Faster i reckon.


----------



## TheMailMan78 (Sep 12, 2009)

tigger said:


> Faster i reckon.



I figured it would be faster. I'm just not sure it would be worth selling them, considering they run everything maxed out now. Honestly, I can't think of any real advantage to the new 5800 series if you already have one of the 4800 series, other than DX11.


----------



## pantherx12 (Sep 12, 2009)

Just for crysis to run at decent fps! ha ha


----------



## MoonPig (Sep 12, 2009)

And the Stalker series on Max


----------



## KainXS (Sep 12, 2009)

toyo said:


> That's an Nvidia GPU. AMD seems to be in the business of allowing de-activated stuff back online lately  Even Nvidia cards unlocked some pipelines back in the day... 6800 series or something, but you know that.
> Remains to see if the cores are laser-cut or just inactive... and I expect you guys to let us know first hand since it will be some time 'til I get my hands on a DirectX 11 card.



removed


----------



## pr0n Inspector (Sep 13, 2009)

the holes, they are just like those on nvidia coolers, except these have hilarious red rings.


----------



## air_ii (Sep 13, 2009)

TheMailMan78 said:


> I figured it would be faster. I'm just not sure it would be worth selling them, considering they run everything maxed out now. Honestly, I can't think of any real advantage to the new 5800 series if you already have one of the 4800 series, other than DX11.



Twice the perf/W?


----------



## Hayder_Master (Sep 13, 2009)

yeah at last ,  32 ROP'SSSSSSSSSSSSSSSSSSSSSSSS


----------



## wolf (Sep 13, 2009)

hayder.master said:


> yeah at last ,  32 ROP'SSSSSSSSSSSSSSSSSSSSSSSS



 I hear that, their first 32 ROP card, it should be a beast 

Majorly want to get my mits on a 2gb Eyefinity 5870


----------



## Scrizz (Sep 13, 2009)

ewww mits lol


----------



## lemode (Sep 13, 2009)

I kind of like the Eyefinity idea of the 3x1 portrait display scenario; now all I need to do is shell out $ for 2 more 27-inch monitors! This effectively makes the Matrox head2go obsolete for home PC use, imo.


----------



## erocker (Sep 13, 2009)

So if I have three monitors that each use DVI or HDMI, will I be able to hook all three up using the two DVI and HDMI outputs of the 5870? If so, that would be great, since you can pick up 3 1080p monitors for around $450.


----------



## grunt_408 (Sep 13, 2009)

That's the way it looks. I wanted to set up multi-screen gaming around a year ago but didn't want to pay the extra for a TripleHead2Go. Now I just need a 5870 and 2 more 24" screens and I'll be able to play GRID on 3 screens, yay.
http://www.amd.com/us/products/technologies/eyefinity/Pages/eyefinity.aspx


----------



## lemode (Sep 13, 2009)

erocker said:


> So if I have three monitors and each use DVI or HDMI, will I be able to hook all three up using the two DVI and HDMI outputs of the 5870? If so, that would be great since you can pick up 3 1080p monitors for around $450 bucks.



Looks like you will need a DisplayPort connection for the 3rd.


----------



## WarEagleAU (Sep 13, 2009)

That looks badass and huge. I'd love to get one, and I'm not even mad about the price, but it's the size of that baby that scares me, even with my HAF!


----------



## Frizz (Sep 13, 2009)

tbh I don't like the Eyefinity idea; playing 1 game on like 6-9 different monitors would be like playing a game with 30%+ of the image being the gaps between the monitors. I bet you'd spend as much on that many monitors as on a 40-inch 1080p LCD.


----------



## lemode (Sep 14, 2009)

randomflip said:


> tbh I don't like the Eyefinity idea; playing 1 game on like 6-9 different monitors would be like playing a game with 30%+ of the image being the gaps between the monitors. I bet you'd spend as much on that many monitors as on a 40-inch 1080p LCD.



I love the idea... Fraps doesn't show the monitor lines, just a really wide image at your screen resolution. Note: I hate the game shown; I'm just illustrating a point.


----------



## jessicafae (Sep 14, 2009)

Craigleberry said:


> http://www.amd.com/us/products/technologies/eyefinity/Pages/eyefinity.aspx



If we look at the fine print at the bottom of the official Eyefinity page, ATI says:
1) Driver version 8.66 (Catalyst 9.10) or above is required to support ATI Eyefinity technology and to enable a third display you require one panel with a DisplayPort connector.

So either this means they worked Eyefinity into the drivers a long time ago, or this is mainly a hardware tweak on the Evergreen cards. But it almost certainly means that Eyefinity will not be available on older cards (the 4xxx series, for example). The other interesting bit is the fact that it requires DisplayPort for more than two displays. I think that is why all the demos used those DisplayPort-to-DVI(?) adapters.





Image source: part of a LegitReviews article


----------



## Scrizz (Sep 14, 2009)

hmmm


----------



## lemode (Sep 14, 2009)

jessicafae said:


> So either this means they worked Eyefinity into the drivers a long time ago, or that this is mainly a hardware tweak on the Evergreen cards.  But it almost certainly means that Eyefinity will not be available on older cards (4xxx series for example).  The other interesting bit is that fact that it requires DisplayPort for more than two displays.  I think that is why all the demos used those displayport to DVI(?) adaptors
> http://www.legitreviews.com/images/reviews/1069/amd_eyefinity_displaylink.jpg
> image source part of LegitReview article



I wonder how many DisplayPort connectors, if any, they will include with the retail Eyefinity edition.


----------



## jessicafae (Sep 14, 2009)

jessicafae said:


> If we look at the fine print on the bottom of the official Eyefinity page, ATi says
> 1) Driver version 8.66 (Catalyst 9.10) or above is required to support ATI Eyefinity technology and to enable a third display you require one panel with a DisplayPort connector.





lemode said:


> i wonder how many if any DisplayPort connector they will include with the retail eyefinity ed.



I think the pictures we saw of the standard cards, like
http://www.techpowerup.com/?101730

show 2 DVI, 1 DisplayPort and 1 HDMI, which I think will be the standard configuration. So three-display Eyefinity works with 2 DVI + 1 DisplayPort, or DVI + HDMI + DisplayPort, but not 2 DVI + HDMI. At least that is what the fine print implies.  
The big question is whether a DisplayPort-to-DVI cable will allow 3 DVI monitors to do 3-way Eyefinity.
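
The rule in ATI's fine print, as read above, can be sketched like this (the function name and port labels are mine, not an AMD API; it only encodes "three simultaneous displays need at least one on DisplayPort"):

```python
# Hedged sketch of the Eyefinity connector rule discussed above:
# two displays work on any outputs, a third requires DisplayPort.

def eyefinity_combo_ok(ports: list[str]) -> bool:
    """True if this hookup satisfies the stated DisplayPort requirement."""
    return len(ports) <= 2 or "DisplayPort" in ports

print(eyefinity_combo_ok(["DVI", "DVI", "DisplayPort"]))   # allowed
print(eyefinity_combo_ok(["DVI", "HDMI", "DisplayPort"]))  # allowed
print(eyefinity_combo_ok(["DVI", "DVI", "HDMI"]))          # not allowed
```

Whether a DisplayPort-to-DVI adapter counts as "DisplayPort" here is exactly the open question in the post.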


----------



## erocker (Sep 14, 2009)

If the DisplayPort output is needed, I hope there is some sort of adapter, if that's even a possibility. I definitely want to try three monitors.


----------



## Bo_Fox (Sep 14, 2009)

This version above (presumably the 2 GB version) shows a full slot of exhaust, not a narrow one like in some other pictures.   

Well, if the final "reference" board ships with a narrow exhaust-slot opening, then I guess it's because the card really is going to run cool enough after all. I can vividly remember my X1900XTX exhausting a strong flow from its noisy stock cooler out of the left half of the exhaust opening, while I hardly felt any airflow from the right half. I replaced the cooler with a HIS IceQ Turbo cooler anyway, which did a quieter job of cooling that insanely hot, power-hungry chip.


----------



## jagd (Sep 14, 2009)

I read there is an adapter, but it costs $30 atm. Before buying one I would make sure the 2 DVI + HDMI configuration really doesn't work, or take the short road and ask W1zzard.


----------



## Scrizz (Sep 14, 2009)

How the F do you know? The cards aren't even out from under NDA.


----------



## Bo_Fox (Sep 14, 2009)

btarunr said:


> I'm hearing that not only is it whisper-quiet, but also surprisingly cool (surprising for its ~180W load power consumption). Wait till the 23rd.



I find this incredibly hard to believe, given exactly 2x the transistors/shaders/TMUs/ROPs of a 4890, at the same 850 MHz clock!!!  

Just one process-node shrink from 55 nm to 40 nm (which the INQ argues is actually 45 nm, though now they call it "40/45 nm"), and we're seeing 2x the specs at LOWER power consumption, at the same clocks. The die area is actually more than 2x that of a virtual 4890 shrunk to 40 nm.  

It must be an incredible design job by ATI. Remember how NVIDIA's shrink from 65 nm to 55 nm did almost nothing for power savings (maybe 15 W at most)?

My guess is that the 5870 runs at an incredibly low voltage for ATI to be announcing only a 188 W TDP (after they announced a 190 W TDP for the 4890). The 40 nm spin must have been so successful that clocking the cores at 850 MHz was a breeze at, say, 1.1 V or so. I can also recall ATI stating that they would ensure their future GPU was completely error-free under Tetedeiench's new OCCT GPU stress-testing tool, so I would expect ATI to be a bit more serious about the power consumption figures they announce (in that the 5870 would not be as power-hungry as a 4890).  

I would guess that means we can overclock the HELL outta this thing!!! Increasing it to, say, 1.3 or 1.4 V could yield 1.1 GHz or more for stable operation... hopefully!!


----------



## Steevo (Sep 14, 2009)

I just came


----------



## a_ump (Sep 14, 2009)

Bo_Fox said:


> This version above (presumably the 2GB version) shows a full slot of exhaust, not a narrow one like in some other pictures..



That's just the Eyefinity edition of the HD 5870 with a full rear exhaust. Though personally, like erocker said somewhere, I'd rather they'd gone with the standard 2 DVI and S-Video outputs, or just 1 DVI, HDMI and S-Video with an HDMI-to-DVI adapter.


----------



## erocker (Sep 14, 2009)

I just want to be able to hook up three monitors without DisplayPort inputs on them.


----------



## a_ump (Sep 14, 2009)

Eh, I'm fine with a single monitor personally; I'm just gonna hook up the TV I plan to get to my card, but that's it. I couldn't stand the borders on 2 or more screens when playing a game, esp. in an FPS.


----------



## phanbuey (Sep 14, 2009)

Steevo said:


> I just came



 ditto.


----------



## jessicafae (Sep 14, 2009)

erocker said:


> I just want to be able to hook up three monitors without DisplayPort inputs on them.



Eyefinity is no longer under NDA, so we should be able to get a clear answer on this now. W1zzard?
It does look like it requires at least one DisplayPort monitor to get 3 displays, though, going by ATI's fine print.



jessicafae said:


> If we look at the fine print on the bottom of the official Eyefinity page, ATi says
> 1) Driver version 8.66 (Catalyst 9.10) or above is required to support ATI Eyefinity technology and to enable a third display you require one panel with a DisplayPort connector.


----------



## Bo_Fox (Sep 14, 2009)

Steevo said:


> I just came



And you didn't thank me for it?!?


----------



## Hayder_Master (Sep 14, 2009)

wolf said:


> I hear that, their first 32 ROP card, it should be a beast
> 
> Majorly want to get my mits on a 2gb Eyefinity 5870



+1 for that. I'm thinking about it from now too, or I'll wait to see the biggest beast, the 5870 X2 with 4 GB; it will be awesome.


----------



## FordGT90Concept (Sep 14, 2009)

erocker said:


> I just want to be able to hook up three monitors without DisplayPort inputs on them.


A lot of workstation cards (Fire/Quadro) support 4 x DVI through wiggy headers and breakout cables. XD


As far as the 5870 goes, I was expecting the price to be lower. That means I'll probably be waiting until NVIDIA launches a competitive card to drive AMD's prices down. I'd still like to see Larrabee before upgrading, though.


----------



## wolf (Sep 14, 2009)

Steevo said:


> I just came





phanbuey said:


> ditto.



I've been on perma-spooj since I realised a few facts:

32 ROPs - 1600 SPs - 2 GB - 6 monitors

and I JIZZ IN MY PANTS


----------



## a_ump (Sep 14, 2009)

FordGT90Concept said:


> A lot of workstation cards (Fire/Quadro) support 4 x DVI through wiggy headers and breakout cables. XD
> 
> 
> As far as 5870 goes, I was expecting the price to be lower.  That means I'll probably be waiting until NVIDIA launches a competitive card to drive AMD's prices down.  I still would like to see Larabee before upgrading though.



No insult or anything; I'm just very surprised how much faith or w/e people have in Larrabee. I mean, it's taken years and years for NVIDIA and ATI to develop their GPUs to the performance they can manufacture; I really don't see Intel doing it on their first try. I see maybe HD 4850 performance, but that's another subject for a different thread.


----------



## HossHuge (Sep 14, 2009)

I'm assuming the Eyefinity thingy would work with three digital projectors as well? 'Cause you could have 3 60"+ screens. I think I would get dizzy.


----------



## pantherx12 (Sep 14, 2009)

Yeah it does; I've seen a video of Left 4 Dead running on 3 projectors. 

It was about 10 feet by 12 feet of zombie goodness X


----------



## HossHuge (Sep 14, 2009)

pantherx12 said:


> Yeah it does, I've seen a video of Left 4 Dead running on 3 projections
> 
> was about 10 foot by 12 foot of zombie goodness X



I would have replied earlier but I had to go and change my shorts....

I see this in my future...


----------



## pantherx12 (Sep 14, 2009)

I'll try and find the video for you will edit this post with a link!


I can't link directly to the video, but here you go:

http://www.engadget.com/2009/09/11/video-ati-radeon-eyefinity-eyes-on-featuring-left-4-dead-on-a/


----------



## TheMailMan78 (Sep 14, 2009)

a_ump said:


> No insult or anything; I'm just very surprised how much faith or w/e people have in Larrabee. I mean, it's taken years and years for NVIDIA and ATI to develop their GPUs to the performance they can manufacture; I really don't see Intel doing it on their first try. I see maybe HD 4850 performance, but that's another subject for a different thread.



Three words.

Intel's massive budget.


----------



## jessicafae (Sep 14, 2009)

a_ump said:


> No insult or anything; I'm just very surprised how much faith or w/e people have in Larrabee. I mean, it's taken years and years for NVIDIA and ATI to develop their GPUs to the performance they can manufacture; I really don't see Intel doing it on their first try. I see maybe HD 4850 performance, but that's another subject for a different thread.





TheMailMan78 said:


> Three words.
> 
> Intel's massive budget.



One word... Itanium.

Some ideas just don't work, no matter how much money you throw at them.
"The Itanium approach … was supposed to be so terrific—until it turned out that the wished-for compilers were basically impossible to write."
—computer scientist Donald Knuth

Let's hope Larrabee turns out to be brilliant and changes the world for the better, but it is not a guaranteed win.


----------

