Friday, September 11th 2009

AMD Cypress ''Radeon HD 5870'' Stripped

Here are the first pictures of the obverse side of the Cypress PCB, and the first pictures of the centre of attraction: the AMD Cypress GPU. CzechGamer disassembled two Cypress "Radeon HD 5870" cards for a quick blurrycam photo-session. The PCB shot reveals quite a bit about Cypress, particularly about the GPU.

To begin with, the GPU marks a huge leap in transistor count for AMD, and a bold piece of engineering on the 40 nm manufacturing process, given the kind of problems foundry partners had with it initially. They seem to have resolved most of those issues, as AMD's AIB partners are coming up with new products based on the 40 nm RV740 GPU on a weekly basis. The package holds a "diamond-shaped" die that is angled in a way similar to RV740, RV730, or more historically, the R600. The die measures 338 mm² in area, which for 40 nm translates to "huge", and that size is vindicated by the transistor count of ~2.1 billion. In contrast, AMD's older flagship GPU, the RV790, holds 959 million, and NVIDIA's GT200 holds 1.4 billion.
The PCB has three distinct areas: connectivity, processing, and VRM. Fuelling the GPU is a high-grade 4-phase digital PWM power circuit, while the PCB has a placeholder for an additional vGPU phase. The 8 (or 16 on the 2 GB model) memory chips are powered by a 2-phase circuit. Power is drawn from two 6-pin PCI-Express power connectors, but there seems to be a placeholder for two more pins, i.e., to replace one of those 6-pin connectors with an 8-pin one.

Bordering the GPU on two sides are the 8 GDDR5 memory chips, which AMD says are a generation ahead of present GDDR5, and support reference frequencies as high as 1300 MHz (2600 MHz DDR, 5.20 GHz effective). On the 2 GB variant, 8 more chips sit on the other side of the PCB; this is perhaps what the backplate is intended to cool. The connectivity portion holds the two CrossFire connectors, DisplayPort, HDMI, and a cluster of two DVI-D connectors. There has been a raging debate about how adversely the small air vent would affect the card, but AMD is promising some energy-efficiency breakthroughs, and given how roomy the card is, the vent seems sufficient.
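If those memory numbers hold, the bandwidth arithmetic is straightforward. A rough sketch, assuming each of the 8 chips sits on its own 32-bit channel (for a 256-bit bus) — the per-chip width is our assumption, not something the pictures confirm:

```python
# Rough peak-bandwidth estimate for the rumoured reference memory spec.
# Assumption: 8 GDDR5 chips x 32-bit channels = 256-bit bus.
chips = 8
bits_per_chip = 32
bus_bits = chips * bits_per_chip       # 256-bit
data_rate_gbps = 5.2                   # 1300 MHz base, quad data rate (GDDR5)

bandwidth_gb_s = bus_bits / 8 * data_rate_gbps
print(f"{bus_bits}-bit @ {data_rate_gbps} Gbps -> {bandwidth_gb_s:.1f} GB/s")
```

That works out to roughly 166 GB/s, a healthy step up from RV770-class cards.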

Finally, information from ArabHardware.net suggests pricing for three of the first SKUs based on Cypress: HD 5870 2 GB, HD 5870 1 GB, and HD 5850 1 GB. All three use the same GPU and memory standard (GDDR5), but differ in clock speeds and GPU configurations. While the HD 5870 sports 1600 stream processors, 80 TMUs, and 32 ROPs, the HD 5850 has 1440 stream processors, 72 TMUs, and 32 ROPs. Although 32 ROPs puzzles us for a 256-bit wide memory interface, we suspect low-level design changes that make "32 ROPs" more of an effective count than an absolute count. While the HD 5870 features an over-800 MHz core clock and 5.20 GHz memory, its little sibling has an over-700 MHz core clock and 4.40 GHz memory. Expected price points are US $449 for the Radeon HD 5870 2 GB, $399 for the HD 5870 1 GB, and $299 for the HD 5850. AMD is expected to announce all three models on the 23rd. You'll be able to find them at your favourite computer store a little later; availability is a certainty by the time you're ready to buy Windows 7. AMD's newest products will be more than ready to squat under X-mas trees all over.
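For a sense of raw shader throughput, those configurations can be turned into peak single-precision numbers. A back-of-the-envelope sketch — the exact clocks here (850 and 750 MHz) are our assumptions, since the article only says "over 800 MHz" and "over 700 MHz":

```python
# Peak single-precision throughput, assuming 2 FLOPs (one multiply-add)
# per stream processor per clock, as on AMD's recent VLIW designs.
# Clocks are assumptions: the rumours only say "over 800/700 MHz".
def peak_gflops(stream_processors, core_mhz, flops_per_clock=2):
    return stream_processors * flops_per_clock * core_mhz / 1000.0

print(peak_gflops(1600, 850))  # HD 5870 at an assumed 850 MHz -> 2720.0
print(peak_gflops(1440, 750))  # HD 5850 at an assumed 750 MHz -> 2160.0
```

Even the cut-down HD 5850 would comfortably clear 2 TFLOPS on paper if these figures pan out.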
Sources: Czech Gamer, Arab Hardware

163 Comments on AMD Cypress ''Radeon HD 5870'' Stripped

#26
Valdez
newtekie1I have a feeling this is still going to be one loud-ass card, the fan is going to have to work at full blast to push all the hot air out of that little opening... And that die size is huge for 40nm!
Well... g300 will be much larger... (it's just a guess).
#27
newtekie1
Semi-Retired Folder
pantherx12I doubt it, air pressure will build up in the card and it will come out just fine with a normal fan.
Just like the 9800GX2 and GTX295...Even nVidia couldn't keep those things quiet with tiny openings like this card's...ATi doesn't stand a chance.
mdm-adphYap yap yap -- how'd I know you were going to say something negative? :laugh:
And how did I know you would troll my post, and not add anything even remotely relevant to the discussion?

And if you noticed, I said something positive also...but you only tend to notice when people say negative things about ATi...but you are an ATi fanboy, so it is expected I guess...:shadedshu
ValdezWell... g300 will be much larger... (it's just a guess).
I don't care about G300, this isn't a topic about G300.

The large die size is a good thing. It means ATi packed a lot of shit in there, hopefully that means awesome performance. And it means 40nm has matured to the point that it can handle large die sizes like this.
#28
KainXS
. . . . .

anyway

these should be close

HD5870
GPU: RV870XT
Core Clock: 850 MHz
Shader Clock: 850 MHz
Memory Clock: 1300 MHz
Pixel Fill Rate: 27200 MPixels/sec
Texture Fill Rate: 68000 MTexels/sec

HD5850
GPU: RV870
Core Clock: 750 MHz
Shader Clock: 750 MHz
Memory Clock: 1100 MHz
Pixel Fill Rate: 24000 MPixels/sec
Texture Fill Rate: 54000 MTexels/sec


HD4870X2
GPU: RV770X2
Core Clock: 750 MHz
Shader Clock: 750 MHz
Memory Clock: 1800 MHz
Pixel Fill Rate: 24000 MPixels/sec
Texture Fill Rate: 60000 MTexels/sec


HD4890
GPU: RV770XT
Core Clock: 850 MHz
Shader Clock: 850 MHz
Memory Clock: 975 MHz
Pixel Fill Rate: 13600 MPixels/sec
Texture Fill Rate: 34000 MTexels/sec
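Those fill-rate figures are just unit counts multiplied by core clock. A quick sanity check, using the ROP/TMU counts from the news post (32 ROPs, 80/72 TMUs) and the guessed clocks above:

```python
# Sanity check of the quoted fill rates: pixel fill = ROPs x core clock,
# texture fill = TMUs x core clock. Unit counts come from the news post;
# the 850/750 MHz clocks are the guesses above, not confirmed specs.
def fill_rates(rops, tmus, core_mhz):
    """Return (pixel fill in MPixels/sec, texture fill in MTexels/sec)."""
    return rops * core_mhz, tmus * core_mhz

print(fill_rates(32, 80, 850))  # HD 5870 -> (27200, 68000)
print(fill_rates(32, 72, 750))  # HD 5850 -> (24000, 54000)
```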
#29
pantherx12
newtekie1Just like the 9800GX2 and GTX295...Even nVidia couldn't keep those things quiet with tiny opennings like this card's...ATi doesn't stand a chance.
Erm... both are dual-GPU cards, it's hardly comparable to a single-GPU card that also runs more efficiently. The Nvidia cards you mentioned would require high fan speeds to cool two GPUs (already twice as much heat as a 5870), let alone them being less efficient in general.
#31
SteelSix
phanbuey5870x2 4gb is gonna be my next card. Just have to wait for the NV cards to come out so that prices can drop to a sane level.
Hell yes, though I'll be grabbing one on launch day. Thanks for the news TPU!! :roll:
#32
lemode
Love the look of the Radeon HD 5870!

I typically build a new rig every summer/fall for myself, alternating between Intel and AMD systems. I was actually excited to invest in an i7 860/P55/single 295 this time around; now I am considering waiting to see if an HD 5870X2 comes out! If it does, I will just wait for the Leo platform to be released in May. I haven't bought a Radeon card since the X800, so this is kind of exciting!
#33
newtekie1
Semi-Retired Folder
pantherx12Erm... both are dual-GPU cards, it's hardly comparable to a single-GPU card that also runs more efficiently. The Nvidia cards you mentioned would require high fan speeds to cool two GPUs (already twice as much heat as a 5870), let alone them being less efficient in general.
With this die size, even with the 40nm transition, I'm betting this card puts out just as much heat as two G92s.
#34
pantherx12
A smaller fab process coupled with a bigger die means greater surface area, which means greater heat dissipation.

Should be fine and dandy.
#35
The Witcher
If I'm gonna buy this graphics card, I'm not gonna buy it because it supports DX11, I'll just buy it because it's stronger than the current generation.

We all know that the 8000 series were the first graphics cards to support DX10, and they didn't perform well in those games to be honest, so I don't think these new DX11 cards will do better than their predecessors, not to mention we will probably see the first DX11 game only after 2 or 3 years.
#36
dir_d
The WitcherIf I'm gonna buy this graphics card, I'm not gonna buy it because it supports DX11, I'll just buy it because it's stronger than the current generation.

We all know that the 8000 series were the first graphics cards to support DX10, and they didn't perform well in those games to be honest, so I don't think these new DX11 cards will do better than their predecessors, not to mention we will probably see the first DX11 game only after 2 or 3 years.
Have you not read the news on this site for the past week, or not heard about DiRT 2 being the first named DX11 title, due out not long from now?
#37
air_ii
newtekie1With this die size, even with the 40nm transition, I'm betting this card puts out just as much heat as two G92s.
From what I've heard, it's 27W (!) while idling and 185-ish at load. Btw, I don't know if you've seen the cooler pictures, but the card seems to have exhausts on both sides. Some think that the front one is intake, but given the construction of the fan, it has to be exhaust.

I'm still gonna WC it and see what NV has to say later on ;).
#38
air_ii
dir_dHave you not read the news on this site for the past week, or not heard about DiRT 2 being the first named DX11 title, due out not long from now?
Dirt 2 on 3 displays
The WitcherIf I'm gonna buy this graphics card, I'm not gonna buy it because it supports DX11, I'll just buy it because it's stronger than the current generation.

We all know that the 8000 series were the first graphics cards to support DX10, and they didn't perform well in those games to be honest, so I don't think these new DX11 cards will do better than their predecessors, not to mention we will probably see the first DX11 game only after 2 or 3 years.
The gameplay (link above) looks quite smooth on 3 panels!
#39
HossHuge
While I think this card is going to be huge, I don't expect the type of increase that we got from a 3870 to a 4870.
#40
TheMailMan78
Big Member
Those pictures look like they were taken from some kinda sick twisted GPU smut film. I half way expect some kinda tentacle rape to be posted from the same source.
#41
btarunr
Editor & Senior Moderator
If you're into uh..tentacles, check out the latest news post.
#42
newtekie1
Semi-Retired Folder
dir_dHave you not read the news on this site for the past week, or not heard about DiRT 2 being the first named DX11 title, due out not long from now?
DiRT 2 is DX11 about as much as BioShock was DX10...

Most of the early "DX11" games will be DX10 games with a few generally unnoticeable additions. What DX version the game uses, or even what version the card supports, isn't really important to me. Good gameplay is what is important, and the horsepower of the card is what I'm worried about.
air_iiFrom what I've heard, it's 27W (!) while idling and 185-ish at load. Btw, I don't know if you've seen the cooler pictures, but the card seems to have exhausts on both sides. Some think that the front one is intake, but given the construction of the fan, it has to be exhaust.

I'm still gonna WC it and see what NV has to say later on ;).
With power-saving features, I would believe 27W at idle, but I don't care about idle; even the current ATi cards are quiet at idle. And at 185W at load, you are looking at more than a GTX295.:eek:

And I have seen the cooler design, and the two openings at the front of the card seem to be decoration only. There is no way they are intakes, as they face the exhaust side of the fan, and it looks like they are actually blocked off from the fan entirely. If they aren't, then the cool air being sucked in by the fan is just being exhausted right out the holes, not cooling anything, making them useless. Either way, they don't help the situation with the extremely small exhaust at the back of the card; this will limit airflow, causing higher temps, and force the fan to work harder and louder.

We will have to see when the card is actually released, but that is just my opinion after seeing the pictures so far. And yes, I'm well aware that this might not be the final design, and they might reduce or re-arrange the connectors on the card.
#43
Unregistered
WTF, it looks like a GTX 280, but with an ATI chip in it (damn, ATI is going insane, they doubled all the specs :rockout:)

I want to see benches right now :cry:, please
#44
air_ii
newtekie1With power-saving features, I would believe 27W at idle, but I don't care about idle; even the current ATi cards are quiet at idle. And at 185W at load, you are looking at more than a GTX295.:eek:
I think it's closer to 285 for GTX295...
#45
erocker
*
newtekie1DiRT 2 is DX11 about as much as BioShock was DX10...

Most of the early "DX11" games will be DX10 games with a few generally unnoticeable additions.
Do you have any links or solid information about this?

Also, those with the card are claiming it's 50% faster than GTX285.
#46
The Witcher
dir_dHave you not read the news on this site for the past week, or not heard about DiRT 2 being the first named DX11 title, due out not long from now?
OK, I admit you beat me on that...

It's been a while since I checked the website (I'm shitting you not). I don't see much graphical difference between the DX10 games and DiRT 2, so in my opinion it's not DX11 unless it has a big graphical improvement, not minor changes you won't be able to notice.

Back on topic, when do you think we will see the first accurate benchmarks of these cards?
#47
air_ii
If the guys over at CHIPHELL are to be believed, the 5870 scores min 30, avg 43, and max 54 fps in Crysis at 1900x1200, 4AA+16AF, DX10 Very High. That's on a PhII 955BE.
#48
jagd
zOaibThese cards will support DX 11, correct?
Yes, of course they are DX11.
#49
newtekie1
Semi-Retired Folder
air_iiI think it's closer to 285 for GTX295...
If it is 185W under load, then it is closer to a GTX295, which according to W1z's reviews is at about 181-182W on average under load.
erockerDo you have any links or solid information about this?

Also, those with the card are claiming it's 50% faster than GTX285.
Nope. Until we see what DiRT 2 looks like on a DX11 card vs. a DX10 card, it is all speculation, just like 90% of this thread. But since the game still runs on DX10 and DX9, I'm guessing that DX11 isn't going to add a whole lot to the game...

And isn't a GTX295 about 50% faster than a GTX285?
#50
erocker
*
newtekie1If it is 185W under load, then it is closer to a GTX295, which according to W1z's reviews is at about 181-182W on average under load.



Nope. Until we see what DiRT 2 looks like on a DX11 card vs. a DX10 card, it is all speculation, just like 90% of this thread. But since the game still runs on DX10 and DX9, I'm guessing that DX11 isn't going to add a whole lot to the game...
I believe each DirectX version is based off the previous one. Most likely there is nothing of DX10 in DX11; it's more likely DX11 features on top of DX9. Looking at the features of DX11, regardless of whether it's based on DX10 or DX9, they will make a significant difference in visual quality. Just look at the wireframes between DX9 and DX11. The DX11 features are significant in a way DX10's features were not. Either way, it's the horsepower increase I'm more concerned with. DX11 will come either way.