Friday, September 18th 2009

Sapphire Radeon HD 5870 and HD 5850 Smile for the Camera

Here are the first pictures of Sapphire's Radeon HD 5800 series offerings: the Radeon HD 5870 1GB and the Radeon HD 5850. The cards sport the usual sticker design of a CGI girl against a reddish background. With the cosmetic "red streak" cleaving the cooler shroud down the center of these cards, the sticker is split the same way. This is also perhaps the first public picture of the Radeon HD 5850, and our size projections were right: while the Radeon HD 5870 maintains a long PCB, the HD 5850 is about as long as a Radeon HD 4870 (reference design). Both accelerators stick to the reference AMD design.

* Images removed at request of Sapphire * Google for alternate source
Source: Hermitage Akihabara

148 Comments on Sapphire Radeon HD 5870 and HD 5850 Smile for the Camera

#51
btarunr
Editor & Senior Moderator
newtekie1I don't think it is any more special than previous GPUs that it needs 100 GB/s+ memory bandwidth to be functional. Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.
Nobody is talking about functionality, it's about being competitive. Radeon HD 4890 needed that extra bit of memory bandwidth, so if HD 5850 is touted to be twice as fast as a Radeon HD 4850, it definitely needs that bandwidth. Once again, it does not matter if HD 5850 is produced out of "defective" dies, because the resulting product is not defective. So no point in this "40 nm is faulty" rhetoric. Disabling components to carve out SKUs is not an indication of foundry-related problems.
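For reference, the bandwidth figures being argued over follow directly from memory data rate and bus width. A minimal sketch in Python, using the commonly quoted reference-card memory specs (treat the exact clocks as approximate):

```python
# Peak memory bandwidth = effective data rate per pin (Gbps) * bus width (bits) / 8.
# The data rates below are the commonly quoted reference-card specs and are approximate.
def bandwidth_gb_s(effective_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return effective_gbps_per_pin * bus_width_bits / 8

cards = {
    "HD 4850 (GDDR3, ~2.0 Gbps/pin, 256-bit)": bandwidth_gb_s(1.986, 256),  # ~63.6 GB/s
    "HD 4870 (GDDR5, 3.6 Gbps/pin, 256-bit)": bandwidth_gb_s(3.6, 256),     # 115.2 GB/s
    "HD 4890 (GDDR5, 3.9 Gbps/pin, 256-bit)": bandwidth_gb_s(3.9, 256),     # 124.8 GB/s
}

for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")
```

A part pitched as roughly twice an HD 4850 is therefore starting from a ~64 GB/s GDDR3 baseline, which is why the GDDR5 question keeps coming up.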
#52
kylzer
^_^ I'm sure you've said that like 3 times now lol

Damn, only 4-5 days to go now and I can buy my new card and get rid of this 8800GS :P
#53
btarunr
Editor & Senior Moderator
kylzer^_^ I'm sure you've said that like 3 times now lol
Yeah, because the other person repeated the same thing too.
#54
Imsochobo
tiggerNot that I would buy Sapphire ever again, that ahole GG is such a tit it has put me off Sapphire permanently.
There are 11 stickers from Sapphire on my keyboard, collected over what will be two years come the 1st of January.
None of those cards were defective or had issues.

Not that cards from other brands haven't had issues; I had one RMA for a production fault from p0werc0lor on a limited X800 edition.
#55
Zubasa
I just can't wait to get rid of the sucker I have now; it doesn't have the bandwidth or the memory capacity to let me game at full HD...
Even overclocking doesn't help much; the GDDR3 just won't clock much higher without crashing. :banghead:
There is simply not enough bandwidth on the 4850, as performance increases much more from OCing the memory than the GPU.

On top of this, I just can't stop myself from thinking that ATi stuck the worst-binned chips possible on the first batch of 4850s.
There is no other GPU I have had that clocks this badly.
No way am I buying the first batches of 5850s unless they are dirt cheap and OC like nobody's business.
I paid HK$1,700 for this sucker, that's about US$213, and now it is barely stable at its factory OC.
#57
Easo
Eh lol, I need money right now :D
#58
1Kurgan1
The Knife in your Back
Awesome, can't wait till they hit the market and I finally have the money for one.
#59
lemonadesoda
I'm getting bored of these manga girl fighters on the GPUs. It was OK way back when, but now the imagery is old, stale and tired.

If they want to stick with this manga stuff, then they should have more chicks on there... the hotter the card (more powerful), the more chicks, and for OC cards, they should wear less and less. Obviously the ultimate extreme versions of the card would then be XXX rated... available only to over 18, and consequently making them even more desirable.

Of course, the über fastest, wild, overclockerized super dooper edition would have a dude "benchmarking" a bevvy of near naked pool chick roller girls.

Naturally, the default windows jingle would be replaced by the drivers with some snazzy jazzed up pimp anthem.
#60
erocker
*
lemonadesodaIf they want to stick with this manga stuff, then they should have more chicks on there... the hotter the card (more powerful), the more chicks, and for OC cards, they should wear less and less. Obviously the ultimate extreme versions of the card would then be XXX rated... only making them even more desirable.

Of course, the über fastest, wild, overclockerized super dooper edition would have a dude benchmarking a bevvy of near naked pool chick roller girls.

Naturally, the default windows jingle would be replaced by the drivers with some snazzy jazzed up pimp anthem.
Your logic is absolutely flawless. :toast:
#61
MoonPig
I don't get the use of lasses on these... If they were REAL lasses, then OK... but silly CGI ones just look daft.

And I second the use of less and less clothes... that alone would make me want the XXX version. Pity a waterblock doesn't have the same image :(
#62
newtekie1
Semi-Retired Folder
btarunrNobody is talking about functionality, it's about being competitive. Radeon HD 4890 needed that extra bit of memory bandwidth, so if HD 5850 is touted to be twice as fast as a Radeon HD 4850, it definitely needs that bandwidth. Once again, it does not matter if HD 5850 is produced out of "defective" dies, because the resulting product is not defective. So no point in this "40 nm is faulty" rhetoric. Disabling components to carve out SKUs is not an indication of foundry-related problems.
No it doesn't, there have been plenty of cards that were twice as fast that didn't need extra memory bandwidth. I mean, the HD4850 was twice as fast as the HD3850, and both used GDDR3. I'm not saying the bandwidth would have made no difference; I'm just saying I believe it would have crippled it about as much as the cut-down die does.

And your take on the defective dies, I agree with. It doesn't matter, because the end product is not defective. That is not my point, and has nothing to do with what I am saying. I'm wondering if part of the decision to use the cut-down dies was because of a high defect rate, indicating that there is still an issue with 40nm. I believe it does indicate that. That by itself is not enough to come to that conclusion, but when you couple it with the fact that we already know 40nm has problems, it really isn't a hard conclusion to draw. When nVidia did it with G80, G92, and GT200, we didn't know that 65nm and 55nm were having issues, so we couldn't assume that is why they did it. On top of that, nVidia has used this practice for generations, while ATi has not. So ATi suddenly using it, coupled with the already known issues with 40nm, is what makes me wonder. This also isn't a negative thing about ATi, it isn't a negative at all; I'm just curious. I don't want to see these things released at a promised price, only for low supplies to cause prices to be jacked up, or even worse, a repeat of the HD4770 with essentially a paper launch.
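To make the yield angle concrete, here is a minimal back-of-the-envelope sketch, assuming a simple Poisson defect model; the die area, defect densities, and salvage fraction are made-up illustration values, not figures published by TSMC or AMD:

```python
import math

def poisson_yield(die_area_cm2: float, defect_density: float) -> float:
    """Fraction of dies with zero random defects under a simple Poisson model."""
    return math.exp(-die_area_cm2 * defect_density)

DIE_AREA_CM2 = 3.3      # hypothetical ~330 mm^2 die, illustration only
SALVAGE_FRACTION = 0.6  # assumed share of defective dies rescued by disabling a shader block

for d0 in (0.2, 0.5, 0.9):  # defects per cm^2 -- made-up "mature" vs "immature" values
    full = poisson_yield(DIE_AREA_CM2, d0)
    harvested = (1 - full) * SALVAGE_FRACTION
    print(f"D0 = {d0}: full-spec dies {full:.0%}, extra harvested dies ~{harvested:.0%}")
```

The worse the defect density, the larger the share of sellable silicon that only exists if a harvested SKU ships alongside the full part, which is the inference being drawn here.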
#63
TheMailMan78
Big Member
lemonadesodaI'm getting bored of these manga girl fighters on the GPUs. It was OK way back when, but now the imagery is old, stale and tired.

If they want to stick with this manga stuff, then they should have more chicks on there... the hotter the card (more powerful), the more chicks, and for OC cards, they should wear less and less. Obviously the ultimate extreme versions of the card would then be XXX rated... available only to over 18, and consequently making them even more desirable.

Of course, the über fastest, wild, overclockerized super dooper edition would have a dude "benchmarking" a bevvy of near naked pool chick roller girls.

Naturally, the default windows jingle would be replaced by the drivers with some snazzy jazzed up pimp anthem.
Hawking has nothing on you man. You're the F#$KING master of all that is awesome!
#64
FreedomEclipse
~Technological Technocrat~
5870 for me :rockout::rockout::rockout:
#65
1Kurgan1
The Knife in your Back
newtekie1And your take on the defective dies, I agree with. It doesn't matter, because the end product is not defective. That is not my point, and has nothing to do with what I am saying. I'm wondering if part of the decision to use the cut-down dies was because of a high defect rate, indicating that there is still an issue with 40nm. I believe it does indicate that. That by itself is not enough to come to that conclusion, but when you couple it with the fact that we already know 40nm has problems, it really isn't a hard conclusion to draw. When nVidia did it with G80, G92, and GT200, we didn't know that 65nm and 55nm were having issues, so we couldn't assume that is why they did it. On top of that, nVidia has used this practice for generations, while ATi has not. So ATi suddenly using it, coupled with the already known issues with 40nm, is what makes me wonder. This also isn't a negative thing about ATi, it isn't a negative at all; I'm just curious. I don't want to see these things released at a promised price, only for low supplies to cause prices to be jacked up, or even worse, a repeat of the HD4770 with essentially a paper launch.
ATI hasn't? I can remember back to the differences between the X1950 Pro and the X1950 XT. Either way, there are always going to be "defective" products with any manufacturing process. They cut down the 3870 to a 3850; they cut down the 4870 to a 4850, then a 4830. So who's to assume the 5870-to-5850 scenario is any different? And even if it is, what does it matter, as the 5850 will most likely move more product at launch anyway.
#66
A Cheese Danish
That is one sweet card :D
Can't wait to hold it in my hands and have it housed in my rig :D
#67
newtekie1
Semi-Retired Folder
1Kurgan1ATI hasn't? I can remember back to the differences between the X1950 Pro and the X1950 XT. Either way, there are always going to be "defective" products with any manufacturing process. They cut down the 3870 to a 3850; they cut down the 4870 to a 4850, then a 4830. So who's to assume the 5870-to-5850 scenario is any different? And even if it is, what does it matter, as the 5850 will most likely move more product at launch anyway.
*sigh* You obviously aren't getting it.

1.) The x1950Pro used a completely different core than the x1950XT. It used RV570; the x1950XT used R580. The x1900GT used a cut-down core from the x1900XT; however, the x1900GT didn't come out until near the end of the x1900 product cycle.
2.) I know there will always be defective cores... not products... cores. I'm not saying anything about the final products.
3.) They did not cut down the HD3870 core to make an HD3850. They did not cut down the HD4870 core to make an HD4850. They did cut down the HD4870 core to make the HD4830, however.
4.) The difference is that ATi traditionally has not launched a product line with defective, cut-down cores. They add the SKUs using the defective, cut-down cores later down the road.
#68
TheMailMan78
Big Member
I feel bad for newtekie1. He's not bashing ATI. He's wondering why they changed strategy so much and how that change will affect supply and price. ATI could make a 2nm chip with a 12,000 MHz GPU and sell it for $3.00, but if the yields are bad, demand will be higher than supply, thus jacking up its price.

Sure, the card retails for 3 bucks, but if ATI can only produce a dozen of them, how much do you think they will REALLY cost? Basically newtekie1 is talking supply and demand. Nothing more.

@newtekie1: You have to keep things simple, man. You talk too damn much about something simple and it makes it complicated.
#69
1Kurgan1
The Knife in your Back
A Cheese DanishCan't wait to hold it in my hands and have it housed in my rig :D
You're talking about a video card here, right? :laugh:
#70
TheMailMan78
Big Member
1Kurgan1You're talking about a video card here, right? :laugh:
If not his dog better watch out!
#71
1Kurgan1
The Knife in your Back
newtekie13.) They did not cut down the HD3870 core to make an HD3850. They did not cut down the HD4870 core to make an HD4850. They did cut down the HD4870 core to make the HD4830, however.
4.) The difference is that ATi traditionally has not launched a product line with defective, cut-down cores. They add the SKUs using the defective, cut-down cores later down the road.
I'm getting it, but what I'm saying is you're reading far too much into this; it doesn't matter either way. So why bring it up in every single post, when the cards will be released and be awesome?

Also, the 4850 is a cut-down 4870: it's running GDDR3 instead, the GPU isn't clocked as high, and it has a weaker power setup, which makes me think the 4870s got the higher-binned chips; if a chip didn't pass there, it moved down to a 4850, and if it didn't pass there either, it moved down to a 4830. Or, if a chip failed as a 4870 not on clocks but on SPUs, it skipped the 4850 and went right to the 4830 (see the sketch below).

And on that note, look how well 4830s OC for the most part; I don't even know if they are really binned down, since there was good demand for them. So who's to say the 5850 is really a binned-down product? It might be off the bat, but if it sells well, I'm betting higher-binned 5870 GPUs will get cut down and used.

Either way, you're looking way too much into this. Defective products or not, ATI is obviously going to make a profit on this and is happy enough with the turnover rate to bring it to market now, so who cares.
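The cascade described in the post above (pass the top bin's clock and shader tests or fall through to a lower SKU) can be written down as a tiny decision function. A toy sketch, with invented pass/fail flags rather than anything from AMD's actual test flow:

```python
# Toy binning cascade: each die is tested for target clocks and for fully working
# shader clusters, then dropped into the highest SKU it qualifies for.
# The flags and thresholds are invented for illustration only.
def bin_die(passes_high_clock: bool, all_shaders_good: bool) -> str:
    if all_shaders_good and passes_high_clock:
        return "HD 4870"  # best bin: full shader count, meets the clock target
    if all_shaders_good:
        return "HD 4850"  # full shader count, misses the clock target
    return "HD 4830"      # a shader cluster disabled, regardless of clocks

for clocks_ok, shaders_ok in [(True, True), (False, True), (True, False), (False, False)]:
    print(f"clocks_ok={clocks_ok}, shaders_ok={shaders_ok} -> {bin_die(clocks_ok, shaders_ok)}")
```

Whether the 5850 is fed by this kind of cascade at launch, or simply by deliberately fused-off good dies, is exactly the open question in this thread.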
#72
tkpenalty
Newtekie's cause for concern is pretty valid, imho. Considering how Nvidia lately has only been putting out PR spin instead of actually talking about their GT300, it's likely that what Charlie said about the 3% yield rates of the GT300 is true - and that means a problem for AMD as well, who probably share the same process that Nvidia is using.

Even though what comes out of places like Charlie's is bullshit most of the time, how did he come up with a figure of 3% then? And why is Nvidia just doing PR spin with investor advisories lately?

Btw, with the 4xxx series, I believe it's not binning but just automated selection, where they cast off the dies near the edges of the wafer for the lower-end derivatives, which statistically speaking suck.
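On the edge-die point, the usual first-order gross-die-per-wafer estimate gives a feel for how many candidate sites are lost or compromised at the wafer rim. A sketch, assuming a 300 mm wafer and a placeholder ~330 mm² die (not a confirmed figure for this GPU):

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> float:
    """First-order estimate: wafer area over die area, minus a correction for partial edge dies."""
    d, a = wafer_diameter_mm, die_area_mm2
    return math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a)

WAFER_MM = 300      # standard 300 mm wafer
DIE_AREA_MM2 = 330  # placeholder die size, for illustration only

by_area_only = math.pi * (WAFER_MM / 2) ** 2 / DIE_AREA_MM2
whole_dies = gross_dies_per_wafer(WAFER_MM, DIE_AREA_MM2)
print(f"sites by area alone: {by_area_only:.0f}, whole dies after edge loss: {whole_dies:.0f}")
```

This only counts the partial sites lost outright at the rim; the complete-but-near-edge dies tkpenalty refers to are a further ring inside that.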
#73
btarunr
Editor & Senior Moderator
newtekie1No it doesn't, there have been plenty of cards that were twice as fast that didn't need extra memory bandwidth. I mean, the HD4850 was twice as fast as the HD3850, and both used GDDR3. I'm not saying the bandwidth would have made no difference; I'm just saying I believe it would have crippled it about as much as the cut-down die does.
HD 4890 comes with 120+ GB/s of memory bandwidth; should something faster than that also have higher bandwidth? And AMD was able to harvest RV770/RV790 ASICs using cut-down configurations and still price them low. So I don't see why that part isn't looking likely.
#74
TheMailMan78
Big Member
1Kurgan1I'm getting it, but what I'm saying is you're reading far too much into this; it doesn't matter either way. So why bring it up in every single post, when the cards will be released and be awesome?

Also, the 4850 is a cut-down 4870: it's running GDDR3 instead, the GPU isn't clocked as high, and it has a weaker power setup, which makes me think the 4870s got the higher-binned chips; if a chip didn't pass there, it moved down to a 4850, and if it didn't pass there either, it moved down to a 4830. Or, if a chip failed as a 4870 not on clocks but on SPUs, it skipped the 4850 and went right to the 4830.

And on that note, look how well 4830s OC for the most part; I don't even know if they are really binned down, since there was good demand for them. So who's to say the 5850 is really a binned-down product? It might be off the bat, but if it sells well, I'm betting higher-binned 5870 GPUs will get cut down and used.

Either way, you're looking way too much into this. Defective products or not, ATI is obviously going to make a profit on this and is happy enough with the turnover rate to bring it to market now, so who cares.
He didn't say it would have problems selling. From what I read, he's been pretty positive about it by newtekie1 standards. He was just stating his curiosity about the manufacturing strategy and how it will affect all of us down the road. Not that ATI is doing anything wrong, cheap, or otherwise bad. Just WHY the sudden change. Ever heard the saying, if it's too good to be true, then it probably is?

Also, I think he's been repeating himself as a defense mechanism. Some people on this forum love to attack before they try to understand what someone is saying. Hence his repetitiveness. Also keep in mind we are not all the best at getting our point across, and let's face it, most of us are social misfits. :laugh:

Anyway, I'm done defending newtekie1. If y'all don't get it now, then you need to ride the short bus to school.
#75
newtekie1
Semi-Retired Folder
1Kurgan1I'm getting it, but what I'm saying is you're reading far too much into this; it doesn't matter either way. So why bring it up in every single post, when the cards will be released and be awesome?

Also, the 4850 is a cut-down 4870: it's running GDDR3 instead, the GPU isn't clocked as high, and it has a weaker power setup, which makes me think the 4870s got the higher-binned chips; if a chip didn't pass there, it moved down to a 4850, and if it didn't pass there either, it moved down to a 4830. Or, if a chip failed as a 4870 not on clocks but on SPUs, it skipped the 4850 and went right to the 4830.

And on that note, look how well 4830s OC for the most part; I don't even know if they are really binned down, since there was good demand for them. So who's to say the 5850 is really a binned-down product? It might be off the bat, but if it sells well, I'm betting higher-binned 5870 GPUs will get cut down and used.

Either way, you're looking way too much into this. Defective products or not, ATI is obviously going to make a profit on this and is happy enough with the turnover rate to bring it to market now, so who cares.
No no no, you're still missing the point. I know the HD4850 is weaker than the HD4870. I'm strictly talking about the core. Traditionally, ATi has used the same core configuration but a weaker memory setup on their second-from-the-top card. They have changed that and are now using a weaker core with the same memory configuration.
btarunrHD 4890 comes with 120+ GB/s memory bandwidth, should something faster than that also have higher bandwidth?
Not necessarily. The HD2900XT comes with 100+ GB/s memory bandwidth, should something faster than that also have higher bandwidth...

Yet the next two generations after that had cards with less memory bandwidth that were easily faster. Memory bandwidth doesn't need to increase with new cards. The HD5800 series might really benefit from it, or maybe ATi even removed the GDDR3 memory controller from the core, forcing GDDR5 use. If that was the case, then cutting down the core was necessary. But we don't know, and that is why I'm asking.