# Sapphire Radeon HD 5870 and HD 5850 Smile for the Camera



## btarunr (Sep 18, 2009)

Here are the first pictures of Sapphire's Radeon HD 5800 series offerings: the Radeon HD 5870 1GB and the Radeon HD 5850. The cards sport Sapphire's usual sticker design of a CGI girl against a reddish background. With the cosmetic "red streak" cleaving the cooler shroud down the center, the sticker is split in two as well. This is also perhaps the first public picture of the Radeon HD 5850, and our size projections were right: while the Radeon HD 5870 maintains a long PCB, the HD 5850 is about as long as a Radeon HD 4870 (reference design). Both accelerators stick to the reference AMD design.

* Images removed at request of Sapphire * Google for alternate source

*View at TechPowerUp Main Site*


----------



## kylzer (Sep 18, 2009)

They look so much better than the HD48xx series 

i really can't wait for release.


----------



## skylamer (Sep 18, 2009)

Yeah, it is really nice and the specs are fantastic ^^


----------



## Crazykenny (Sep 18, 2009)

kylzer said:


> Look so much better than the HD48xx series
> 
> i really can't wait for release.



I still think my Vapor-X HD4870 from Sapphire looks sexy.


----------



## newtekie1 (Sep 18, 2009)

Can't wait for performance numbers!

What I find interesting is that ATi has decided to use a crippled core on the HD5850.  In the past they've used the full core, but changed the memory subsystem.  This time they've crippled the core, and left the memory relatively unchanged.

I wonder if that has anything to do with the problems with 40nm?


----------



## happita (Sep 18, 2009)

Vapor-X cards will always look dead sexy, they are the Fat Bastards of Austin Powers...


----------



## sapetto (Sep 18, 2009)

It's nice to see that they have the same cooler, because I find it sexy.


----------



## pantherx12 (Sep 18, 2009)

Don't like the look of the 870.

The 850 looks much nicer, better CGI lass and more black, win!


----------



## Fitseries3 (Sep 18, 2009)

ewwww


take the sticker off. 

it looks so much better plain


----------



## AsRock (Sep 18, 2009)

Cool, a go-faster line plus a go-faster sticker. As long as the card does well I'm happy.


----------



## KainXS (Sep 18, 2009)

lol the 5850 sticker looks better than the 5870 sticker, what the hell sapphire


----------



## pantherx12 (Sep 18, 2009)

I prefer the simple design as well, although a bit of colour might be okay, some sort of simple fractal.


----------



## wahdangun (Sep 18, 2009)

newtekie1 said:


> Can't wait for performance numbers!
> 
> What I find interesting is that ATi has decided to use a crippled core on the HD5850.  In the past they've used the full core, but changed the memory subsystem.  This time they've crippled the core, and left the memory relatively unchanged.
> 
> I wonder if that has anything to do with the problems with 40nm?



i think it's because there's no GDDR6 to begin with, and if ATI used GDDR3/4 with the HD 5850 it would definitely be useless because it would become bandwidth-limited (just like those 9600 GSO cards with 96 SPs)


i can't wait for W1zz's benches,


----------



## LaidLawJones (Sep 18, 2009)

So this card must run pretty cool at max load, as I do not see any cooling vents from this angle. If the other side is the same, then the majority of air intake is through the fan. Is this finally a stock cooling solution that works well and is quiet but not exotic?

Too bad the bar down the center doesn't light up.


----------



## btarunr (Sep 18, 2009)

newtekie1 said:


> I wonder if that has anything to do with the problems with 40nm?



The GTX 260 comes with a crippled core too. That doesn't indicate foundry problems, does it? It's merely how the SKUs are carved out. 

They wanted to make a full transition to GDDR5, and maybe didn't want to use GDDR3. Over 100 GB/s of memory bandwidth using 256-bit GDDR3 is either not possible or requires expensive chips (one of the two), and under 100 GB/s of bandwidth will probably bottleneck Cypress.
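For the number-crunchers, the arithmetic behind that bandwidth claim can be sketched quickly. The helper below is just an illustration (the function name is mine), using the reference HD 4850/4870 memory clocks as example inputs:

```python
# Peak memory bandwidth: bus width in bytes times effective data rate.
# bandwidth_gbs is an illustrative helper, not anything official.
def bandwidth_gbs(bus_width_bits, effective_mts):
    """GB/s = (bus width / 8 bytes) * effective transfer rate in GT/s."""
    return (bus_width_bits / 8) * (effective_mts / 1000)

# Reference HD 4850: 256-bit GDDR3 at 993 MHz (1986 MT/s effective)
print(f"GDDR3: {bandwidth_gbs(256, 1986):.1f} GB/s")   # ~63.6 GB/s
# Reference HD 4870: 256-bit GDDR5 at 900 MHz (3600 MT/s effective)
print(f"GDDR5: {bandwidth_gbs(256, 3600):.1f} GB/s")   # 115.2 GB/s
```

Same 256-bit bus, nearly double the bandwidth, which is why GDDR5 is the only way past the 100 GB/s mark without exotic chips or a wider bus.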


----------



## TheMailMan78 (Sep 18, 2009)

Fitseries3 said:


> ewwww
> 
> 
> take the sticker off.
> ...



I agree. WTF happened to the other ones with Mussels' mouth, I mean "decorative air holes in the back", and the sloped cooler? These suck.

Oh and Bta GO TO BED MAN!


----------



## btarunr (Sep 18, 2009)

TheMailMan78 said:


> Oh and Bta GO TO BED MAN!



Wrong mod. It's time for Mussels to hit the bed, not me.


----------



## legends84 (Sep 18, 2009)

woah... power


----------



## TheMailMan78 (Sep 18, 2009)

btarunr said:


> Wrong mod. It's time for Mussels to hit the bed, not me.



Yeah I forget. I keep thinking India is the polar opposite in time. Nevermind. Continue on your regularly scheduled thread.


----------



## human_error (Sep 18, 2009)

Hmm, interesting.

Glad to see the 5850 with a dual slot cooler this round - i always felt the cooler on the 4850 could have been a better design. 

I also noticed how much shorter the 5850 is than the 5870 there, by a reasonable amount as well. After seeing leaked images of the 5870 PCB, it leaves me wondering what they removed from the PCB that is needed for the 5870 but not the 5850. I guess I'll have to wait until next week to find out.


----------



## TheMailMan78 (Sep 18, 2009)

human_error said:


> Hmm, interesting.
> 
> Glad to see the 5850 with a dual slot cooler this round - i always felt the cooler on the 4850 could have been a better design.
> 
> I also noticed how much shorter the 5850 is than the 5870 there, by a reasonable amount as well. After seeing leaked images of the 5870 PCB, it leaves me wondering what they removed from the PCB that is needed for the 5870 but not the 5850. I guess I'll have to wait until next week to find out.


It has to do with power consumption. Our GPU keeps getting smaller but the capacitors and resistors stay the same. 

The 5850 takes one less PCI-E connector than the 5870.


----------



## laszlo (Sep 18, 2009)

i don't want to see how they look !

i want to see real benches!


----------



## TheMailMan78 (Sep 18, 2009)

laszlo said:


> i don't want to see how they look !
> 
> i want to see real benches!



*Official 58XX series benches.*


Crysis: Faster than the 4870
Call of Duty 4: Faster than the 4870
FarCry2: Faster than the 4870
3DMark06: Faster than the 4870
Vantage: Faster than the 4870
Feel better? Now let's talk about the thread subject.


----------



## Imsochobo (Sep 18, 2009)

Hope cooler designs stay the same 

a 4870 waterblock (full-coverage block) fits on the 5870


----------



## newtekie1 (Sep 18, 2009)

wahdangun said:


> i think it becaouse there are no GDDR6 to begin with, and if ati use GDDR3/4 with HD 5850 it will definitely useless because it will become bandwidth limited (just like those 9600 GS with 96sp)
> 
> 
> i can't wait for wizz benches,



Not useless, just crippled, the same way crippling the core would hinder it.  I don't believe dropping to GDDR3 would have hurt it any more than disabling shaders.



btarunr said:


> GTX 260 comes with a crippled core too. That doesn't indicate foundry problems, does it? It's merely how the SKUs are carved out.
> 
> They wanted to make a full transition to GDDR5, and maybe didn't want to use GDDR3. Over 100 GB/s memory bandwidth using 256-bit GDDR3 is not possible / requires expensive chips (one of these), and under 100 GB/s of bandwidth will probably bottleneck Cypress.



It does indicate defective GPUs; nVidia has used the crippling method to get rid of defective cores for as far back as I can remember.  ATi used to do it also, but stopped recently, at least when the GPUs first came out.  They would add the crippled GPUs in later.

I'm not saying it is a bad thing to do this, don't get me wrong, there is nothing wrong with it.  It is a common practice in the industry, and if it leads to cheaper parts that perform well, I'm all for it!  It has given us some of the best performance-for-the-money cards: the 8800GT/9800GT, 8800GS/9600GSO, HD4830, 8800GTS 320MB, X1900GT, GTX260, X800GTO, 7800GT...

I'm just trying to figure out why ATi is changing their strategy all of a sudden.  I wouldn't be surprised if they are still having issues with 40nm, and that is part of the decision.
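To see why harvesting partially defective dies is so attractive, here's a rough sketch using the classic Poisson yield model. The die area, defect density, and redundant-area fraction below are made-up illustrative inputs, not AMD's actual numbers, and the salvage rule is a simplification:

```python
import math

# Classic Poisson yield model: a die of area A mm^2 fabbed at defect
# density D defects/mm^2 is defect-free with probability exp(-A * D).
def poisson_yield(area_mm2, defect_density):
    return math.exp(-area_mm2 * defect_density)

def harvest_fractions(area_mm2, defect_density, redundant_frac):
    """Return (fully good fraction, extra fraction salvageable as a
    cut-down SKU). Simplification: any die whose defects all land in
    the redundant shader area is assumed salvageable."""
    fully_good = poisson_yield(area_mm2, defect_density)
    # Dies whose critical (non-redundant) area is defect-free:
    critical_ok = poisson_yield(area_mm2 * (1 - redundant_frac), defect_density)
    return fully_good, critical_ok - fully_good

# All three inputs are hypothetical, for illustration only.
full, salvage = harvest_fractions(334, 0.005, 0.4)
print(f"fully good: {full:.1%}, salvageable as cut-down: {salvage:.1%}")
```

With those made-up inputs, the cut-down SKU nearly doubles the number of sellable chips per wafer, which is the whole economic argument for launching a harvested part alongside the full one.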


----------



## TheMailMan78 (Sep 18, 2009)

newtekie1 said:


> I'm just trying to figure out why ATi is changing their strategy all of a sudden.


They heard about you, newtekie1. ATI's engineers' new mission in life is to confuse you. Billions will be spent on this endeavor, and they will not stop until you become so confused that Björk finally makes sense and you support AMD.


----------



## Imsochobo (Sep 18, 2009)

newtekie1 said:


> I'm just trying to figure out why ATi is changing their strategy all of a sudden.  I wouldn't be surprised if they are still having issues with 40nm, and that is part of the decision.



***Note: not a full quote.

Well, ATi have been doing this for quite some time.
The 4850 is crippled to some degree, though all shaders live.
The 4830 is crippled, with shaders removed.
The 4860 is crippled, with shaders removed; less crippled than the 4830, or a crippled 4890.

ATi rushed to get the 4xxx series out, while the 5xxx has been a longer process; they might have accumulated more crippled cores due to a longer production run than the 4xxx had at the start.

We might not get a clear answer on that from ATi.


----------



## newtekie1 (Sep 18, 2009)

Imsochobo said:


> ***Note not full quote.
> 
> Well, ati have been doing it for quite some time.
> 4850 is crippled to some degree, however all shaders live.
> ...



Yes, I realize that they have done it in the last generation, but not at first; they haven't done it at the onset of a product life cycle in a long time.  And I mentioned the HD4850 having a crippled memory subsystem, and that is actually what I was expecting on the HD5850 as well: the full core, with weaker memory.


----------



## mdm-adph (Sep 18, 2009)

newtekie1 said:


> I'm just trying to figure out why ATi is changing their strategy all of a sudden.  I wouldn't be surprised if they are still having issues with 40nm, and that is part of the decision.



(Man -- always with the FUD, aren't you?  Do you get paid for this?  Just enjoy your Nvidia cards, and let ATI shine for a few months -- come on, give it a break..)

I think you're really reading too much into this.  I don't think it's foundry problems; I think it has more to do with ATI finally gaining a bit of revenue and market share, and finally having the time and money to do the same things other video card producers, who have long enjoyed their time at the top, already do.


----------



## TheMailMan78 (Sep 18, 2009)

mdm-adph said:


> (Man -- always with the FUD, aren't you?  Do you get paid for this?  Just enjoy your Nvidia cards, and let ATI shine for a few months -- come on, give it a break..)
> 
> I think you're really looking too far into this.  I don't think it's foundry problems  -- I think it's more just to do with ATI finally increasing a bit of revenue and market share, and finally having the time and money to do those same practices that other video cards producers perform, having long enjoyed their time at the top.



This week on Dawson's Creek.


----------



## mdm-adph (Sep 18, 2009)

TheMailMan78 said:


> This week on Dawson's Creek.



Oh, give over.  You can be just as bad sometimes, capitalist running-dog.


----------



## btarunr (Sep 18, 2009)

newtekie1 said:


> Not useless, just crippled.  That same way crippling the core would hinder it.  I don't believe dropping to GDDR3 would have hurt it any more than disabling shaders.



It would have since it needs memory bandwidth. 

And even if it's harvesting "defective" cores, that's purely academic, not the consumer's concern at all. The consumer gets a warranty-backed product.


----------



## TheMailMan78 (Sep 18, 2009)

mdm-adph said:


> Oh, give over.  You can be just as bad sometimes, capitalist running-dog.



Yeah man but to everyone. You seem to be stalking newtekie1.


----------



## jaredpace (Sep 18, 2009)

nice sig, btarunr!


----------



## mdm-adph (Sep 18, 2009)

TheMailMan78 said:


> Yeah man but to everyone. You seem to be stalking newtekie1.



Not at all, but I do believe he's being paid to spread FUD.

I think there's a "preponderance of evidence."


----------



## MoonPig (Sep 18, 2009)

Ahhh... they look good. 

I don't understand the fecking CGI girl though... what's the point? Crap logo.

What's the release date of the 5850? I forget..


----------



## pantherx12 (Sep 18, 2009)

Same day as the 870 : ]


----------



## MoonPig (Sep 18, 2009)

Lol... which is?


----------



## Imsochobo (Sep 18, 2009)

btarunr said:


> It would have since it needs memory bandwidth.
> 
> And even if it's harvesting "defective" cores, that's purely academic, not the consumer's concern at all. The consumer gets a warranty-backed product.



I support bta on that statement, and I'll add to the cause.



newtekie1 said:


> Yes, I realize that they have done it in the last generation, but not at first.  However, they haven't done it at the onset of a product life cycle in a long time.  And I mentioned the HD4850 having a crippled memory subsystem, and that is actually what I was expecting on the HD5850 also.  The full core, with weaker memory.




Disabling those cores does less damage than going down to GDDR3, since it would be like a 4850 with 4850 X2 performance, and that card did suck, you know:
low memory bandwidth, limited to 1680x1050.

As a reminder, 256-bit GDDR3 on high-end cards does 73 GB/s;
256-bit midrange GDDR5 does 140 GB/s.
Going to 512-bit GDDR3 would make it more expensive than the 5870 and still deliver less performance.


----------



## mechtech (Sep 18, 2009)

And I..........Jizz in my PANTS


----------



## PVTCaboose1337 (Sep 18, 2009)

Does it have dual HDMI, or is one of those HDMI and one Displayport?


----------



## newtekie1 (Sep 18, 2009)

btarunr said:


> It would have since it needs memory bandwidth.
> 
> And even if it's harvesting "defective" cores, that's purely academic, not the consumer's concern at all. The consumer gets a warranty-backed product.



I don't think it is any more special than previous GPUs in that it needs 100 GB/s+ memory bandwidth to be functional.  Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.

And the cores being defective doesn't matter; as I've said, it is standard practice.



TheMailMan78 said:


> Yeah man but to everyone. You seem to be stalking newtekie1.



Maybe if we don't feed him, he'll go back under his bridge...


----------



## ZoneDymo (Sep 18, 2009)

I know that the look of the actual card should be the very last thing you worry about but imo this design is boring as hell.

The Saphire Vapor-X coolers for the 4870 and 4890, now THAT is design!


----------



## morphy (Sep 18, 2009)

agree ^^...I'm just waiting for the Vapor-X edition of the 5870 and by then the prices would have come down too. win-win


----------



## wahdangun (Sep 18, 2009)

newtekie1 said:


> I don't think it is any more special than previous GPUs that it needs 100Gb/s+ memory bandwidth to be functional.  Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.
> 
> And the cores being defective doesn't matter, as I've said, it is standard practice.
> 
> ...




yes, it will definitely be bandwidth-limited. Even the RV770, when equipped with GDDR3, became bandwidth-limited (HD 4850), and when equipped with GDDR5 (HD 4870) the performance increased drastically (it can compete with the GTX 260). So i say it's useless to use GDDR3 for that amount of power


----------



## morphy (Sep 18, 2009)

MoonPig said:


> Lol... which is?



Sept 23rd


----------



## Zubasa (Sep 18, 2009)

newtekie1 said:


> I don't think it is any more special than previous GPUs that it needs 100Gb/s+ memory bandwidth to be functional.  Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.
> 
> And the cores being defective doesn't matter, as I've said, it is standard practice.


The bandwidth really hinders the 4850's performance, as you can see from reviews of the 4830.
The 4830's core is both crippled and clocked lower but still manages to keep up with the 4850.
http://www.techpowerup.com/reviews/Powercolor/HD_4830/26.html
The 4850 is less than 10% faster due to its suckass bandwidth.

Not to mention the 4770, which can take on a 4850 simply because it is clocked higher (and its memory clocks much better).
The number of TMUs and shaders doesn't seem to matter as much as the number of ROPs;
this might be due to the fact that most games are still optimized for Shader Model 3.0.


----------



## johnnyfiive (Sep 18, 2009)

peel the stickers off, perfect.


----------



## ArmoredCavalry (Sep 18, 2009)

Well, comparing my reference Sapphire 4870 to these, I have to say the 5k series looks much better...

Now where are the benchmarks!!!!


----------



## Deleted member 24505 (Sep 18, 2009)

Not that I would ever buy Sapphire again; that ahole GG is such a tit it has put me off Sapphire permanently.


----------



## Zubasa (Sep 18, 2009)

ArmoredCavalry said:


> Well, comparing my reference Sapphire 4870 to these, I have to say the 5k series looks much better...
> 
> Now where are the benchmarks!!!!


I am still worried that the 5870 might get bottlenecked pretty hard by the bandwidth,
since it has basically twice the power of a 4870 but the bandwidth does not seem to grow nearly as much.


----------



## btarunr (Sep 18, 2009)

newtekie1 said:


> I don't think it is any more special than previous GPUs that it needs 100Gb/s+ memory bandwidth to be functional.  Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.



Nobody is talking about functionality, it's about being competitive. Radeon HD 4890 needed that extra bit of memory bandwidth, so if HD 5850 is touted to be twice as fast as a Radeon HD 4850, it definitely needs that bandwidth. Once again, it does not matter if HD 5850 is produced out of "defective" dies, because the resulting product is not defective. So no point in this "40 nm is faulty" rhetoric. Disabling components to carve out SKUs is not an indication of foundry-related problems.


----------



## kylzer (Sep 18, 2009)

^_^ I'm sure you've said that like 3 times now lol

Damn, only 4-5 days to go now and I can buy my new card and get rid of this 8800GS


----------



## btarunr (Sep 18, 2009)

kylzer said:


> ^_^ im sure you've said that like 3 times now lol



Yeah, because the other person repeated the same thing too.


----------



## Imsochobo (Sep 18, 2009)

tigger said:


> Not that i would buy sapphire ever again,that ahole GG is such a tit it as put me off sapphire permanently.



There are 11 stickers from Sapphire cards on my keyboard; come the 1st of January it will be two years.
None were defective or had issues.

Not that other brands' cards have had issues either; I've had one RMA for a production fault from p0werc0lor on a limited X800 edition.


----------



## Zubasa (Sep 18, 2009)

I just can't wait to get rid of the sucker I have now; it doesn't have the bandwidth or the memory capacity to let me game at full HD.... 
Even overclocking doesn't help much; the GDDR3 just won't clock much higher without crashing. 
There is simply not enough bandwidth with the 4850, as performance increases much more by OCing the memory than the GPU.

On top of this, I just can't stop myself from thinking that ATi stuck the worst-binned chips possible on the first batch of 4850s.
There is no other GPU that I have seen clock this badly.
No way am I buying the first batches of 5850s unless they are dirt cheap and OC like nobody's business.
I paid HK$1700 for this sucker, that's US$213, and now it is barely stable at its factory OC.


----------



## Ra97oR (Sep 18, 2009)

Looks fine.


----------



## Easo (Sep 18, 2009)

Eh lol, i need money right now


----------



## 1Kurgan1 (Sep 18, 2009)

Awesome, can't wait till they hit the market and I finally have the money for one.


----------



## lemonadesoda (Sep 18, 2009)

I'm getting bored of these manga girl fighters on the GPUs. It was OK way back when, but now the imagery is old, stale and tired.

If they want to stick with this manga stuff, then they should have more chicks on there...  the hotter the card (more powerful), the more chicks, and for OC cards, they should wear less and less. Obviously the ultimate extreme versions of the card would then be XXX rated... available only to over 18, and consequently making them even more desirable.

Of course, the über fastest, wild, overclockerized super dooper edition would have a dude "benchmarking" a bevvy of near naked pool chick roller girls.

Naturally, the default windows jingle would be replaced by the drivers with some snazzy jazzed up pimp anthem.


----------



## erocker (Sep 18, 2009)

lemonadesoda said:


> If they want to stick with this manga stuff, then they should have more chicks on there...  the hotter the card (more powerful), the more chicks, and for OC cards, they should wear less and less. Obviously the ultimate extreme versions of the card would then be XXX rated... only making them even more desirable.
> 
> Of course, the über fastest, wild, overclockerized super dooper edition would have a dude benchmarking a bevvy of near naked pool chick roller girls.
> 
> Naturally, the default windows jingle would be replaced by the drivers with some snazzy jazzed up pimp anthem.



Your logic is absolutely flawless.


----------



## MoonPig (Sep 18, 2009)

I don't get the use of lasses on these... If they were REAL lasses, then OK... but silly CGI ones just look daft.

And I second the use of less and less clothes... that alone would make me want the XXX version. Pity a waterblock doesn't have the same image.


----------



## newtekie1 (Sep 18, 2009)

btarunr said:


> Nobody is talking about functionality, it's about being competitive. Radeon HD 4890 needed that extra bit of memory bandwidth, so if HD 5850 is touted to be twice as fast as a Radeon HD 4850, it definitely needs that bandwidth. Once again, it does not matter if HD 5850 is produced out of "defective" dies, because the resulting product is not defective. So no point in this "40 nm is faulty" rhetoric. Disabling components to carve out SKUs is not an indication of foundry-related problems.



No it doesn't; there have been plenty of cards that were twice as fast that didn't need extra memory bandwidth.  I mean, the HD4850 was twice as fast as the HD3850, and both used GDDR3.  I'm not saying the bandwidth would have made no difference, I'm just saying that, I believe, it would have crippled it about as much as the cut-down die does.

And your take on the defective dies, I agree with.  It doesn't matter, because the end product is not defective.  That is not my point, and has nothing to do with what I am saying.  I'm wondering if part of the decision to use the cut-down dies was because of a high defect rate, indicating that there is still an issue with 40nm.  I believe it does indicate that.  That by itself is not enough to come to that conclusion, but couple it with the fact that we already know 40nm has problems, and it really isn't a hard conclusion to draw.

When nVidia did it with G80, G92, and GT200, we didn't know that 65nm and 55nm were having issues, so we couldn't assume that is why they did it.  On top of that, nVidia has used this practice for generations, while ATi has not.  So ATi suddenly using it, coupled with the already known issues with 40nm, is what makes me wonder.  This isn't a negative thing on ATi at all, I'm just curious.  I don't want to see these things released at a promised price, but with low supplies causing the prices to be jacked up, or even worse, a repeat of the HD4770 with essentially a paper launch.


----------



## TheMailMan78 (Sep 18, 2009)

lemonadesoda said:


> I'm getting bored of these manga girl fighters on the GPUs. It was OK way back when, but now the imagery is old, stale and tired.
> 
> If they want to stick with this manga stuff, then they should have more chicks on there...  the hotter the card (more powerful), the more chicks, and for OC cards, they should wear less and less. Obviously the ultimate extreme versions of the card would then be XXX rated... available only to over 18, and consequently making them even more desirable.
> 
> ...



Hawking has nothing on you man. You're the F#$KING master of all that is awesome!


----------



## FreedomEclipse (Sep 18, 2009)

5870 for me


----------



## 1Kurgan1 (Sep 18, 2009)

newtekie1 said:


> And your take on the defective dies, I agree with.  It doesn't matter because the end product is not defective.  That is not my point, and has nothing to do with what I am saying.  I'm wondering if part of the decision to use the cut down dies was because of a high defective rate, indicating that there is still an issue with 40nm.  I believe it does indicate that.  That by itself is not enough to come to that conclusion, but when you couple it with the fact that we already know 40nm has problems, and it really isn't a hard conclusion to draw.  When nVidia did it with G80, G92, and GT200 we didn't know that 65nm and 55nm was having issues, so we couldn't assume that is why they did it.  On top of that, nVidia has used this practice for generations, while ATi has not.  So ATi suddenly using it, coupled with the already known issues with 40nm, it what makes me wonder.  This issue is also not a negative thing on ATi, it isn't a negative at all, I'm just curious.  I don't want to see these things released and a promised price, but low supplies causing the prices to be jacked up, or even worse a repeat of the HD4770 with essentially a paper launch.



ATI hasn't? I can remember back to the differences between the x1950 Pro and the x1950 XT. Either way, there's always going to be "defective" product with any manufacturing process. They cut down the 3870 to a 3850, they cut down the 4870 to a 4850 and then a 4830. So who's to assume that the 5870-to-5850 scenario is any different? And even if it is, what does it matter, as the 5850 will most likely move more product at launch anyways.


----------



## A Cheese Danish (Sep 18, 2009)

That is one sweet card 
Can't wait to hold it in my hands and have it housed in my rig


----------



## newtekie1 (Sep 18, 2009)

1Kurgan1 said:


> ATI hasn't? I can remember back to the differences of the x1950pro vs x1950 XT. Either way, theres always going to be "defective" products with any manufacturing process. They cut down the 3870 to a 3850, they cut down the 4870 to a 4850 then a 4830. So who's to assume that the 5870 to 5850 scenario is any different? And even if it is, what does it matter as the 5850 will most likely move more products at launch anyways.



*sigh* You obviously aren't getting it.

1.) The x1950Pro used a completely different core than the x1950XT.  It used RV570; the x1950XT used RV580.  The x1900GT did use a cut-down core from the x1900XT, however the x1900GT didn't come out until near the end of the x1900 product cycle.
2.) I know there will always be defective cores... not products, cores.  I'm not saying anything about the final products.
3.) They did not cut down the HD3870 core to make an HD3850.  They did not cut down the HD4870 core to make an HD4850.  They did cut down the HD4870 core to make the HD4830, however.
4.) The difference is that ATi traditionally has not launched a product line with defective, cut-down cores.  They add the SKUs using the defective, cut-down cores later down the road.


----------



## TheMailMan78 (Sep 18, 2009)

I feel bad for newtekie1. He's not bashing ATI. He's wondering why they changed strategy so much and how that change will affect supply and price. ATI could make a 2nm chip with a 12,000MHz GPU and sell it for $3.00, but if the yields are bad, demand will be higher than supply, thus jacking up its price.

Sure, the card retails for 3 bucks, but if ATI can only produce a dozen of them, how much do you think they will REALLY cost? Basically newtekie1 is talking supply and demand. Nothing more.

@newtekie1: You have to keep things simple, man. You talk so damn much about something simple that it makes it complicated.


----------



## 1Kurgan1 (Sep 18, 2009)

A Cheese Danish said:


> Can't wait to hold it in my hands and have it housed in my rig



You're talking about a videocard here, right?


----------



## TheMailMan78 (Sep 18, 2009)

1Kurgan1 said:


> You're talking about a videocard here, right?



If not his dog better watch out!


----------



## 1Kurgan1 (Sep 18, 2009)

newtekie1 said:


> 3.) They did not cut down the HD3870 core to make an HD3850.  They did not cur down the HD4870 core to make an HD4850.  They did cut down the HD4870 core to make the HD4830, however.
> 4.) The difference is because ATi has traditionally has not launched a product line with defective cut down cores.  They add the SKUs using the defective cut down cores later down the road.



I'm getting it, but what I'm saying is you're looking far too much into this; it doesn't matter either way. So why bring it up in every single post when the cards will be released and be awesome?

Also, the 4850 is a cut-down 4870: it's running GDDR3 instead, the GPU isn't clocked as high, and it has a weaker power setup. That would make me think 4870s use higher-binned chips: if a chip didn't pass, it moved down to a 4850, and if it didn't pass there, it moved down to a 4830. That, or if a chip failed as a 4870 but didn't fail on clocks, failing on SPs instead, then it skipped the 4850 and went right to the 4830.

And mentioning that, look how well 4830s OC for the most part; I don't even know if they are really binned down, since there was good demand for them. So who's to say the 5850 is really a binned-down product? It might be off the bat, but if it sells well I'm betting higher-binned 5870 GPUs will get cut down and used.

Either way, you're looking way too much into this. Defective products or not, ATI is obviously going to make a profit on this and is happy enough with the turnover rate to bring it to market now, so who cares.
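That kind of bin waterfall can be sketched as a toy classifier. The thresholds below borrow the announced 5870/5850 specs (20 vs 18 SIMDs, 850 vs 725 MHz core clock), but the cascade rule itself is a guess at how binning might work, not AMD's actual flow:

```python
# Toy bin waterfall along the lines described above. Thresholds borrow
# the announced specs (HD 5870: 20 SIMDs at 850 MHz; HD 5850: 18 SIMDs
# at 725 MHz); the cascade rule itself is a guess, not AMD's real flow.
def classify_die(defective_simds, max_stable_mhz):
    if defective_simds == 0 and max_stable_mhz >= 850:
        return "HD 5870"   # fully enabled, makes full clock
    if defective_simds <= 2 and max_stable_mhz >= 725:
        return "HD 5850"   # up to 2 SIMDs fused off, lower clock target
    return "scrap"         # too broken for either SKU

print(classify_die(0, 900))   # HD 5870
print(classify_die(1, 760))   # HD 5850 (harvested die)
print(classify_die(0, 760))   # HD 5850 (good die, down-binned on clock)
```

The third case is the interesting one: a fully working die that only misses the clock screen still ships as the cheaper SKU, which is exactly why a "cut-down" part isn't necessarily a defective one.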


----------



## tkpenalty (Sep 18, 2009)

Newtekie's cause for concern is pretty valid imho. Considering how nVidia lately has only been putting out PR spin instead of actually talking about their GT300, it's likely that what Charlie said about GT300's 3% yield rate is true; this means a problem for AMD as well, who probably share the same process that nVidia is using.

Even though most of what comes out of Charlie's site is bullshit, how did he come up with a figure of 3% then? And why is nVidia just doing PR spin with investor advisories lately?

Btw, with the 4xxx series, I believe it's not binning but just automated selection, where they cast off the dies at the edges of a wafer for the lower-end derivatives, which statistically speaking suck.


----------



## btarunr (Sep 18, 2009)

newtekie1 said:


> No it doesn't, there have been plenty of cards that were twice as fast that didn't need extra memory bandwidth.  I mean the HD4850 was twice as fast as the HD3850, and both used GDDR3.  I'm not saying the bandwidth would have made no difference, I'm just saying, I believe, it would have crippled it as much as the cut down die does.



The HD 4890 comes with 120+ GB/s of memory bandwidth; shouldn't something faster than that also have higher bandwidth? And it was able to harvest RV770/RV790 ASICs using cut-down configurations _and_ also let them be priced low. So I don't see how that isn't the likely explanation.


----------



## TheMailMan78 (Sep 18, 2009)

1Kurgan1 said:


> I'm getting it, but what I'm saying is you're looking far too much into this, it doesn't matter either way. So why bring it up in every single post when the cards will be released and be awesome.
> 
> Also the 4850 is a cut down 4870, it's running GDDR3 instead, but the GPU isn't clocked as high and it has a weaker power setup on it, which would make me think that 4870's would be a higher binned chip, and if a chip didn't pass it moved down to a 4850, and if the chip didn't pass there it moved down to a 4830. That or if a chip failed as a 4870, but didn't fail on clocks, and failed on SPU's then it skipped 4850 and went right to the 4830.
> 
> ...



He didn't say it would have problems selling. From what I read he's been pretty positive about it by newtekie1 standards. He was just stating his curiosity about the manufacturing strategy and how it will affect all of us down the road. Not that ATI is doing anything wrong, cheap, or otherwise bad. Just WHY the sudden change. Ever heard the saying, if it's too good to be true then it probably is?

Also I think he's been repeating himself as a defense mechanism. Some people on this forum love to attack before they try to understand what someone is saying. Hence his repetitiveness. Also keep in mind we are not all the best at getting our point across and, let's face it, most of us are social misfits. 

Anyway I'm done defending newtekie1. If yall don't get it now then you need to ride the short bus to school.


----------



## newtekie1 (Sep 18, 2009)

1Kurgan1 said:


> I'm getting it, but what I'm saying is you're looking far too much into this, it doesn't matter either way. So why bring it up in every single post when the cards will be released and be awesome.
> 
> Also the 4850 is a cut down 4870, it's running GDDR3 instead, but the GPU isn't clocked as high and it has a weaker power setup on it, which would make me think that 4870's would be a higher binned chip, and if a chip didn't pass it moved down to a 4850, and if the chip didn't pass there it moved down to a 4830. That or if a chip failed as a 4870, but didn't fail on clocks, and failed on SPU's then it skipped 4850 and went right to the 4830.
> 
> ...




No no no, you're still missing the point.  I know the HD4850 is weaker than the HD4870.  I'm strictly talking about the core.  Traditionally, ATi has used the same core configuration but a weaker memory setup on their second-from-the-top card.  They have changed that and are now using a weaker core with the same memory configuration.




btarunr said:


> HD 4890 comes with 120+ GB/s memory bandwidth, should something faster than that also have higher bandwidth?



Not necessarily.  The HD2900XT comes with 100+ GB/s memory bandwidth, should something faster than that also have higher bandwidth...

Yet the next two generations after that had cards with less memory bandwidth that were easily faster.  Memory bandwidth doesn't need to increase with new cards.  The HD5800 series might really benefit from it, or maybe ATi even removed the GDDR3 memory controller from the core, forcing GDDR5 use.  If that was the case then cutting down the core was necessary.  But we don't know, and that is why I'm asking.


----------



## tkpenalty (Sep 18, 2009)

http://www.fudzilla.com/content/view/15535/65/ lol


----------



## phanbuey (Sep 18, 2009)

newtekie1 said:


> I'm just trying to figure out why ATi is changing their strategy all of a sudden.  I wouldn't be surprised if they are still having issues with 40nm, and that is part of the decision.



Yeah I definitely see that... it's either harvesting or that they didn't want the 5850 to cannibalize the 5870 - since both have GDDR5.


----------



## Flyordie (Sep 18, 2009)

I shall be waiting for another company to release theirs... Can't go with Sapphire... After what their customer service did to that lady... the sub-par build quality... but mainly, customer support is really bad with AthlonMicro...


----------



## wahdangun (Sep 18, 2009)

newtekie1 said:


> ...
> 
> Yet the next two generations after that had cards with less memory bandwidth that were easily faster.  Memory bandwidth doesn't need to increase with new cards.  The HD5800 series might really benefit from it, or maybe *ATi *even *removed the GDDR3 memory controller from the core*,* forcing GDDR5 use*.  If that was the case then cutting down the core was necessary.  But we don't know, and that is why I'm asking.



there you go, you've got the answer already. ATI will use GDDR5 on every card, even the low-end ones (maybe to cut down on complex PCBs and thus save cost).

and we don't have a new type of RAM, so they can't use the old strategy and were forced to use a cut-down version instead. and I believe if they had developed GDDR6 they would use it for the HD 5870 and GDDR5 for the HD 5850.


----------



## phanbuey (Sep 18, 2009)

Flyordie said:


> I shall be waiting for another company to release theirs... Can't go with Sapphire... *After what their customer service did to that lady*.. the sub-par build quality... but mainly just customer support is really bad with AthlonMicro...



 you make it sound like they took her out back and ...


----------



## 1Kurgan1 (Sep 18, 2009)

TheMailMan78 said:


> He didn't say it would have problems selling. From what I read he's been pretty positive about it by newtekie1 standards. He was just stating his curiosity about the manufacturing strategy and how it will affect all of us down the road. Not that ATI is doing anything wrong, cheap, or otherwise bad. Just WHY the sudden change. Ever heard the saying, if it's too good to be true then it probably is?
> 
> Also I think he's been repeating himself as a defense mechanism. Some people on this forum love to attack before they try to understand what someone is saying. Hence his repetitiveness. Also keep in mind we are not all the best at getting our point across and, let's face it, most of us are social misfits.
> 
> Anyway I'm done defending newtekie1. If yall don't get it now then you need to ride the short bus to school.



I am sure there are issues; both companies went with rebranding last gen, and I would assume the smaller the manufacturing costs, the more we will see of it. Most likely it's here to stay. That's all I'm saying. I just don't see it affecting anyone down the line, unless they get a stockpile of cards that don't sell, which, rebranding or not, would hurt a company.

I understand what he's saying, I just don't think it's a huge issue.



newtekie1 said:


> No no no, you're still missing the point.  I know the HD4850 is weaker than the HD4870.  I'm strictly talking about the core.  Traditionally, ATi has used the same core configuration but a weaker memory setup on their second-from-the-top card.  They have changed that and are now using a weaker core with the same memory configuration.



The core on the 4850 is weaker; despite being full-featured, it's a binned-down product. I think they went this route to move away from GDDR3, though: it would most likely be cheaper for the company to use one type of memory, and the 4000 series was their bridge to doing so. Even 4850's started popping up with GDDR5. I'm not missing any points here  I know what you're saying, things are changing, but that's the way the world works, especially in manufacturing processes where advancement is very fast.



Flyordie said:


> I shall be waiting for another company to release theirs... Can't go with Sapphire... After what their customer service did to that lady.. the sub-par build quality... but mainly just customer support is really bad with AthlonMicro...



Did to what lady?


----------



## A Cheese Danish (Sep 18, 2009)

1Kurgan1 said:


> Your talking about a videocard here right?





TheMailMan78 said:


> If not his dog better watch out!



Yes, I know I'm talking about a video card


----------



## Steevo (Sep 19, 2009)

Where is my 2GB edition!!!!


----------



## trt740 (Sep 19, 2009)

any confirmation on prices?


----------



## Roph (Sep 19, 2009)

It's the girlfriend's birthday on sept 23rd, and she sometimes plays on my computer, perhaps I could order a 5850 "for her" 

And with AMD's recent track record, would it be surprising to find out that the extra shaders on the 5850 are unlockable?


----------



## Vincy Boy (Sep 19, 2009)

Roph said:


> It's the girlfriend's birthday on sept 23rd, and she sometimes plays on my computer, perhaps I could order a 5850 "for her"
> 
> And with AMD's recent track record, would it be surprising to find out that the extra shaders on the 5850 are unlockable?



Man after my own heart there.


----------



## Velvet Wafer (Sep 19, 2009)

Roph said:


> It's the girlfriend's birthday on sept 23rd, and she sometimes plays on my computer, perhaps I could order a 5850 "for her"
> 
> And with AMD's recent track record, would it be surprising to find out that the extra shaders on the 5850 are unlockable?



do you really think she will grant that? mine wouldn't


----------



## p_o_s_pc (Sep 19, 2009)

KainXS said:


> lol the 5850 sticker looks better than the 5870 sticker, what the hell sapphire





AsRock said:


> Cool go faster line + a go faster sticker. As long as the card does well i'm happy .





Fitseries3 said:


> ewwww
> 
> 
> take the sticker off.
> ...





pantherx12 said:


> Don't like the look of the 870.
> 
> The 850 looks much nicer, better CGI lass and more black, win!



What the hell, guys. How many people stare at their card when it's in the case? It's not like you really even see the sticker anyway.


----------



## Initialised (Sep 19, 2009)

Until this card, the GTX295 (in stock trim with no stickers) was the best looking (and performing) card. Lose the sticker and this one takes both crowns.

Itching to play with one of these puppies and see what they can do.


----------



## Initialised (Sep 19, 2009)

p_o_s_pc said:


> What the hell, guys. How many people stare at their card when it's in the case? It's not like you really even see the sticker anyway.


What, you don't have a window and angle your case so your graphics card is visible from the other side of the room?







They also look damn sexy on test benches:


----------



## newtekie1 (Sep 19, 2009)

wahdangun said:


> there you go, you got the answer already. ati will use GDDR5 to all card, even for low end card (maybe to cut down complex PCB and thus cost saving)
> 
> and we don't have new type of ram, so they can't use old strategy. and they was forced to used cut down version instead. and i believe if they have developed GDDDR6 then they will use it for HD 5870 and use GDDR5 for HD 5850.



I don't have an answer, because we don't know for sure that the GDDR3 memory controller has been removed, only ATi knows that.  Unless you have seen some news that I haven't.

I think at this point, removing the GDDR3 controller would be a bad move, as GDDR5 is still in relatively short supply compared to GDDR3.  And I doubt they have removed it, though it is possible.



1Kurgan1 said:


> The core on the 4850 is weaker; despite being full-featured, it's a binned-down product. I think they went this route to move away from GDDR3, though: it would most likely be cheaper for the company to use one type of memory, and the 4000 series was their bridge to doing so. Even 4850's started popping up with GDDR5. I'm not missing any points here  I know what you're saying, things are changing, but that's the way the world works, especially in manufacturing processes where advancement is very fast.



Of course it was weaker, but it wasn't cut down.  I'm a weaker pitcher compared to a major-league pitcher, but my arms haven't been cut off.  There is a major difference between just not being able to clock as high and having parts physically disabled.  And traditionally ATi hasn't done this, so my question is still: why?


----------



## p_o_s_pc (Sep 19, 2009)

Initialised said:


> What you dont have a window and angle your case so it your graphics card is visible from the other side of the room?
> 
> http://farm4.static.flickr.com/3315/3427142585_e1fc29f5bc_b.jpg
> 
> ...



They do look sexy on the test bench. I don't have a window on this case, but I have on all my other cases, and never did I have it angled so it would show the video card. I don't give a shit what my card looks like. The only reason I painted my cooler is because other people care. To me it was fine just how it was.


----------



## WarEagleAU (Sep 19, 2009)

For all this back and forth, no one has brought up the main and most intriguing point: Ruby has dreads on the 5870!!!!


----------



## p_o_s_pc (Sep 19, 2009)

WarEagleAU said:


> For all this back and forth, no one has brought up the main and most intriguing point: Ruby has dreads on the 5870!!!!



i didn't even notice that. i think she looks sexy on the 5850


----------



## Bo_Fox (Sep 19, 2009)

1Kurgan1 said:


> Did to what lady?



Wow, I can't believe that it's still hurting Sapphire right now, with this "awareness" going on..

To answer your Q, Kurgan, it's that lady over at [H]forums..   Sapphire customer service murdered her!!!   


just kidding..  but it was a huge issue a year ago or 2..


----------



## phanbuey (Sep 19, 2009)

newtekie1 said:


> No no no, your still missing the point.  I know the HD4850 is weaker than the HD4870.  I'm strickly talking about the core.  Traditionally, ATi has used the same core configuration, but a weaker memory setup on their second from the top card.  They have changed that and are now using a weaker core, with the same memory configuration.



what about the 3870 and 3850 - they were literally identical cards... with the same memory interface... (although I think the 3850 was 256MB standard and the 3870 was 512MB)

the only DX10 series with a weaker memory config was the 4xxx series.  Usually they go with a weaker (manufacturing-wise) core and cheaper construction and power circuitry.


----------



## Maban (Sep 19, 2009)

Would be nice to see a real 5850 and not just a shortened photoshopped 5870.


----------



## PP Mguire (Sep 19, 2009)

Wow, I was considering a Sapphire 5870, but damn that's ugly. I think I'll wait till the better-looking cards come out.


----------



## Flyordie (Sep 19, 2009)

Bo_Fox said:


> Wow, I can't believe that it's still hurting Sapphire right now, with this "awareness" going on..
> 
> To answer your Q, Kurgan, it's that lady over at [H]forums..   Sapphire customer service murdered her!!!
> 
> ...





p_o_s_pc said:


> i didn't even notice that. i think she looks sexy on the 5850





1Kurgan1 said:


> Did to what lady?



http://www.hardforum.com/showthread.php?t=1241346
--
Also, for those of you who said the gal on the cooler was sexy... well... what about this girl...


----------



## erocker (Sep 19, 2009)

PP Mguire said:


> Wow, I was considering a Sapphire 5870, but damn that's ugly. I think I'll wait till the better-looking cards come out.



I'm painting mine pink and drab green. It looks fine from the side anyways.


----------



## PP Mguire (Sep 19, 2009)

You should wholesale those and sell me one =)


----------



## btarunr (Sep 19, 2009)

newtekie1 said:


> Not necessarily.  The HD2900XT comes with 100+ GB/s memory bandwidth, should something faster than that also have higher bandwidth...



HD 4000 series needed the 100+ GB/s memory bandwidth, evident from memory speeds enhancing performance. Same applies to HD 5850. Using the very same logic, a GeForce GTX 260 that is as fast as an HD 4870 doesn't need 111 GB/s then. And the GTX 285, which is as fast as an HD 4850 X2, doesn't need all that bandwidth either.


----------



## lemonadesoda (Sep 19, 2009)

erocker said:


> I'm painting mine pink and drab green. It looks fine from the side anyways.



Pink drag-queen edition? Might sell...


----------



## Maban (Sep 19, 2009)

lemonadesoda said:


> Pink drag-queen edition? Might sell...



I'd buy it.


----------



## wahdangun (Sep 19, 2009)

or pink pony edition XXX


----------



## Velvet Wafer (Sep 19, 2009)

i hope this card will be out soon. the jokes are starting to get really lame


----------



## TheMailMan78 (Sep 19, 2009)

They could sell a "Save the TaTa's" edition GPU to raise money to fight breast cancer. One thing nerds love is some tits. Ati could sell them in sets of two. Hell I'd pay extra for that and put save the titties on my desktop.


----------



## Kovoet (Sep 19, 2009)

LOL can't believe you guys and girls worry about the colours of the actual card.


----------



## Zubasa (Sep 19, 2009)

newtekie1 said:


> Not necessarily.  The HD2900XT comes with 100+ GB/s memory bandwidth, should something faster than that also have higher bandwidth...
> 
> Yet the next two generations after that had cards with less memory bandwidth that were easily faster.  Memory bandwidth doesn't need to increase with new cards.  The HD5800 series might really benefit from it, or maybe ATi even removed the GDDR3 memory controller from the core, forcing GDDR5 use.  If that was the case then cutting down the core was necessary.  But we don't know, and that is why I'm asking.


You really should not bring up the 2900XT as an example, because we all know that ATi mispredicted that design pretty badly.  It is really only about as fast as a 3870.

I can tell you right now that the RV770 on a 4850 is quite limited by its memory bandwidth.


----------



## newtekie1 (Sep 19, 2009)

phanbuey said:


> what about the 3870 and 3850 - they were literally identical cards... with the same memory interface... (although I think the 3850 was 256MB standard and the 3870 was 512MB)
> 
> the only DX10 series with a weaker memory config was the 4xxx series.  Usually they go with a weaker (manufacturing-wise) core and cheaper construction and power circuitry.



No they weren't, the HD3870 used GDDR4 and the HD3850 used GDDR3.

The last time we saw identical cards was the HD2900XT and HD2900Pro, and eventually ATi changed the HD2900Pro to use a 256-bit bus instead of the 512-bit like the HD2900XT.



btarunr said:


> HD 4000 series needed the 100+ GB/s memory bandwidth, evident from memory speeds enhancing performance. Same applies to HD 5850. Using the very same logic, a GeForce GTX 260 that is as fast as a HD 4870 doesn't need 111 GB/s then. And GTX 285 which is as fast as HD 4850 X2 doesn't need all that bandwidth either.



When have memory speeds NOT enhanced performance?  Never... so your logic is flawed.  Yes, lowering the memory bandwidth by moving to GDDR3 would have hurt performance, but so does crippling the core.  The whole point of BOTH is to hinder performance.  ATi has never used this strategy to hinder performance right out of the starting block.  The cards don't need GDDR5 to perform well.


----------



## btarunr (Sep 19, 2009)

newtekie1 said:


> When have memory speeds NOT enhanced performance?  Never... so your logic is flawed.  Yes, lowering the memory bandwidth by moving to GDDR3 would have hurt performance, but so does crippling the core.  The whole point of BOTH is to hinder performance.  ATi has never used this strategy to hinder performance right out of the starting block.  The cards don't need GDDR5 to perform well.



Then your argument is flawed, because you acknowledge that the faster the memory, the better. Today's performance GPUs need over 100 GB/s of memory bandwidth to remain competitive. HD 4850 with its GDDR3 memory was competitive with G92-based GPUs, but not with NVIDIA GPUs with over 100 GB/s of memory bandwidth. RV770/RV790 products with GDDR5 were. So once again, AMD could not carve a Pro SKU with GDDR3 memory, because there's no way you end up with sufficient bandwidth on a 256-bit-wide interface.
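For a rough feel of the numbers in this back-and-forth, peak memory bandwidth is just per-pin data rate times bus width. Here is a back-of-the-envelope sketch (the reference-card memory clocks come from the public spec sheets; the helper name is our own, not any tool's API):

```python
# Peak theoretical memory bandwidth in GB/s:
# per-pin data rate (MT/s) x bus width (bits) / 8 bits-per-byte / 1000.
# GDDR3 transfers 2 bits per pin per clock; GDDR5 transfers 4.

def bandwidth_gbps(mem_clock_mhz, bus_width_bits, transfers_per_clock):
    data_rate_mtps = mem_clock_mhz * transfers_per_clock
    return data_rate_mtps * bus_width_bits / 8 / 1000

hd4850_gddr3 = bandwidth_gbps(993, 256, 2)  # ~63.6 GB/s
hd4870_gddr5 = bandwidth_gbps(900, 256, 4)  # 115.2 GB/s
```

On the same 256-bit interface, GDDR3 lands well under the 100 GB/s mark while GDDR5 clears it, which is the crux of the argument above.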


----------



## newtekie1 (Sep 19, 2009)

btarunr said:


> Then your argument is flawed, because you acknowledge that the faster the memory, the better. Today's performance GPUs need over 100 GB/s of memory bandwidth to remain competitive. HD 4850 with its GDDR3 memory was competitive with G92-based GPUs, but not with NVIDIA GPUs with over 100 GB/s of memory bandwidth. RV770/RV790 products with GDDR5 were. So once again, AMD could not carve a Pro SKU with GDDR3 memory, because there's no way you end up with sufficient bandwidth on a 256-bit-wide interface.



I disagree, the HD4850 wasn't crippled just because of memory; it was also clocked a fair bit lower on the core/shader clock.  That drastically affected performance.  The HD4870/4890 had issues competing with nVidia's GPUs with over 100 GB/s, and it had plenty of memory bandwidth.  The over-100MHz downclock on the HD4850 was more of a performance factor than the GDDR3 memory bandwidth.

I don't believe that a HD5850 with GDDR3 would perform that much worse than the incarnation we are seeing here.


----------



## wahdangun (Sep 19, 2009)

but i really don't care whatever strategy ATI uses; as long as it performs well and has a reasonable price (HD 5850), i will buy it.


----------



## pantherx12 (Sep 19, 2009)

Shitty cards, but here's an example of memory giving a performance boost:

HD4350 vs HD4550: the only difference is memory type, and you get quite a nice boost.


----------



## Mistral (Sep 19, 2009)

newtekie1, as much as I love the way you manage to contradict yourself, I'm starting to wonder what exactly the point of your last few posts is...

Sure, faster/wider memory is nicer; sure, clocking the core higher makes it faster. Also sure is that the people at DAAMIT who designed the cards have at least a vague idea of what to do in order to hit the market segments they need to. And that is what it's all about.


----------



## lemode (Sep 19, 2009)

I tend to stay away from Gigabyte, Sparkle, PNY, MSI, and Sapphire graphics cards, only because every gamer I’ve ever been friends with has had issues with these manufacturers.

I’ve bought ASUS, HIS, EVGA, BFG, and XFX, and haven’t had a problem with any.


----------



## dr emulator (madmax) (Sep 19, 2009)

arguing with the news editor is a big :shadedshu 
i just hope it lasts more than five minutes
as i want a card with good performance and longevity


----------



## a_ump (Sep 19, 2009)

newtekie's original point or question makes sense to me, and I'd have to say you can't answer yes or no with what we know. But what was stated about 40nm yield issues makes sense. I'm sure there had to be some defective RV770 cores when they started manufacturing, yet they didn't use those at the start for a crippled HD 4850 with the same memory subsystem like they're doing with the HD 5850. Instead, I'd assume they kept the defective RV770 cores for when they released the HD 4830. There's nothing wrong with speculation or curiosity. 

The HD 5850 does look better than the HD 5870, but meh, I'll probably take the sticker off. Never know, might lower the core temp by 1 degree


----------



## p_o_s_pc (Sep 19, 2009)

Flyordie said:


> http://www.hardforum.com/showthread.php?t=1241346
> --
> Also, For those of you who said the gal on the cooler was sexy... welll....... what about this girl...
> http://i32.photobucket.com/albums/d20/Flyordie07/Rage3D/PB132570.jpg



she isn't bad. I like the one on the 5850 better


----------



## F430 (Sep 20, 2009)

WOW


----------



## grunt_408 (Sep 20, 2009)

lol @ wizzard, mon. Is GPU-Z working with the 5xxx series yet? I am confused...
Is this real? Is the one above real, wtf? The CPU scores look a little skew-whiff


----------



## kylzer (Sep 20, 2009)

Well i know the GPU-Z in f430's pic is fake or modded somehow

i'm sure they're both fake tbh.

EDIT wait?

that's a HD4890 lol


----------



## grunt_408 (Sep 20, 2009)

kylzer said:


> Well i know the GPU-Z in f430's pic is fake or modded somehow
> 
> i'm sure they're both fake tbh.
> 
> ...



I know about the 4890, 
I was curious about the 5xxx GPU-Z screeny and posted up 4890 results for comparison.
I remember reading here on TPU that Wiz has not yet released a version of GPU-Z supporting the 5xxx series cards.

P.S I see your sig is fake too 

Going back on topic now, though, those cards do look great. I wonder why the vent on the back is so small?


----------



## kylzer (Sep 20, 2009)

Craigleberry said:


> I know about the 4890,
> I was curious about the 5xxx GPU-Z screeny and posted up 4890 results for comparison.
> I remember reading here on TPU that Wiz has not yet released a version of GPU-Z supporting the 5xxx series cards.
> 
> P.S I see your sig is fake too



Yeah i thought he said that too

http://forums.techpowerup.com/showpost.php?p=1552086&postcount=716

and yes ^_^ premature i guess


----------



## grunt_408 (Sep 20, 2009)

lol it doesn't hurt to have some forward planning for the future. at least when you own one you will have no need to change your sig


----------



## btarunr (Sep 20, 2009)

newtekie1 said:


> I disagree, the HD4850 wasn't crippled just because of memory; it was also clocked a fair bit lower on the core/shader clock.  That drastically affected performance.  The HD4870/4890 had issues competing with nVidia's GPUs with over 100 GB/s, and it had plenty of memory bandwidth.  The over-100MHz downclock on the HD4850 was more of a performance factor than the GDDR3 memory bandwidth.
> 
> I don't believe that a HD5850 with GDDR3 would perform that much worse than the incarnation we are seeing here.



Of course memory isn't the only thing that downscales HD 4850. Don't think people are so stupid that you can jump from one component to another just to show that the one component we're debating doesn't cripple the GPU as much, and hence that you should be right. 

And I do believe that GDDR3 would have crippled HD 5850. It would not have been able to target GTX 285 with sub-100 GB/s memory bandwidths.


----------



## stevednmc (Sep 20, 2009)

here's my question... will getting a single 5870 be better than my 2 4850's in crossfire? Cuz if that's the case... i want one. and then later, another one!


----------



## air_ii (Sep 20, 2009)

I think it could be poorer yields that made ATI harvest 5850 dies. It could also be that the 5870 is so bandwidth-limited that a lower-clocked 5850 would not be that much slower and would cannibalise the 5870. Either way, it doesn't matter to prospective 5850 buyers.


----------



## air_ii (Sep 20, 2009)

stevednmc said:


> here's my question... will getting a single 5870 be better than my 2 4850's in crossfire? Cuz if that's the case... i want one. and then later, another one!



I'm pretty sure it would. And it would eat half the power as well.


----------



## btarunr (Sep 20, 2009)

stevednmc said:


> here's my question... will getting a single 5870 be better than my 2 4850's in crossfire? Cuz if that's the case... i want one. and then later, another one!



Yes it will.


----------



## stevednmc (Sep 20, 2009)

True enough, it would also run cooler in my case. Of course eventually I'd have to get another one to run in crossfire, just because, well, because I can!


would love to see some benchmarking for sure.


----------



## newtekie1 (Sep 20, 2009)

btarunr said:


> Of course memory isn't the only thing that downscales HD 4850. Don't think people are so stupid that you can jump from one component to another just to show that the one component we're debating doesn't cripple the GPU as much, and hence that you should be right.
> 
> And I do believe that GDDR3 would have crippled HD 5850. It would not have been able to target GTX 285 with sub-100 GB/s memory bandwidths.



Well, the only people who can really answer that are the engineers at ATi.  We've both made our beliefs clear, but the fact is that neither will be confirmed unless they release a GDDR3 version down the road (or someone downclocks the HD5870's memory to GDDR3-level bandwidth and runs tests; maybe we can get W1z to do it).

And I really don't think targeting the GTX285 is intelligent in any way, and I really hope that is not what they had in mind when making these cards.


----------



## OneCool (Sep 20, 2009)

Love those fake ass stickers


----------



## Woody112 (Sep 20, 2009)

Don't think I'm going to buy another ATI card till they up the memory bandwidth. Imagine what this thing could do with a 512-bit bus. I've pretty much always owned ATI, but I'm going to hold out for Nvidia's answer to this card. They will be using GDDR5, and with their larger bus width it's going to be a killer for sure. Just my opinion, but I think ATI screwed up sticking with the traditional 256-bit bus.


----------



## wolf (Sep 20, 2009)

stevednmc said:


> here's my question... will getting a single 5870 be better than my 2 4850's in crossfire? Cuz if that's the case... i want one. and then later, another one!



From the 'leaked' specs and what is on paper, yes, it should perform slightly better than two 4850's in CF scaling perfectly. But that's on paper, and nobody knows for sure yet.

Like usual, time will tell, you only need wait a few days now.


----------



## stevednmc (Sep 20, 2009)

I'm thinking it should be a pretty good increase, especially since I have the 512MB versions of the 4850. Well, I'll sure be watching this closely.


----------



## CDdude55 (Sep 20, 2009)

While you guys are getting these, i'll be getting a GTX 260.

I am a gamer, so it's not like it matters that much which card i get at 1440x900.


----------



## Anonimo (Sep 21, 2009)

Woody112 said:


> Don't think I'm going to buy another ATI card till they up the memory bandwidth. Imagine what this thing could do with a 512-bit bus. I've pretty much always owned ATI, but I'm going to hold out for Nvidia's answer to this card. They will be using GDDR5, and with their larger bus width it's going to be a killer for sure. Just my opinion, but I think ATI screwed up sticking with the traditional 256-bit bus.


Oh god, stop saying "it would be much better with more bandwidth", "BIG CHIPS NEED BIG BANDWIDTH" or "MOAR BITS IS FOR THE WIN".
No one here knows how more bandwidth would affect the performance. Also, ATI engineers are not dumb; they built the chip, they know how much bandwidth it needs.

About the Radeons, I think this cooler is more beautiful:


----------



## inferKNOX (Sep 21, 2009)

I wonder how the Powercolor PCS+ coolers will be.
I hope they make them vent the heat out of the case unlike their previous cards.


----------



## Bjorn_Of_Iceland (Sep 21, 2009)

They crippled the core for future use. Since it's the only one out there for now, it makes sense. Now, if the GT300 ever outdoes it, it's full-core time for them at a competitive price.


----------



## D3M0N-G4M3R (Sep 21, 2009)

Images removed at request of Sapphire?? Why.....

Anyways, they do look pretty sweet, and a lot better than their predecessors


----------



## Kovoet (Sep 21, 2009)

Think I'll be building a second PC around this GFX card when it's out and using a new corsair case.


----------



## Tatty_One (Sep 21, 2009)

Woody112 said:


> Don't think I'm going to buy another ATI card till they up the memory bandwidth. Imagine what this thing could do with a 512-bit bus. I've pretty much always owned ATI, but I'm going to hold out for Nvidia's answer to this card. They will be using GDDR5, and with their larger bus width it's going to be a killer for sure. Just my opinion, but I think ATI screwed up sticking with the traditional 256-bit bus.



In practical terms.... GDDR5 effectively doubles the bandwidth, so in essence you are getting a 512-bit bus; there is absolutely no need for this card to have the effective bandwidth of a 1024-bit bus by giving it a physical 512-bit one.
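The doubling point above is easy to check with simple arithmetic: at the same memory clock, GDDR5 moves twice as many bits per pin as GDDR3 (4 transfers per clock versus 2), so a 256-bit GDDR5 bus matches a 512-bit GDDR3 one. A quick sketch (the 1000 MHz clock is just a hypothetical round number, not any card's spec):

```python
# Peak bandwidth (GB/s) = per-pin data rate (MT/s) x bus width (bits) / 8 / 1000.
# GDDR3 is double data rate (2 transfers/clock); GDDR5 is effectively quad (4/clock).

def bandwidth_gbps(mem_clock_mhz, bus_width_bits, transfers_per_clock):
    return mem_clock_mhz * transfers_per_clock * bus_width_bits / 8 / 1000

clock_mhz = 1000  # hypothetical round number
gddr3_512bit = bandwidth_gbps(clock_mhz, 512, 2)  # 128.0 GB/s
gddr5_256bit = bandwidth_gbps(clock_mhz, 256, 4)  # 128.0 GB/s, same figure
```

Same peak number, but the 256-bit board needs half as many memory traces, which is the cost argument.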


----------



## phanbuey (Sep 21, 2009)

Tatty_One said:


> In practical terms.... GDDR5 effectively doubles the bandwidth, so in essence you are getting a 512-bit bus; there is absolutely no need for this card to have the effective bandwidth of a 1024-bit bus by giving it a physical 512-bit one.



yeah, the extra cost of a 512-bit bus prolly isn't worth it $/performance-wise.

I am sure that this was the optimal configuration all things considered. I mean, I doubt the engineers chose 256-bit with GDDR5 for sh**s and giggles.


----------



## TheMailMan78 (Sep 21, 2009)

Tatty_One said:


> In practical terms.... GDDR5 effectively doubles the bandwidth, so in essence you are getting a 512-bit bus; there is absolutely no need for this card to have the effective bandwidth of a 1024-bit bus by giving it a physical 512-bit one.



Knowledge has been dropped.


----------



## mdm-adph (Sep 21, 2009)

Tatty_One said:


> In practical terms.... GDDR5 effectively doubles the bandwidth, so in essence you are getting a 512-bit bus; there is absolutely no need for this card to have the effective bandwidth of a 1024-bit bus by giving it a physical 512-bit one.





TheMailMan78 said:


> Knowledge has been dropped.



But... but... but... the Nvidia cards have a bigger number!


----------



## suraswami (Sep 21, 2009)

did anyone notice the misprint on the box of X2 240e saying 'True Quad Core design'? 

check that asian website


----------



## Tatty_One (Sep 21, 2009)

mdm-adph said:


> But... but... but... the Nvidia cards have a bigger number!



5,875 versus 300 ......... I think not


----------

