
Sapphire Radeon HD 5870 and HD 5850 Smile for the Camera

I'm just trying to figure out why ATi is changing their strategy all of a sudden.
They heard about you, newtekie1. ATI's engineers' new mission in life is to confuse you. Billions will be spent on this endeavor, and they will not stop until you become so confused that Björk finally makes sense and you support AMD.
 
I'm just trying to figure out why ATi is changing their strategy all of a sudden. I wouldn't be surprised if they are still having issues with 40nm, and that is part of the decision.

***Note: not a full quote.

Well, ATI has been doing it for quite some time.
The 4850 is crippled to some degree; however, all shaders are live.
The 4830 is crippled, with shaders removed.
The 4860 is crippled, with shaders removed; less crippled than the 4830, or crippled 4890s.

ATI rushed to get the 4xxx series out, while the 5xxx series has been a longer process, so they might have picked up more crippled cores due to a longer production run than the 4xxx series had at the start.

You might not get a clear answer on that from ATI, though. :P
 
***Note: not a full quote.

Well, ATI has been doing it for quite some time.
The 4850 is crippled to some degree; however, all shaders are live.
The 4830 is crippled, with shaders removed.
The 4860 is crippled, with shaders removed; less crippled than the 4830, or crippled 4890s.

ATI rushed to get the 4xxx series out, while the 5xxx series has been a longer process, so they might have picked up more crippled cores due to a longer production run than the 4xxx series had at the start.

You might not get a clear answer on that from ATI, though. :P

Yes, I realize that they have done it in the last generation, but not at first. However, they haven't done it at the onset of a product life cycle in a long time. And I mentioned the HD4850 having a crippled memory subsystem, and that is actually what I was expecting on the HD5850 also. The full core, with weaker memory.
 
I'm just trying to figure out why ATi is changing their strategy all of a sudden. I wouldn't be surprised if they are still having issues with 40nm, and that is part of the decision.

(Man -- always with the FUD, aren't you? Do you get paid for this? Just enjoy your Nvidia cards, and let ATI shine for a few months -- come on, give it a break...)

I think you're really looking too far into this. I don't think it's foundry problems -- I think it's more to do with ATI finally increasing its revenue and market share a bit, and finally having the time and money to engage in the same practices performed by other video card producers who have long enjoyed their time at the top.
 
(Man -- always with the FUD, aren't you? Do you get paid for this? Just enjoy your Nvidia cards, and let ATI shine for a few months -- come on, give it a break...)

I think you're really looking too far into this. I don't think it's foundry problems -- I think it's more to do with ATI finally increasing its revenue and market share a bit, and finally having the time and money to engage in the same practices performed by other video card producers who have long enjoyed their time at the top.

This week on Dawson's Creek.
 
Not useless, just crippled, the same way crippling the core would hinder it. I don't believe dropping to GDDR3 would have hurt it any more than disabling shaders did.

It would have since it needs memory bandwidth.

And even if it's harvesting "defective" cores, that's purely academic, not the consumer's concern at all. The consumer gets a warranty-backed product.
 
Oh, give over. You can be just as bad sometimes, capitalist running-dog. :laugh:

Yeah man, but to everyone. You seem to be stalking newtekie1.
 
Yeah man, but to everyone. You seem to be stalking newtekie1.

Not at all, but I do believe he's being paid to spread FUD.

I think there's a "preponderance of evidence." :laugh:
 
Ahhh... they look good.

I don't understand the fecking CGI girl though... what's the point? Crap logo.

What's the release date of the 5850? I forget...
 
Same day as the 5870 : ]
 
It would have since it needs memory bandwidth.

And even if it's harvesting "defective" cores, that's purely academic, not the consumer's concern at all. The consumer gets a warranty-backed product.

I support bt in that statement, and it adds to the cause.

Yes, I realize that they have done it in the last generation, but not at first. However, they haven't done it at the onset of a product life cycle in a long time. And I mentioned the HD4850 having a crippled memory subsystem, and that is actually what I was expecting on the HD5850 also. The full core, with weaker memory.


Disabling those cores does less damage than going down to GDDR3, since that would be like a 4850 with 4850X2 performance, and that card did suck, you know.
Low memory bandwidth limited it to 1680x1050.

As a reminder, 256-bit GDDR3 on high-end cards does about 73 GB/s, while 256-bit midrange GDDR5 does about 140 GB/s.
Going to 512-bit GDDR3 would make it more expensive than the 5870 and still deliver less performance.
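
Those two figures are just bus width times effective data rate; here's a rough back-of-the-envelope sketch of that arithmetic (the MT/s numbers are my assumptions for typical 256-bit GDDR3 and GDDR5 clocks of that era, not spec-sheet values):

```python
# Peak memory bandwidth = bus width (bits) / 8 * effective data rate (MT/s).
# The data-rate figures below are assumed "typical" values, not official specs.

def memory_bandwidth_gbs(bus_width_bits: int, data_rate_mtps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and effective transfer rate."""
    return bus_width_bits / 8 * data_rate_mtps / 1000

print(memory_bandwidth_gbs(256, 2300))  # ~73.6 GB/s: 256-bit GDDR3 at ~2.3 GT/s
print(memory_bandwidth_gbs(256, 4400))  # ~140.8 GB/s: 256-bit GDDR5 at ~4.4 GT/s
```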
 
And I..........Jizz in my PANTS
 
Does it have dual HDMI, or is one of those HDMI and one DisplayPort?
 
It would have since it needs memory bandwidth.

And even if it's harvesting "defective" cores, that's purely academic, not the consumer's concern at all. The consumer gets a warranty-backed product.

I don't think it is so much more special than previous GPUs that it needs 100 GB/s+ of memory bandwidth to be functional. Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.

And the cores being defective doesn't matter; as I've said, it is standard practice.

Yeah man, but to everyone. You seem to be stalking newtekie1.

Maybe if we don't feed him, he'll go back under his bridge...
 
I know that the look of the actual card should be the very last thing you worry about, but IMO this design is boring as hell.

The Sapphire Vapor-X coolers for the 4870 and 4890, now THAT is design!
 
Agreed ^^... I'm just waiting for the Vapor-X edition of the 5870, and by then the prices will have come down too. Win-win :toast:
 
I don't think it is so much more special than previous GPUs that it needs 100 GB/s+ of memory bandwidth to be functional. Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.

And the cores being defective doesn't matter; as I've said, it is standard practice.



Maybe if we don't feed him, he'll go back under his bridge...


Yes, it definitely would be bandwidth limited. Even the RV770 became bandwidth limited when equipped with GDDR3 (HD 4850), and when equipped with GDDR5 (HD 4870) the performance increased drastically (and it could compete with the GTX 260). So I say it's pointless to use GDDR3 for that amount of power.
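
To put rough numbers on that RV770 comparison, here's the same back-of-the-envelope math with the reference memory clocks as I recall them (treat the exact figures as approximate):

```python
# Same arithmetic applied to the two RV770 boards; the memory data rates are
# approximate reference values from memory, not measured figures.

def bw_gbs(bus_bits: int, mtps: float) -> float:
    return bus_bits / 8 * mtps / 1000  # bits -> bytes, MT/s -> GB/s

hd4850 = bw_gbs(256, 1986)  # GDDR3 at ~993 MHz (double data rate) -> ~63.6 GB/s
hd4870 = bw_gbs(256, 3600)  # GDDR5 at ~900 MHz (quad data rate)   -> ~115.2 GB/s
print(f"HD 4850: {hd4850:.1f} GB/s vs HD 4870: {hd4870:.1f} GB/s")
```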
 
I don't think it is so much more special than previous GPUs that it needs 100 GB/s+ of memory bandwidth to be functional. Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.

And the cores being defective doesn't matter; as I've said, it is standard practice.
The bandwidth really hinders the 4850's performance, as you can see from reviews of the 4830.
The 4830's core is both crippled and clocked lower, but it still manages to keep up with the 4850.
http://www.techpowerup.com/reviews/Powercolor/HD_4830/26.html
The 4850 is less than 10% faster due to its suckass bandwidth.

Not to mention the 4770, which can take on a 4850 simply because it is clocked higher. (And its memory clocks much better.)
The number of TMUs and shaders doesn't seem to matter as much as the number of ROPs; this might be due to the fact that most games are still optimized for Shader Model 3.0.
 
Well, comparing my reference Sapphire 4870 to these, I have to say the 5k series looks much better...

Now where are the benchmarks!!!! :cry:
 
Not that I would buy Sapphire ever again; that ahole GG is such a tit it has put me off Sapphire permanently.
 