
Sapphire Radeon HD 5870 and HD 5850 Smile for the Camera

lol, it doesn't hurt to have some forward planning for the future; at least when you own one you will have no need to change your sig :p
 
I disagree. The HD4850 wasn't just crippled because of memory; it was clocked a fair bit lower on the core/shader clock as well, and that drastically affected performance. The HD4870/4890 had issues competing with nVidia's GPUs even with over 100 GB/s, so it had plenty of memory bandwidth. The 100MHz+ downclock on the HD4850 was more of a performance factor than the GDDR3 memory bandwidth.

I don't believe that a HD5850 with GDDR3 would perform that much worse than the incarnation we are seeing here.

Of course memory isn't the only thing that holds the HD 4850 back. Don't think people are so stupid that you can jump from one component to another just to show that the one component we're debating doesn't cripple the GPU as much, and hence that you're right.

And I do believe that GDDR3 would have crippled the HD 5850. It would not have been able to target the GTX 285 with sub-100 GB/s memory bandwidth.
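For reference, peak memory bandwidth is just bus width times effective transfer rate, so the sub-100 GB/s ceiling for 256-bit GDDR3 is easy to sanity-check. A minimal sketch; the transfer rates below are rough figures for GDDR3 parts of that era, not exact specs:

```python
def bandwidth_gbps(bus_bits, gtps):
    """Peak theoretical memory bandwidth in GB/s: (bus width / 8) bytes * GT/s."""
    return bus_bits / 8 * gtps

# GDDR3 topped out around 2.2-2.5 GT/s effective, so on a 256-bit bus
# it caps out well below 100 GB/s:
print(bandwidth_gbps(256, 2.5))  # 80.0 GB/s

# The GTX 285 got past that with a 512-bit bus at a similar GDDR3 rate:
print(bandwidth_gbps(512, 2.5))  # 160.0 GB/s
```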
 
Here's my question: will getting a single 5870 be better than my 2 4850's in CrossFire? Cuz if that's the case, I want one. And then later, another one!
 
I think it could be poorer yields that made ATI harvest 5850 dies. It could also be that the 5870 is so bandwidth limited that a lower-clocked 5850 would not be that much slower and would cannibalise the 5870. Either way, it doesn't matter to prospective 5850 buyers.
 
Here's my question: will getting a single 5870 be better than my 2 4850's in CrossFire? Cuz if that's the case, I want one. And then later, another one!

I'm pretty sure it would. And it would use half the power as well.
 
Here's my question: will getting a single 5870 be better than my 2 4850's in CrossFire? Cuz if that's the case, I want one. And then later, another one!

Yes it will.
 
True enough, it would also run cooler in my case. Of course eventually I'd have to get another one to run in CrossFire, just because, well, because I can!

Would love to see some benchmarks for sure.
 
Of course memory isn't the only thing that holds the HD 4850 back. Don't think people are so stupid that you can jump from one component to another just to show that the one component we're debating doesn't cripple the GPU as much, and hence that you're right.

And I do believe that GDDR3 would have crippled the HD 5850. It would not have been able to target the GTX 285 with sub-100 GB/s memory bandwidth.

Well, the only people who can really answer that are the engineers at ATi. We've both made our beliefs clear, but the fact is that neither will be confirmed unless they release a GDDR3 version down the road (or someone downclocks the HD5870 to GDDR3-level memory bandwidth and does a test; maybe we can get W1z to do it).

And I really don't think targeting the GTX285 is intelligent in any way, and I really hope that is not what they had in mind when making these cards.
 
Love those fake ass stickers
 
Don't think I'm going to buy another ATI card till they up the memory bandwidth. Imagine what this thing could do with a 512-bit bus. I've pretty much always owned ATI, but I'm going to hold out for Nvidia's answer to this card. They will be using GDDR5, and with their larger bus width it's going to be a killer for sure. Just my opinion, but I think ATI screwed up sticking with the traditional 256-bit bus.
 
Here's my question: will getting a single 5870 be better than my 2 4850's in CrossFire? Cuz if that's the case, I want one. And then later, another one!

From the 'leaked' specs and what is on paper, yes, it should perform slightly better than two 4850's in CF scaling perfectly. But that's on paper, and nobody knows for sure yet.

As usual, time will tell; you only need to wait a few days now.
 
I'm thinking it should be a pretty good increase, especially since I have the 512MB versions of the 4850. Well, I'll sure be watching this closely.
 
While you guys are getting these, I'll be getting a GTX 260. :)

I am a gamer, so it's not like it matters that much which card I get at 1440x900.
 
Don't think I'm going to buy another ATI card till they up the memory bandwidth. Imagine what this thing could do with a 512-bit bus. I've pretty much always owned ATI, but I'm going to hold out for Nvidia's answer to this card. They will be using GDDR5, and with their larger bus width it's going to be a killer for sure. Just my opinion, but I think ATI screwed up sticking with the traditional 256-bit bus.
Oh god, stop saying "it would be much better with more bandwidth", "BIG CHIPS NEED BIG BANDWIDTH" or "MOAR BITS IS FOR THE WIN".
No one here knows how more bandwidth would affect performance. Also, ATI's engineers are not dumb; they built the chip, they know how much bandwidth it needs.

About the Radeons, I think this cooler is more beautiful:
[Image: Ati-Radeon-HD-5850-01.jpg]
 
I wonder how the Powercolor PCS+ coolers will be.
I hope they make them vent the heat out of the case unlike their previous cards.
 
They crippled the core for future use. Since it's the only one out there for now, that makes sense. Now, if the GT300 ever outdoes it, it's full-core time for them at a competitive price.
 
Images removed at request of Sapphire?? Why.....

Anyway, they do look pretty sweet, and a lot better than their predecessors.
 
Think I'll be building a second PC around this GFX card when it's out, using a new Corsair case.
 
Don't think I'm going to buy another ATI card till they up the memory bandwidth. Imagine what this thing could do with a 512-bit bus. I've pretty much always owned ATI, but I'm going to hold out for Nvidia's answer to this card. They will be using GDDR5, and with their larger bus width it's going to be a killer for sure. Just my opinion, but I think ATI screwed up sticking with the traditional 256-bit bus.

In practical terms, GDDR5 effectively doubles the bandwidth, so in essence you are already getting a 512-bit bus. There is absolutely no need for this card to have the effective bandwidth of a 1024-bit bus by giving it a physical 512-bit bus.
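The arithmetic behind that doubling claim can be sketched in a few lines. A rough sanity check, not official figures; the memory clock below is an approximation of the HD 5870's:

```python
# GDDR3 is double data rate (2 transfers per memory clock per pin), while
# GDDR5 effectively moves 4 transfers per memory clock. So at the same
# memory clock, a 256-bit GDDR5 bus matches a 512-bit GDDR3 bus.

def bandwidth_from_clock_gbps(bus_bits, mem_clock_ghz, transfers_per_clock):
    """Peak bandwidth in GB/s = (bus width / 8) bytes * clock * transfers/clock."""
    return bus_bits / 8 * mem_clock_ghz * transfers_per_clock

clock = 1.2  # GHz; roughly the HD 5870's memory clock

gddr5_256 = bandwidth_from_clock_gbps(256, clock, 4)
gddr3_512 = bandwidth_from_clock_gbps(512, clock, 2)

print(gddr5_256)  # ~153.6 GB/s
print(gddr3_512)  # same figure: the narrower GDDR5 bus keeps pace
```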
 
In practical terms, GDDR5 effectively doubles the bandwidth, so in essence you are already getting a 512-bit bus. There is absolutely no need for this card to have the effective bandwidth of a 1024-bit bus by giving it a physical 512-bit bus.

Yeah, the extra cost of a 512-bit bus, $/performance-wise, probably isn't worth it.

I'm sure this was the optimal configuration all things considered; I doubt the engineers chose 256-bit with GDDR5 for sh**s and giggles.
 
In practical terms, GDDR5 effectively doubles the bandwidth, so in essence you are already getting a 512-bit bus. There is absolutely no need for this card to have the effective bandwidth of a 1024-bit bus by giving it a physical 512-bit bus.

Knowledge has been dropped. :rockout:
 
In practical terms, GDDR5 effectively doubles the bandwidth, so in essence you are already getting a 512-bit bus. There is absolutely no need for this card to have the effective bandwidth of a 1024-bit bus by giving it a physical 512-bit bus.

Knowledge has been dropped. :rockout:

But... but... but... the Nvidia cards have a bigger number!
 
Did anyone notice the misprint on the box of the X2 240e saying 'True Quad Core design'? :laugh: :roll:

Check that Asian website :D
 