Wednesday, September 9th 2009

AMD Cypress Graphics Accelerator Pictured

Here's the first sighting of a fully-assembled, upcoming AMD Cypress "Radeon HD 5870" accelerator. This photo-shoot comes just ahead of its unveiling to the press tomorrow. Here are our first thoughts on what we see:
  • The accelerator is unusually long for a single-GPU card from AMD. The company wouldn't splurge on aesthetics (especially by lengthening the PCB) if there were no need to. Apparently there is.
  • Connectivity options galore. With two DVI-D, and one each of DisplayPort and HDMI, AMD promises it can handle three display-heads per GPU.
  • The exposed components behind the GPU area indicate that the GPU is somewhat large.
  • History tells us that AMD uses a backplate only if it finds a real utility in it, such as cooling additional memory chips or VRM components. This card has a large, almost full-coverage backplate.
What surfaced months ago on sources such as ChipHell, and was ridiculed at the time as inaccurate, has finally taken shape. If anything, Cypress looks like it means business. Expect further details soon. Cypress is the codename for AMD's next-generation, DirectX 11-compliant graphics processor in the high-performance segment.

* Images removed at request of AMD *
Source: Tweakers.net
Add your own comment

188 Comments on AMD Cypress Graphics Accelerator Pictured

#101
HammerON
The Watchful Moderator
Either way ~ this will be interesting :)
Can't wait for the reviews!
Posted on Reply
#102
grunt_408
One cannot rely on rumour and innuendo. I'll believe it all when I see it, even the pictures.
Posted on Reply
#103
human_error
Ser-J: Whoever says it's going to be faster than the 4870X2 is crazy
Why? The 4870 was faster than the 3870X2 in a lot of real-world game tests (not synthetic benchmarks, but I don't play synthetic benchmarks every night). It wouldn't be unrealistic to believe the 5870 can be faster than the 4870X2 in real-world game testing. I guess we'll have to wait a couple of weeks for some decent reviews to come out before we can know for definite.
Posted on Reply
#105
Steevo
How they could just kill a man Nvidia
Posted on Reply
#106
jaydeejohn
The 3870X2 had faster clocks but fewer shaders, so if the 5870 has as many shaders and faster clocks and memory, it should outdo the 4870X2
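jaydeejohn's clocks-times-shaders reasoning amounts to a back-of-envelope FLOPS estimate. A minimal sketch of that arithmetic (the HD 5870's shader count and clock were still rumors at this point, and raw FLOPS is only a crude proxy for game performance):

```python
# Back-of-envelope theoretical shader throughput, assuming the usual
# 2 FLOPs per SP per clock (one multiply-add). HD 5870 figures are
# the rumored specs, not confirmed numbers.
def gflops(shaders, clock_mhz, flops_per_clock=2):
    return shaders * flops_per_clock * clock_mhz / 1000

cards = {
    "HD 3870":   gflops(320, 775),
    "HD 4870":   gflops(800, 750),
    "HD 4870X2": 2 * gflops(800, 750),  # two RV770s
    "HD 5870":   gflops(1600, 850),     # rumored specs
}
for name, g in cards.items():
    print(f"{name}: {g:.0f} GFLOPS")
```

On paper this puts the rumored 5870 (2720 GFLOPS) ahead of the 4870X2 (2400 GFLOPS), which is the crux of the argument; real-game scaling is another matter entirely.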
Posted on Reply
#107
Scrizz
jaydeejohn: The 3870X2 had faster clocks but fewer shaders, so if the 5870 has as many shaders and faster clocks and memory, it should outdo the 4870X2
wut? :wtf:
Posted on Reply
#108
tkpenalty
LAN_deRf_HA: ...Why is it so hideous? That is some of the cheapest-looking plastic I've ever seen on a PC part. Those red vents look like shit you'd see on a Happy Meal in the '90s. Even without them, the quality of the finish on the rest of it is absurdly sub-par. I'm officially considering this card nonexistent until they fix the quality issue. Maybe glossing up the main body and removing the vents would require the least redesign effort.
All I can say to your opinion is WOW. That finish is a matte finish. It's by no means cheaper or more expensive than a gloss finish, because all that differs is the mould the plastic is injected into. Personally I find gloss finishes hideous ;). Those vents look awesome, just like a Formula 1 car's sidepod intakes.

And anyway, what's so official about what you said there?
Posted on Reply
#109
Imsochobo
So people would rather try and buy a defective Nvidia card with a glossy cooler than buy a card that works.

It's the same with cars: Americans might love big engines and American cars, but they go with a Toyota because it doesn't break...

Nvidia has had a lot of issues lately; I wonder if they can get the G300 out in time with no bugs in it.
Posted on Reply
#112
Kei
I'm really loving the native HDMI support instead of using the converter that comes with the current-gen cards. I'm eager to see how well this card does... and the inevitable 5850 version when it's announced, as that's likely what I'll end up with if I buy a new-generation card.

My 4850 is still monster enough to do anything I want now, so there's no hurry, but native HDMI support and the extra power wouldn't be a bad idea at all. :)

Kei
Posted on Reply
#113
lemonadesoda
LAN_deRf_HA: ...Why is it so hideous? That is some of the cheapest-looking plastic I've ever seen on a PC part. Those red vents look like shit you'd see on a Happy Meal in the '90s. Even without them, the quality of the finish on the rest of it is absurdly sub-par. I'm officially considering this card nonexistent until they fix the quality issue. Maybe glossing up the main body and removing the vents would require the least redesign effort.
I agree it looks hideous and like a happy meal toy from the 90s.

BUT, some people like happy meal toys... and some people will like this hideous look.
Posted on Reply
#114
Zehnsucht
lemonadesoda: I agree it looks hideous and like a happy meal toy from the 90s.

BUT, some people like happy meal toys... and some people will like this hideous look.
Actually, the plastic looks about the same as our prototypes that have been created with a 3D printer. So don't expect this to be final.
Posted on Reply
#115
Tatty_Two
Gone Fishing
enaher: wow, that looks very powerful for $299; that thing runs hot, ATI would never use such a cooler unless it's really needed :rockout:
I'll still believe $299 when I see it, and especially 50% faster than a 4890 (I really hope it is, though :rockout:), but the cooling seems weak to me. Just look at the size of the fan: it's small, and with the hot air from the VRMs and memory having to travel up the length of that card and out the back, I'd imagine that unless the fan is on leaf-blower mode it's going to get warm in there! Why on earth can't both sides put a low-RPM angled 80 mm fan in there for better airflow and quietness? The design looks pretty minimalistic, though, and I like that... "flash" is so nineties :)
Posted on Reply
#116
a_ump
Yea, I remember seeing leaked pics that looked exactly like that, and we all denied it. It looks fine to me; could be better, but eh, I don't care, I just want it :). Though I thought it was going to be $299. I think if they priced it at $349 for the 5870 and $279 for the 5850 it'd be perfect. But I don't care, lol, I'm getting one as soon as it's released :D
Posted on Reply
#117
InnocentCriminal
Resident Grammar Amender
erocker: I like the red vents. :(
Same 'ere.
Posted on Reply
#118
Zubasa
a_ump: Yea, I remember seeing leaked pics that looked exactly like that, and we all denied it. It looks fine to me; could be better, but eh, I don't care, I just want it :). Though I thought it was going to be $299. I think if they priced it at $349 for the 5870 and $279 for the 5850 it'd be perfect. But I don't care, lol, I'm getting one as soon as it's released :D
The thing is, I don't see how aggressive that pricing is. :shadedshu
$299 for the 5870 and $199 for the 5850 will simply kick nVidia's ass and nuts up and down the stairs. :rockout:
Posted on Reply
#119
gumpty
Can someone please explain to me why those holes at the end of the card are meant to be intakes?

As I understand it, these blower fans draw air in from the top and push it radially outward in all directions (physical barriers can be used to direct that flow). So to me, those holes at the end seem more likely to be exhausts than intakes, which would also explain the small exhaust out the back.
The problem with this idea, though, is that the air venting out of those holes into the case isn't going to cool much on the board. Maybe the power circuitry?

Feel free to correct me on that though.
Posted on Reply
#120
Mussels
Freshwater Moderator
aj28: Would it blow your mind if I suggested the fan ran... backwards!? =O

Anyway, I am largely against the high-end coolers made by both ATi and nVidia. Arctic Cooling's excuse (when they stopped manufacturing them to exhaust out the back, their primary selling point for the longest time) had to do with the air pressure difference between the inside of the case and the outside. It's certainly a thought, to be sure, though I'm pretty sure few (if any) people on this forum (including myself) are qualified to do much more than speculate on it.

Regardless though, death to those fans!! Why aren't the blades longer, anyway? Especially if they're going to utilize the side (or better yet the bottom) for intake, they don't need to leave that much space between the blades and the bearing.
Most cases have a shit design and use negative air pressure. Every moron and his dog gets an erection over cases with huge 200 mm exhaust fans.

In a case with more air going out than in, the case will suck air in through the same hole the GPU fan is trying to blow out of, effectively making the card run hotter, or just pulling the hot air back in on either side of the card.
Posted on Reply
#121
lococol
I think I'll stick with my 4870X2 for now. As for looks, who cares as long as it performs; personally, I'd like to take all the cooling off and run it on water, then overclock it.
Posted on Reply
#122
Sihastru
If those 1600 SPs are in 80 groups of 20, like the 800 SPs of the 4870/90 were in 80 groups of 10, then my crystal ball says RV870 will only be 25-40% faster than RV770. In highly optimized games it could be higher.

The problem is that ATI's drivers had a hard time loading up all those groups of 10; imagine how difficult it will be with groups of 20. So older/current games will only feel a little tingling sensation and no real performance boost. Whatever.

So one 4870X2 will be better than one 5870. Kinda sad, really.

My crystal ball has been known to be wrong, but come tomorrow I will not need it for this particular problem.
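Sihastru's utilization argument boils down to: effective performance is peak throughput times how well the driver keeps the shader groups busy. A toy sketch of that model; every utilization figure below is invented purely for illustration, and only the peak-FLOPS ratio comes from the rumored specs:

```python
# Toy model of the utilization argument: effective speed = peak * utilization.
# All utilization figures below are invented for illustration only.
def effective_speedup(peak_ratio, old_util, new_util):
    """Speedup of the new part over the old, given the peak-throughput
    ratio and how well each part's shader groups are kept busy."""
    return peak_ratio * (new_util / old_util)

peak = 2720 / 1200  # rumored 5870 peak FLOPS over 4870 peak (~2.27x)

# If wider groups are harder for the driver/compiler to fill,
# utilization drops and eats into the paper advantage:
for new_util in (0.9, 0.75, 0.6):
    s = effective_speedup(peak, old_util=0.9, new_util=new_util)
    print(f"utilization {new_util:.2f}: {s:.2f}x over RV770")
```

In this model, relative utilization would have to fall to roughly 0.5-0.55 for the rumored part to land in the 25-40% range the comment predicts; whether it actually does is exactly what the reviews will show.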
Posted on Reply
#123
Bjorn_Of_Iceland
Sweet, but I'm still sticking with my GTX 200s... unless DX11 titles are out the same day that card comes out. Otherwise it's a waste of money and effort, like those DX10.1 cards.
Posted on Reply
#124
REVHEAD
tigger: For all I care, it could look like a 50-year-old hooker. It's the performance that matters to me.
Right on.:nutkick:
Posted on Reply
#125
wolf
Better Than Native
Zubasa: $299 for the 5870 and $199 for the 5850 will simply kick nVidia's ass and nuts up and down the stairs. :rockout:
I'll believe it when I see it; apparently every generation ATi releases will do this...

It's not about one brand kicking the other's ass; it's about both of them staying very competitive.

See how much you'd love ATi *if* they kicked Nvidia's ass so badly that Nvidia went bust: ATi would then be frowned upon as the corporate giant. Heck, even if the current market share were simply swapped, Nvidia would become that hugely loved underdog that competes on price rather than having the best of the best.

Competition is good; let's hope the price of either company's cards doesn't kick consumers' asses up and down the stairs.
Posted on Reply