# ASUS EAH 5850 TOP DirectCU



## W1zzard (Feb 22, 2010)

ASUS has released what seems to be one of the best custom-designed HD 5850s. The card, which supports DirectX 11 and Eyefinity, is extremely quiet, offers better price/performance than the AMD reference design, comes with an overclock out of the box, and supports voltage control via SmartDoctor.



----------



## Delta6326 (Mar 9, 2010)

Wow, this really closes the gap between the 5850 and the 5870. That thing rocks.
Nice review, Wizz.


----------



## Sasqui (Mar 9, 2010)

Ditto, nice review. One thing I noticed: you listed the CCC limits:

CCC Overdrive Limits 
Core 1200 MHz 
Memory 1400 MHz 

I thought the CCC limits were set at 900/1300 (or does this card have a BIOS that ups that?)


----------



## WarEagleAU (Mar 9, 2010)

Awesome review and just a 10 dollar bump? Not bad at all. Now just to wait patiently for prices to come down after Nvidia rolls out their beast.


----------



## W1zzard (Mar 9, 2010)

Sasqui said:


> Ditto, nice review.  one thing I noticed, you listed the CCC limits:
> 
> CCC Overdrive Limits
> Core 1200 MHz
> ...



ccc limits are where the sliders END .. ie. the maximum clock you can set using ccc


----------



## crow1001 (Mar 9, 2010)

> No support for CUDA / PhysX



Really, is that a negative? By that logic, recent NV card reviews should have stated "no Eyefinity support" as a negative.


----------



## Sasqui (Mar 9, 2010)

W1zzard said:


> ccc limits are where the sliders END .. ie. the maximum clock you can set using ccc



Yes, and I distinctly recall them being 900/1300, unless the latest Cat's are higher?


----------



## heky (Mar 9, 2010)

I totally agree with you, crow1001. And "no ATI Stream support" should be added to Nvidia reviews.
No offense Wizzard, it's a great review, but this "No support for CUDA/PhysX" is just a load of crap.


----------



## W1zzard (Mar 9, 2010)

Sasqui said:


> Yes, and I distinctly recall them being 900/1300, unless the latest Cat's are higher?



you have an asus 5850 directcu? you are aware that these limits vary from card to card, right?


----------



## [H]@RD5TUFF (Mar 9, 2010)

heky said:


> I totally agree with you crow1001. And "no ATI Stream support" should be added to Nvidia reviews.
> No offense Wizzard, its a great review, but this No support for CUDA/Physx is just a load of crap.



Any card can do more than one monitor; there's a difference between multi-monitor support and support for GPU compute standards. It wouldn't be a big deal if ATI had a comparable technology, but they don't *yet*.


----------



## heky (Mar 9, 2010)

What are you talking about, what GPU standards? And there's a big difference between driving 2 monitors and 6. Oh, and ATI Stream is a comparable technology.


----------



## erocker (Mar 9, 2010)

heky said:


> What are you talking about, what GPU standards? And doing 2 monitors and 6 is a big difference. Oh and ATI Stream is a comparable technology.



I'm sorry, but as a user of both ATi and Nvidia, as it stands right now I'd rather see ATi use CUDA. The fact of the matter is, CUDA applications exist and work; ATi Stream, not so much. Once we actually start seeing Stream working I may change my mind, but as of right now it's merely something to download on their developer site. I can clearly understand why it is said in the review.


----------



## pantherx12 (Mar 9, 2010)

I agree it shouldn't be listed as a negative, since lots of open-source physics engines work on any GPU; PhysX is just one brand of physics simulation, not the only one available.

For example, in that game Just Cause 2 the physics work fine and dandy for me and behave the same way as for someone using an NV card (I've seen videos posted by Nvidia users), and I'm not even sure whether that runs on the GPU or just on the processor.

Listing the lack of a gimmick as a negative is odd.


----------



## lism (Mar 9, 2010)

I think the CUDA/PhysX thing was mentioned before on ATI cards. Wizzard hasn't answered any of the questions about why that counts as a negative for the card.


----------



## Sasqui (Mar 9, 2010)

W1zzard said:


> you have a asus 5850 directcu ? you are aware that these limits vary from card to card, right ?



Apparently not... for the 58xx series I did not know that. I assume that's the magical ASUS BIOS at work; I always assumed you needed additional software to breach the CCC limits.


----------



## Kitkat (Mar 9, 2010)

crow1001 said:


> Really, is that a negative? in recent past NV card reviews you should state " no eyefinity support " as a negative.



Yeah, I'm so tired of seeing that. It's an Nvidia technology; why would ATI need to support it? Nvidia ran off and made something and didn't invite anyone to compete by making it open or accessible, lol, so why is its absence a negative? On the Nvidia reviews, please put "no CrossFire support" as a negative then.


----------



## W1zzard (Mar 9, 2010)

i find it amazing that you guys (ati fanboys according to system specs?) have nothing to discuss in every review, except for one stupid line of text .. that has been discussed in almost every review before.


----------



## pantherx12 (Mar 9, 2010)

W1zzard said:


> i find it amazing that you guys (ati fanboys according to system specs?) have nothing to discuss in every review, except for one stupid line of text .. that has been discussed in almost every review before.




Stop being so presumptuous it makes you look very foolish.

That people are criticizing you at all is something you should take seriously. (Let alone that you imply people have mentioned/complained about it before; perhaps that's because it is indeed NOT a negative.)


PhysX IS just a brand and thus should not be treated as a positive OR a negative.


I take "fanboy"* as an insult and seeing as everyone else gets called out for personal insults you should too wizz.

I posted a perfectly valid comment, and rather than doing the right thing of maybe making a valid comment back, you just insult us all? I know this is your website and your forum, but can't you follow your own rules?

If the Answer is no then maybe TPU isn't the right forum for me.


If you could at least reply to this post in a more productive manner, that would be great.


*Along with dismissing the comments and not even bothering to counter some valid points.


----------



## W1zzard (Mar 10, 2010)

i have replied to the cuda/physx thing several times in previous reviews, just tired of seeing it every time


----------



## pantherx12 (Mar 10, 2010)

Can you at the least link me to one of your previous posts then so I can see it for myself?


----------



## heky (Mar 10, 2010)

I am 100% with pantherx12. Oh, and W1zzard, just because I own an ATI card doesn't mean I am an ATI fanboy. I used Nvidia before, and maybe will again someday, but that's not the point. It cannot be a negative that an ATI card doesn't support PhysX. And if I am not mistaken, PhysX is being dropped more and more these days.

And maybe (judging by your sig) you are just an Nvidia fanboy.  joke


----------



## [H]@RD5TUFF (Mar 11, 2010)

heky said:


> What are you talking about, what GPU standards? And doing 2 monitors and 6 is a big difference. Oh and ATI Stream is a comparable technology.



Name me one ATI Stream app, please. Even if you can, here is a whole range of uses already being done with CUDA. Sorry, but ATI Stream is just a fanboy talking point. Oh, and by the way, my Quadro-powered computer can and does do six monitors.

Just let it go; saying any more makes you look butt-hurt.



pantherx12 said:


> I agree it shouldn't be listed as a negative, since lots of open source physics work on any GPU, physx is just a certain brand of physics simulation not the only one available.
> 
> For example, that game just cause 2, the physics work fine and dandy for me and behave the same way as someone using an NV card ( seen videos posted by nvidia users) and I'm not even sure if that runs on the GPU, just on the processor.
> 
> Listing the lack of a gimmick as a negative is odd.



So does that mean we shouldn't list the lack of Eyefinity as a negative on Nvidia cards, since according to you "Listing the lack of a gimmick as a negative is odd"?

Moreover, I can confidently say the percentage of people that use Eyefinity isn't even half of those who use PhysX. Wouldn't that make Eyefinity more of a gimmick than PhysX?

Just let it go, you've lost.


----------



## LifeOnMars (Mar 11, 2010)

[H]@RD5TUFF said:


> Name me on ATI Stream app, please. Even if you can here is a range of uses that are already being done with CUDA. Sorry but ATI stream is just a fanboy talking point. Oh and by the way my Quadro powered computer can do six monitors.
> 
> Just let it go, saying anymore makes you look butt hurt.
> 
> ...



Does somebody have a few unresolved issues?


----------



## [H]@RD5TUFF (Mar 11, 2010)

LifeOnMars said:


> Does somebody have a few unresolved issues



*shrug* Whatever floats your boat.


----------



## LifeOnMars (Mar 11, 2010)

[H]@RD5TUFF said:


> *shrug* Whatever floats your boat.



Meh, shrugs, breaks wind, picks his nose......just stating the obvious


----------



## pantherx12 (Mar 11, 2010)

[H]@RD5TUFF said:


> So does that mean we shouldn't list lack of eyefinity, as a negative on Nvidia cards, since according to you "Listing the lack of a gimmick as a negative is odd."?
> 
> 
> .



You smoking crack, fella?

Did I say Eyefinity should be listed as a positive or a negative?

As the answer is no, I did not.

Eyefinity is cool, but it's designed with a specific use in mind; it's handy for certain people, but most people just use one screen.

So yes, I do think the lack of Eyefinity should not be listed as a negative either.

Why does everyone think I'm some sort of idiot fanboy all of a sudden?

I don't care about brands at all.

Hell, I do my clothes shopping at Primark!


----------



## heky (Mar 11, 2010)

[H]@RD5TUFF said:


> Name me on ATI Stream app, please. Even if you can here is a range of uses that are already being done with CUDA. Sorry but ATI stream is just a fanboy talking point. Oh and by the way my Quadro powered computer can and does do six monitors.
> 
> Just let it go, saying anymore makes you look butt hurt.
> 
> ...



First of all, CUDA/PhysX is not open source, which means it's not ATI's fault it doesn't work on ATI GPUs; it's Nvidia's fault, because Nvidia disables PhysX in its drivers.
Second of all, now that OpenCL is in the game, Nvidia's PhysX will be obsolete. Fact. Now go cry over it.

Oh, and you are running 6 monitors on one Quadro? Which model is that?


----------



## pantherx12 (Mar 11, 2010)

PhysX maybe; Nvidia, no way.

Both companies will continue to do fine for ages, as there are so many fanboys who just buy one brand of card rather than choosing the best card for the job.

This applies to both ATI users and NV users.

Fanboyism = fail


----------



## W1zzard (Mar 11, 2010)

heky said:


> First of all, CUDA/Physx is not open source, which means its not ATI´s fault it doesnt work on ATI gpus, its Nvidias fault, becouse it is disableing it in its drivers for Physx.
> Second of all, now that OpenCL is in the game, Nvidia with its Physx will be obsolete. Fact. Now go cry over it.



don't you agree that "can not run windows software" is a valid issue for apple mac ? the same as blackberry/nokia/google os has no iphone app store.

as soon as we see more widespread adoption of opencl, i'll be the most happy person to remove the no cuda remark. but if i looked at opencl right now i'd have to say "ati opencl drivers suck". look at all the stream sdk issues..


----------



## heky (Mar 11, 2010)

Sure, i meant Physx not Nvidia. Nvidia will be using OpenCL too. 

I agree with you on that W1zzard, i was just trying to make a point, that Physx is Nvidia only due to Nvidia.


----------



## Phxprovost (Mar 11, 2010)

W1zzard said:


> don't you agree that "can not run windows software" is a valid issue for apple mac ? the same as blackberry/nokia/google os has no iphone app store.
> 
> as soon as we see more widespread adoption of opencl, i'll be the most happy person to remove the no cuda remark. but if i looked at opencl right now i'd have to say "ati opencl drivers suck". look at all the stream sdk issues..



Apple Macs do run Windows, and Google does have its own app store. I have to agree; I don't think it should say that either. Honestly, it's like reviewing an Xbox 360 and listing "doesn't play PlayStation Network games" as a con. It would be different if it were open source, but it's not: it's a proprietary Nvidia API. From now on, are you going to have "doesn't support Eyefinity" on all Nvidia cards? Judging by the precedent you're setting, I think it should be there.


----------



## W1zzard (Mar 11, 2010)

i mention eyefinity as a plus in ati reviews, i dont think it is important enough to be mentioned as minus in nvidia reviews. what about nvidia 3d vision technology? not mentioned in either nvidia or ati review because it's a load of crap imo. nvidia has their own multi screen technology btw


----------



## Nelly (Mar 13, 2010)

Hi there, I really enjoyed reading this review, as well as many others.

Just something that seems to be an error: it says 940, then 970, then back down to 965...?





> I reached 890 MHz at 1.15V, *940* at 1.20V, *970* at 1.25V, *965* at 1.30V and *970* at 1.35V which is the maximum in SmartDoctor.


I know a lot of people with the reference ATI 5850/5870 have been using MSI Afterburner because it's supposed to be a better piece of software for overclocking; it would be interesting to see how far the card clocked using that instead.
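The quoted voltage steps can be sanity-checked in a few lines; the clock/voltage pairs below are the ones quoted above from the review, and the script itself is just an illustrative sketch:

```python
# Maximum stable core clock (MHz) reported in the review at each
# SmartDoctor voltage step (volts). Data quoted from the review text.
max_clock = {1.15: 890, 1.20: 940, 1.25: 970, 1.30: 965, 1.35: 970}

# Flag any step where raising the voltage LOWERED the stable clock.
volts = sorted(max_clock)
regressions = [v for prev, v in zip(volts, volts[1:])
               if max_clock[v] < max_clock[prev]]
print(regressions)  # -> [1.3], the dip in question
```

Only the 1.30 V step scales backwards, which is exactly the oddity being asked about.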


----------



## W1zzard (Mar 13, 2010)

yes, 1.30v was slightly less stable than 1.25V


----------



## erixx (Mar 27, 2010)

Just to add another POV: I apologize for my ignorance. I have been using Cyberlink PowerDirector's CUDA functions for a while; it's amazing how fast a video is created. As a soon-to-be ATI user, I wonder if there's something like that you can do with an ATI 5xxx card (I mean computational work that the ATI card can do for applications, apart from games and movie watching).


----------



## erixx (Mar 28, 2010)

http://www.cyberlink.com/products/powerdirector/faster-performance_en_US.html

haha, just learned that ATI Stream also helps Power Director, never cared to check that before.


----------



## st0mp (Apr 10, 2010)

*Performance per Dollar*

Nice review. BTW, I've got a question on performance per dollar: how do you figure it?
I was working on my own side note: I was taking the score at the resolution I wanted in each benchmark and the cost of the card. I know current prices have probably moved some since you did this review.

Would the math be ($ amount of the video card) divided by score, to figure out how much you're paying per point?

The reason is I have been thinking about tossing this ASUS 5850 onto my ASUS P5N32-E SLI Plus motherboard, to hold me over until I build my next i7 rig (about a year). But according to your performance per dollar, should I just toss in 2 250s until next year?

system specs atm
Asus p5n32-e sli plus motherboard
E6600 cpu oc 2.9
4 gig mem 
8800gts 640mb card
25.5 " monitor @ 1920x1200 and a 23" 1920x1080 
I'm big on getting the biggest bang for my buck.
I know my video card is old, and since I got the 2nd monitor plugged in you can see an overall system performance hit atm.
I play games like Modern Warfare 2, Bad Company 2, World in Conflict, Wings of Prey, BF2, and Operation Flashpoint; I'd like to be able to play current games with better FPS at better resolutions.


----------



## travva (Apr 10, 2010)

this is a fantastic card. i just acquired one a few days ago and i LOVE it. overclocks pretty well. i've got a second one coming to pair with it in crossfire, though it's not the TOP model, but that's ok. either way it seems to be a fantastic model thus far.


----------



## W1zzard (Apr 11, 2010)

st0mp said:


> would the math be ($ amount of video card) divided by score to figure out how much your paying per point?



score divided by price
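That one-liner can be made concrete with a quick sketch. The card names and numbers below are made up for illustration, not the review's data; the point is only the direction of the ratio:

```python
# Performance per dollar as defined above: score divided by price.
# Higher is better. The other formula proposed (price / score) is just
# the reciprocal - dollars per point, where LOWER is better - so both
# produce the same ranking, only reversed.
def perf_per_dollar(score: float, price_usd: float) -> float:
    return score / price_usd

# Hypothetical example numbers, not taken from the review:
cards = {"Card A": (100.0, 300.0), "Card B": (60.0, 150.0)}
for name, (score, price) in cards.items():
    print(f"{name}: {perf_per_dollar(score, price):.3f} points/$")
```

Here the cheaper hypothetical card wins on points per dollar even though its absolute score is lower, which is the trade-off the question is really about.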


----------



## shevanel (Apr 11, 2010)

I could see the "real point" of NV deciding to disable Physx Gpu acceleration on non-NV cards last year if there were actually enough games that used it..

I mean c'mon, *there are about as many GOOD PhysX games as there were GOOD Sega CD games*. Why would you want to create such a ruckus over PhysX and alienate certain gamers from the feature just because they won't play the game using the stuff you're trying to sell them?

If PhysX was all that, then there would actually be a solid reason to want to use the feature. Out of all the games I've played, I could hardly tell the difference between the Havok/Euphoria/Frostbite/Source engines and PhysX.

I take that back, I could tell a huge difference, because in the PhysX games my Nvidia GTX 275 would always take a slight FPS hit with PhysX enabled. They even recommend buying a second GPU just to run one feature in a game.

*Nobody is buying a GPU because it can or cannot do PhysX*. But ATI users would buy NV cards if Nvidia would at least allow a dedicated PhysX GPU to be used alongside the GPU they already have.

And anyone who even knows what a dedicated GPU is already knows where to score one cheap, or already has one. *So Nvidia knows it's not making much money on dedicated PhysX GPUs in ATI systems, because not enough people are going to buy a dedicated PhysX card to make it worth it for Nvidia to keep building on the tech.* Wake up: PhysX is dead. PhysX games are not impressive in the physics department, so why is it exclusive to Nvidia users only? Shameful, IMO; it's not for profit, so what does it matter? *I could see it if PhysX were a major selling point, the games that featured it were flying off the shelves, we had tons of cool PhysX benchmarks to play with (FluidMark is about as cool as child neglect), and PhysX was the talk all over the news and people loved it as much as iPods and Wiis because it was just too damn cool. But it isn't.*

What Nvidia did is about the same as slapping a bumper sticker on a brand-new Ferrari: it makes no sense! PhysX is not the reason people choose to buy or not buy a particular video card.

These days a buyer is worried about heat or noise, maybe power draw, or brute power regardless of heat/noise, or performance/$ vs overall price.

And look what we are given: something hot, noisy, power hungry, more expensive but only slightly more powerful than non-PhysX-capable video cards that really are capable. But hey, the damn thing will do PhysX stuff! It can make smoke move in Batman: AA, make a flag wave in the wind even inside a sealed-off building, or score a few extra thousand points in 3DMark Vantage; so let's block it on non-NV cards so people will say "DAMN! Look at them PhysX, I gotta have me one of them green things that even create real-life physics, blowin' the air across my desk like the paper you kick around in that PhysX-supported Batman: AA. Now this is truly a 3D experience!"

Thanks NVIDIA, you really know how to take the fun out of gaming. If I want to kick paper around in Batman: AA, or fart into a cloud of smoke and watch it blow to the wall, I either have to buy what you have to offer, or, if I want something close to the best thing the competition is selling, I must pay even more money because I need teh PhysX!

So, NO, I do not think lack of PhysX support should be a con. CUDA, sure, but PhysX shouldn't be a con, with or without being able to use it.



Obviously, as an Nvidia customer for over 6 years, I am still bitter about the disabling of PhysX alongside ATI cards. Just because I bought a product that was the better choice for me over what you're selling doesn't mean I shouldn't be able to kick the paper on the floor; it really distracts me, not being able to destroy a coffee cup with a batarang...

Man, I need to stop now or I'll be here forever. I will edit and summarize this later if it doesn't get deleted, but I just wanted to try and explain my thoughts on this.

It's like a restaurant: they have the right to refuse service to anyone, but they don't go around refusing service to people who also like to eat at other establishments. They only refuse service to people who are a nuisance or conflicting with business.

I guess it's safe to say we, the customers, are conflicting with the business of NV, since we do not always choose to use their stuff 100% of the time, no matter how many $499 video cards we've bought every several years.


----------



## FreedomEclipse (Apr 11, 2010)

QQ: they have the 6-pin plugs at the bottom of the card and not hanging off the bottom left-hand side. This card might have issues fitting inside mid-tower cases; my 4870s are already a tight fit inside my 902.

Bad move, Asus, bad move!


----------



## FreedomEclipse (Apr 11, 2010)

On a side note - notice how the 4870X2 still creams the 5850 in about 7 out of 10 gaming tests??

lol

-----

W1zzard - why wasnt bad company 2 added to the list of gaming benches??


----------



## Deleted member 24505 (Apr 11, 2010)

Nice review W1zzard.

Boy is that fan quiet.


----------



## W1zzard (Apr 11, 2010)

FreedomEclipse said:


> W1zzard - why wasnt bad company 2 added to the list of gaming benches??



fermi used all the power so i couldnt operate the time machine


----------



## MT Alex (Apr 17, 2010)

Thanks for this review.  I have been wondering about this card compared to a reference design, and this cleared things up.  Great job, fine sir.


----------



## freaksavior (Apr 24, 2010)

Does this have to use the 8pin? Or can I use dual 6 pins?


----------



## Profitcare (Apr 27, 2010)

Hi all

I think the review is not complete, because there was no information about VRM temperature. This parameter is the most important during overclocking. It is understood that the GPU temperature will be better than with the reference cooler, BUT WHERE IS THE VRM TEMPERATURE? It would have been easy to use GPU-Z to show us all the temperatures.
Dear friends, if somebody already has this card, please post a GPU-Z screenshot with all temperatures under load.


----------



## OnBoard (Apr 27, 2010)

Profitcare said:


> Hi all
> 
> I think that review is not complite because there was no eny information about VRM temperature. This parameter the most important during overclocking. It is understood that GPU temperature will be better then using reference one, BUT WHERE IS THE VRM TEMPERATURE. It was so easy to use GPU-Z to show us all temperatures.
> Dear frends if somebody has already have this card please make GPU-Z screen shot with all temperarures ander loading.



This card doesn't have Volterra chips on it -> no VRM temps. But it has a large row of regular MOSFETs, as you can see from the pictures. They'll stay cool enough; only those Volterra buggers run hot on aftermarket coolers, as they are difficult to cool due to their size. That applies to the reference-design 4870/90, 260/280, 5850/70 and so on.


----------



## D007 (Apr 27, 2010)

and ty once again Wiz, when ya gonna do a reference model overclocked? lol
Or did I miss it? XD


----------



## Profitcare (Apr 27, 2010)

OnBoard said:


> This card doesn't have volterra chips in it -> no VRM temps. But it has a large row or regular mosfets as you can see from the pictures. They'll stay cool enough, only those volterra buggers run hot on aftermarket coolers, as they are difficult to cool, due to their size. Applies to reference design 4870/90, 260/280, 5850/70 and so on.



Thank you very much for explaining. So this card has a cooler-running VRM design compared to the reference one, if I understand right? Is there additional info about it? Maybe there is another way to take the VRM temperature?
Another question: do you think it is an advantage for overclocking that it has one 8-pin power connector? Reference cards have 2x6-pin power but almost the same overclocking range.
As I can see, this card looks like the ideal non-reference card, or not? I am looking for two non-reference cards to use in CrossFire, because two reference cards with an Accelero or similar cooler plus fans are too big for my PC.
Thank you in advance, and sorry for my English.


----------



## OnBoard (Apr 27, 2010)

Profitcare said:


> Thank you very mach for explaining. So this card has lower temperature VRM design compare to referense one. If I understand wright? Is there editional infor about it. May be there is another vay to take VRM temperature?
> Another question: what do you think is it advantage that it has one 8 pin power connector, I mean for overclocking? Referense cards has 2x6 pin power but has almost the same overclocking ranges.
> As I can see this card looks like ideal non referense card, or not? I just looking for two non referense cards to use as CrossFire, because two referense with Accelero or Trad plus funs is two big for my PC.
> Thank you in advance and sorry for my English.



You can always use the finger method; I highly doubt it's even hot, meaning under 70C. Or infrared-thermo the VRM sink. But I wouldn't worry, as there are many phases and a heatsink.

This card draws more power per connector; Volterra is more energy efficient. The 8-pin is just a safety thing, so it doesn't run out of juice. It consumes less power than a 4870, for example, which is 2x6-pin.

And yes, it's pretty ideal, as it's easier to cool (well, already done) and still has voltage control. Here are a couple of posts for more info:

http://forums.techpowerup.com/showpost.php?p=1868759&postcount=222
http://forums.techpowerup.com/showpost.php?p=1868945&postcount=233


----------



## Profitcare (Apr 28, 2010)

OnBoard said:


> You can always use the finger method, highly doubt it's even hot, meaning under 70C. Or infrared thermo the VRM sink, but I wouldn't worry as there is many phases and a sink.
> 
> This card uses more power, volterra is more energy efficient. It's just a safety thing not to run out of juice. Consumes less power than a 4870 for example which is 2x6pin.
> 
> ...



Thank you. Useful posts, but they raise another question: there is info that this ASUS does not support MSI Afterburner. That is very bad, because I currently have a reference ASUS 5850, and to me SmartDoctor is a convenient utility with a clumsy design compared to MSI Afterburner.
On the other hand, some people say they can use MSI Afterburner with this card: http://www.newegg.com/Product/ProductReview.aspx?Item=N82E16814121375
Even though I know ASUS made a custom PCB design, it would be good to know the truth. And I still hope it is possible to use MSI Afterburner with this card.


----------



## travva (Apr 28, 2010)

MSI Afterburner does not work with this card at this time to adjust voltage. Everything else works, e.g. fan speed, shader clock, memory, etc.; just not voltage adjustment.


----------



## Profitcare (Apr 28, 2010)

What do you think, friends: what if we flash the BIOS from the reference ASUS onto the ASUS DirectCU? Maybe in that case we could adjust voltage using MSI Afterburner.


----------



## arroyo (Apr 28, 2010)

Don't do that, Profitcare. You will brick your card. The DirectCU has a totally different layout and voltage regulator. In the best case your PC would not boot; in the worst case you will fry your card.
Use SmartDoctor to adjust voltages.


----------



## Profitcare (Apr 28, 2010)

*arroyo*
Thank you for the warning. I have not bought it yet, but I am thinking about it. As I understand, SmartDoctor is the only way to adjust voltage, so I think this card will be the best to buy. Earlier I thought about the MSI 5850 Twin Frozr, but I have read a lot of bad user feedback on it compared to the ASUS DirectCU.

P.S. About voltage control:
*The GPU voltage is managed by a uP6208 voltage controller, which does support I2C software voltage control with up to ~1.8V.* - This was taken from the review. If the GPU voltage controller supports I2C, it could be adjusted by RivaTuner, because that utility uses the I2C bus to pass the VID to the GPU. As we know, MSI Afterburner was based on RivaTuner, so I think it could support this card, or maybe RivaTuner does.
What do you think about it, friends?


----------



## probey13 (Jul 31, 2010)

Hello everyone, I just got the non-TOP version of this card. The card sits right below the 120mm fan of my CPU heatsink, and I would like to add some more heatsinks (i.e. http://www.enzotechnology.com/bcc9.htm) to the backside of this card.

But I can't find any RAM chips on the backside of this card (see pic below). So I wonder, can I still apply those heatsinks? Would they fall off easily?


----------



## MT Alex (Jul 31, 2010)

The RAM is on the other side of the card, underneath the stock cooler. You won't be able to use those heatsinks in conjunction with your cooler; they are made to cover the MOSFETs and VRAM when an aftermarket cooler is used that specifically targets the GPU only.

Also, welcome to TPU


----------



## HossHuge (Aug 1, 2010)

probey13 said:


> Hello everyone, I just got non TOP version of this card. The card sits right below the 120mm fan of my cpu heatsink and I would like to add some more heatsink (i.e. http://www.enzotechnology.com/bcc9.htm) on the backside of this card.
> 
> But I can't find any ram chips on the backside of this card (see pic below). So I wonder can I still apply those heatsinks? Would they fall easily?



Most of the time they put them around the GPU.  Here are some pics of one card with chips on the front and back. And welcome to TPU as well....


----------



## probey13 (Aug 1, 2010)

Thanks for the info, you guys are great!
I guess I will stick with the stock cooler.


----------



## HossHuge (Aug 2, 2010)

probey13 said:


> Thanks for the info, you guys are great!
> I guess I will stick with the stock cooler.



That's the way we roll here at TPU....

Another way the guys here can help you if you have a problem is by filling in your system spec.  Go to User CP to do that.


----------



## Tatty_One (Aug 2, 2010)

I also have the non-TOP DirectCU version (I honestly can't think why anyone who overclocks would want to spend the extra on the TOP). At 1035MHz (1.275V) things get pretty hot, but not to the point where anything locks up. I have simply added an Antec spot cooler that can be positioned to blow cool air across the PCB underneath the shroud, which drops temps by around 8C.


----------



## Anarchy0110 (Aug 7, 2010)

Love this card a lot. Nice design; that GPU cooler looks like a sports car. Pairing 2, 3 or 4 cards will give amazing performance. I've loved ATI since they released the HD4850; before that, my last favorite card was the 8800GT.


----------



## Anderson1024 (Aug 8, 2010)

*EAH 5850 TOP Crossfire*

Mornin', hello to all. I have been reading through all the guys' posts, and the forum looks great, with some... let's just say interesting views. OK, to the point: I bought myself 2 of these cards, installed them, and they stutter and jerk. Is this the drivers? Because, according to me, running Crysis Warhead at 1980x1200 with 4x AA should not really be causing this setup to crap itself and call for mommy?! Now, I'm not a tech guru; I basically know the bare minimum needed to put a PC together, but I have not blown up anything since I started doing it a couple of years ago.

OKI system spec's if it will help:


OS Windows 7 64
Cpu - Q9300 2.6 Quad oc to 3.2 (stable for the last year)
Mobo - Blitz Extreme 775
Ram - 4Gig Corsair XmsIII Dominater 1333 oc to 1775
Cooling(CPU & MOBO) - Thermaltake Bigwater
Gpu - EAH 5850 Top 1g x 2
Psu - Corsair hx1000w
Display - Samsung 27 inch p2770h x 2
Keyboard - Logitech G15
Mouse - Logitech G9
Headset - Logitech G35
Drives - 5x Seagate 1T 1x Seagate 80g

All mashed together in a Thermaltake Shark case. When I say mashed, I do mean mashed: it was an ABSOLUTE cow to get those GPUs in there because of their length and the size of the case (not the biggest one out there). But now that they are in, airflow between them is pretty good because of the spacing of the PCIe slots on this board, and there are no heat issues, as the CPU and mobo are cooled pretty well by the watercooling setup. Still, this is my first dual-card install, so I might have made a mistake somewhere, SO I'D LOVE SOME HELP PLEASE. Thanks in advance, guys. (BTW, the cards use a 6- and an 8-pin power connector, and seeing as the instructions in the box and on the web are pretty much non-existent, I used both on each card, which seems to be the way to go as long as your PSU can handle it.)


----------



## Anderson1024 (Aug 8, 2010)

Anarchy0110 said:


> my last favorite card - 8800GT is



With you there. I only just replaced my GT setup; it was a sad, sad day. I'll keep it in the archive as the best ever.


----------



## Anarchy0110 (Aug 8, 2010)

Anderson1024 said:


> my last favorite card - 8800GT is





> With u there i only replaced my gt setup now was a sad sad day wil keep it in the archive as the best ever



Thumbs up. Even though it's old, it still performs great, doesn't it?


----------



## Anderson1024 (Aug 8, 2010)

Yip, it still runs everything, even Crysis, though not on high. But this card, to me, was the best ever from Nvidia, even better than the 8800GTS despite the lower spec. I'm gonna drop it into my wife's PC, seeing as she only runs WoW, StarCraft, and some other RPGs, and so far I have yet to meet the RPG this card cannot handle maxed out at 1980x1200.


----------



## trt740 (Oct 10, 2010)

Can anyone dump the BIOS on this card? I need it.

I'm looking for this card's BIOS: ASUS EAH5850 Dire...


----------

