# Leaked BENCHMARKS from the ATI 3870X2



## Ati Addictive (Jan 9, 2008)

The people of HotHardware have gotten their hands on the new ATI 3870X2 card.
As shown in the first photo, the card has two PCIe power connectors, a 6-pin and an 8-pin.
The length of the card doesn't look at all the same as the 8800 Ultra or GS cards I posted yesterday in another thread; it may well be longer than your motherboard.

A Chinese site has also leaked a 3DMark 2006 benchmark. The sample they used had two cores running at 775 MHz, while the 1 GB of memory did its job at 2250 MHz. The only benchmark they posted was run in 3DMark 2006 at a resolution of 2560x1600, where it scored 9573 points. (For those wondering, the score at 1280x1024 should be about the same as a 3870 in CrossFire mode.) Furthermore, the system contained an Intel C2D E6600 with 2 GB of DDR2 memory. I also have a photo with the cooler removed, so you can clearly see the PLX chip that exchanges data between the two GPUs.

For the original article go to http://my.ocworkbench.com/bbs/showthread.php?p=425415#post425415

Also, many thanks again to Hardware.info for getting us these nice previews of the card.


----------



## strick94u (Jan 9, 2008)

That's big. Green is a better color


----------



## Scrizz (Jan 9, 2008)

ehh...


----------



## Ati Addictive (Jan 9, 2008)

strick94u said:


> That's big. Green is a better color



I don't think you understand; the resolutions were much higher with that card than in your puny little benchmark! I don't really understand why you post here!
So green actually sucks compared to this

PS: ATI beat the hell out of Nvidia with their 3850 GPU



:::EDIT::: Well, strick94u has been posting here much longer than you, and seeing what your 43 posts usually contain, judging by the rudeness of this one I'd say
I don't understand why YOU post here. -Solaris17


----------



## phanbuey (Jan 9, 2008)

Ati Addictive said:


> PS: ATI beat the hell out of Nvidia with their 3850 GPU



Errrm... yes... with the 3850? Beat how? I don't understand... nVidia just got Forbes' Company of the Year because they slaughtered AMD on the market.  Also, both companies use green logos.

This X2 is awesome; I would definitely get it if I had a huge-res monitor (which I want), but the 9800GX2 is coming out at the same time and it will be 2 G92s in SLI. ATI is too late with this card... so in reality, it will come down to pricing.


----------



## Ati Addictive (Jan 9, 2008)

I do not know if ATI will be faster; we have to wait for that, but soon we can measure them.


----------



## zekrahminator (Jan 9, 2008)

10K in 3DMark06 at that huge resolution is awful impressive.


----------



## EastCoasthandle (Jan 9, 2008)

You also need to look at this:

What are people getting with a 3870 in CF with a Q6600 at 3.30 GHz using Vista?


----------



## Tatty_One (Jan 9, 2008)

strick94u said:


> Thats big, Green is a better color



Kinda faster in green too


----------



## Ati Addictive (Jan 9, 2008)

Riiight... Fake!


----------



## Tatty_One (Jan 9, 2008)

zekrahminator said:


> 10K in 3DMark06 at that huge resolution is awful impressive.



Agreed. TBH, NVidia's GX2 or this monster: if either works out as fast or faster than its 2-card SLI/XFire counterpart and is cheaper, then either would do me!


----------



## panchoman (Jan 9, 2008)

ah, the picture actually has memory on it this time.


----------



## Tatty_One (Jan 9, 2008)

EastCoasthandle said:


> you also need to look at this:
> 
> 
> 
> ...



Is that a 3870 Xfire score?....surely not.


----------



## EastCoasthandle (Jan 9, 2008)

Tatty_One said:


> Kinda faster in green too



You have a link?

2 GTX is SLI 17574


----------



## Ati Addictive (Jan 9, 2008)

It has to be, I guess, because here http://www.hardware.info/benchmarks/cmpx/view/
you can see that the systems reaching a score of 17000 have at least 2 cards. Also, it could easily be Photoshopped.
I don't trust the photo, sorry.


----------



## EastCoasthandle (Jan 9, 2008)

Tatty_One said:


> Is that a 3870 Xfire score?....surely not.



That's what is being alleged


----------



## Tatty_One (Jan 9, 2008)

EastCoasthandle said:


> You have a link?
> 
> 2 GTX is SLI 17574



Lol, I was playing with Strick94U's comment above mine, but don't bother quoting the GTX SLI score of 17574; I can more or less hit that with a single GTS 512MB!   I can only imagine those GTX's were severely bottlenecked, because I believe there are some 20,000+ scores in the 2006 thread, although I would love to see a couple of GTS 512MBs on, say, a quad at 4 gig.


----------



## Tatty_One (Jan 9, 2008)

EastCoasthandle said:


> That's what is being alleged



I actually thought that was too slow to be XFire, TBH; however, looking again I see that, to be fair, the quad was not clocked that high.


----------



## imperialreign (Jan 9, 2008)

I'm curious . . . how would CCC look at a card that has 2 GPUs on it?  Would it actually consider the card in xFire all the time, or would it just look at it like any other single GPU card?


----------



## mandelore (Jan 9, 2008)

lols, we are gonna get such massive cards we are gonna need to underclock them just to play at speeds we can keep up with


----------



## Ati Addictive (Jan 9, 2008)

imperialreign said:


> I'm curious . . . how would CCC look at a card that has 2 GPUs on it?  Would it actually consider the card in xFire all the time, or would it just look at it like any other single GPU card?



The PLX chip makes sure that CCC sees it as one card. The PLX chip exchanges data between the two GPUs, so other information too, I presume.


----------



## EastCoasthandle (Jan 9, 2008)

Tatty_One said:


> Lol, I was playing with Strick94U's comment above mine, but don't bother quoting the GTX SLI score of 17574; I can more or less hit that with a single GTS 512MB!   I can only imagine those GTX's were severely bottlenecked, because I believe there are some 20,000+ scores in the 2006 thread, although I would love to see a couple of GTS 512MBs on, say, a quad at 4 gig.



Here is a 640 GTS doing 5564 at 2560x1600 using XP.
And the GTX SLI result is a legitimate score using Vista Ultimate.  The R680 isn't OC'd and uses a moderate CPU OC (which you already figured out).


----------



## Tatty_One (Jan 9, 2008)

EastCoasthandle said:


> Here is a 640 GTS doing 5564 at 2560x1600
> And the GTX in SLI is a legitimate score using Vista Ultimate



Old gen....pfffftttt


----------



## EastCoasthandle (Jan 9, 2008)

Tatty_One said:


> Old gen....pfffftttt



New gen isn't that much higher, if at all, at 256-bit.
In any case, anyone else have any 2560x1600 scores?


----------



## Tatty_One (Jan 9, 2008)

I forgot to ask.....does speculation suggest these will be cheaper than two HD3870's??


----------



## yogurt_21 (Jan 9, 2008)

Tatty_One said:


> Lol, I was playing with Strick94U's comment above mine, but don't bother quoting the GTX SLI score of 17574; I can more or less hit that with a single GTS 512MB!   I can only imagine those GTX's were severely bottlenecked, because I believe there are some 20,000+ scores in the 2006 thread, although I would love to see a couple of GTS 512MBs on, say, a quad at 4 gig.



Lol, you can't compare stock to OC'd. Sure, your GTS can hit 17k overclocked, but a stock one won't, and stock GTS SLI won't either (as SLI doesn't scale as well as CrossFire). Also, at 3.3 GHz the cards are quite a bit more bottlenecked than your single. And to see a quad at 4 gig plus dual 8800GTS (G92), just go to Futuremark, lol.

http://service.futuremark.com/compare?3dm06=4608322
and I'd gather that these cards are far from stock as well.


----------



## Tatty_One (Jan 9, 2008)

EastCoasthandle said:


> New gen isn't that much higher if at all higher at 256-bit
> In any case anyone else have any 2560x1600 scores?



Most have the free version, I suppose, so probably not. And hmmmmmm, G92 GTS versus G80 GTS on the same system equates to over 4000 3DMark 2006 points if you overclock both cards to the max without hardmodding; that's pretty significant in my book.   Actually, not far off the difference between G80s in SLI!

Edit:  Think I missed something there; do I understand you right... you don't think two 8800GTS 512MBs at stock will hit 17000 in 3DMark 2006?  The closest I could find on HWbot with a quad doing 3.33 gig was on two factory-overclocked 8800GTS 512MB cards. There were a lot of GTS setups with MUCH higher scores than this, but as you said earlier, both cards and CPUs were massively overclocked. Here is a copy of the extract; agreed, the score would be a bit lower with the CPU running a little slower:

 3DMark 2006 - 17935 marks - cky2k6 (XtremeSystems) - 6.1 points
 Processor: Core 2 Q6600 (2.4 GHz) @ 3420 MHz
 Videocard: 2x GeForce 8800 GT 512 MB @ 683/1000 MHz
 Global Rank: 344th - 3.7 points
 Hardware Rank: 15th, 2x GeForce 8800 GT 512 MB - 2.4 points
 Description: Q6600 @ 3420 MHz - 2x 8800 GT 683/1000/1729 - Tuniq 120 & VF900
 Scan date: 28-12-2007 04:35
 Compare url: http://www.hwbot.org/compare.do?resultId=681273


----------



## Tatty_One (Jan 9, 2008)

yogurt_21 said:


> Lol, you can't compare stock to OC'd. Sure, your GTS can hit 17k overclocked, but a stock one won't, and stock GTS SLI won't either (as SLI doesn't scale as well as CrossFire). Also, at 3.3 GHz the cards are quite a bit more bottlenecked than your single. And to see a quad at 4 gig plus dual 8800GTS (G92), just go to Futuremark, lol.
> 
> http://service.futuremark.com/compare?3dm06=4608322
> and I'd gather that these cards are far from stock as well.



Couldn't agree more, but then again, that's one of the advantages of having a single card. It doesn't change the fact that in reality there is not a huge amount in it; on top of that, just look at the price comparison... but yes, you are right, though it's only fair to let the single card overclock if the competition has 2 cards!


----------



## Ati Addictive (Jan 10, 2008)

Tatty_One said:


> I forgot to ask.....does speculation suggest these will be cheaper than two HD3870's??




Count on a price between 300 and 350 euros, so yes.


----------



## yogurt_21 (Jan 10, 2008)

Yeah, I've no problem with that; I'd much rather run a single card than mess with dual. The dual-chip card war will really come down to how well they OC: CrossFire scales better, which would give the dual 3870 an initial lead, but since the G92-based GTSs overclock better, it might come out as the better value. Trouble is, neither side has launched a successful dual card previously, so... I don't really think in either game support/drivers or cost that these will be a better buy than just getting two separate cards, or better yet, one single card and a good watercooler.


----------



## Tatty_One (Jan 10, 2008)

Ati Addictive said:


> Count on a price between 300 and 350 euros, so yes.



That's a nice price; interesting to see what good old rip-off Britain will charge, though.


----------



## EastCoasthandle (Jan 10, 2008)

Source
If this is any indication of what we can expect, I understand why some are excited (return of competition) and why others are so worried (return of competition). Not because of its sheer power, but because of how competitive this card is while using less power: 40 watts less than GTs in SLI, and the gap will be even bigger against GTSs in SLI.

But let's look at the scaling:

Side note: to see this side by side you may need a resolution of 1680x1050 or higher, with your web browser maximized.
As you can see, the 3870 scales much better than its competitor.  Part of the success of the 3870 X2 will hinge on how good the CrossFireX drivers are in the end.  Time will tell.  However, by far, the whole design of the X2 is much more innovative than the GX2, IMO.


----------



## TonyStark (Jan 10, 2008)

Ati Addictive said:


> The people of HotHardware have gotten their hands on the new ATI 3870X2 card.
> As shown in the first photo, the card has two PCIe power connectors, a 6-pin and an 8-pin.
> The length of the card doesn't look at all the same as the 8800 Ultra or GS cards I posted yesterday in another thread; it may well be longer than your motherboard.
> 
> ...




Thread title is misleading. "BENCHMARKS" should be changed to "single 3DMark score" ....


----------



## imperialreign (Jan 10, 2008)

@EastCoast - thanks for that post, man!  It's nice being able to actually see the comparative numbers side by side for once.

I find it really interesting, based on those charts, that the 3870 setups appear to completely oust the 8800 GTs at a "gaming" resolution, while the 8800s seem to do better at a higher res.  Anyhow, again based on those charts, it looks like any actual difference in frame rates is negligible for the most part.  A 5-10 FPS difference between the cards is neck and neck, and hell, at those high FPS to begin with we wouldn't actually see the difference.

If ATI bring the new cards in around the $500 mark (as they've also stated before they no longer plan to develop anything over that price), I think they'll have a winner on their hands.


----------



## strick94u (Jan 10, 2008)

Ati Addictive said:


> I don't think you understand; the resolutions were much higher with that card than in your puny little benchmark! I don't really understand why you post here!
> So green actually sucks compared to this
> 
> PS: ATI beat the hell out of Nvidia with their 3850 GPU


 

Let's just say I hope the next generation of video cards will be able to outperform this generation's. 2 GPUs, twice the memory buffer; it is a huge card with nice numbers. I post on here because it's an open forum. I don't understand your anger at such a tiny comment about color; at least I am not so much of a fanboy that it would ever keep me from buying an ATI card. Can you say the same? I usually make that decision based on reports like this one, but I wait until the actual silicon is out and tested by trusted sites, with drivers released with the card, not built for running one or two benchmarks. That way I know for sure the card I am about to drop 10 percent of a paycheck on is worth buying. Now, when you make a blanket statement about ATI beating the hell out of Nvidia with their 3850 GPU, what does that mean: outperform, outsell, or did they get a 3850 and beat the hell out of Nvidia with it? Please elaborate.


----------



## Tatty_One (Jan 10, 2008)

So, bottom line: if you want a 2+ independent card setup, the HD3870 is the better choice; it scales better in multi-card setups, is cheaper, and the single-card performance gap between the 2 cards is pretty much eradicated. If you prefer just single-card setups, then the 8800GT is the faster card and the rest comes down to how much price difference there is between the 2.

It will be interesting to see, in that case, by how much the GX2 beats this dual card (if it beats it, but it should, as scalability will be less of an issue than it is in SLI) and what the price comparison looks like.


----------



## EastCoasthandle (Jan 10, 2008)

imperialreign said:


> @EastCoast - thanks for that post, man!  It's nice being able to actually see the comparative numbers side by side for once.
> 
> I find it really interesting, based on those charts, that the 3870 setups appear to completely oust the 8800 GTs at a "gaming" resolution, while the 8800s seem to do better at a higher res.  Anyhow, again based on those charts, it looks like any actual difference in frame rates is negligible for the most part.  A 5-10 FPS difference between the cards is neck and neck, and hell, at those high FPS to begin with we wouldn't actually see the difference.
> 
> If ATI bring the new cards in around the $500 mark (as they've also stated before they no longer plan to develop anything over that price), I think they'll have a winner on their hands.



Nice observation!  A lot of people (especially those who are hooked on faster frame rates) simply don't know that most games show no difference between 33 FPS and 38 FPS when that game requires a high frame rate in order to play as intended.  In this case the game gets no faster, performs no better, and is no smoother than at the lower frame rate (in my experience).  Another thing people forget is that each game requires a certain frame rate in order to be played properly.  If you don't achieve that frame rate consistently, it's pointless to argue about which card is faster.

If both achieve the frame rate needed, then the faster card doesn't change the immersion of the game.  For example, let's say WiC needs 50 FPS to play.  If one of the competing cards plays the game at 53 FPS and the other at 58 FPS, the faster one provides no greater advantage than at 53 FPS.  The immersion of how the game plays and how it reacts to you doesn't change.  All you did was pay more to justify why you run Fraps!

IMO it's gaming benchmarks' greatest illusion if you don't properly understand what the numbers mean.  In most cases, it means nothing if you don't know what's generally required for each game benchmarked. Is it safe to generalize (take a guess) that all games benchmarked require somewhere between 45-50 FPS in order to run properly?  It's a tough call...
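To put that FPS argument in concrete terms, here's a quick frame-time sketch; the 53/58 FPS numbers are just the example figures from the post above:

```python
# Frame time is the inverse of FPS; a 5 FPS gap at around 55 FPS is only a
# couple of milliseconds per frame, which is why the two cards feel the same.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering one frame at a given frame rate."""
    return 1000.0 / fps

gap = frame_time_ms(53) - frame_time_ms(58)
print(f"53 FPS = {frame_time_ms(53):.1f} ms/frame, "
      f"58 FPS = {frame_time_ms(58):.1f} ms/frame, "
      f"gap = {gap:.1f} ms")
# 53 FPS is ~18.9 ms/frame, 58 FPS is ~17.2 ms/frame: a ~1.6 ms difference.
```

The same arithmetic is why a 33 vs 38 FPS gap is far more noticeable than 53 vs 58: frame time shrinks hyperbolically, not linearly, with FPS.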


----------



## Ati Addictive (Jan 11, 2008)

Yeah, duh, of course, but it is still cool to have a great card in your computer; video cards like these are simply for benchmarking and smooth game rendering, like Crysis.


----------



## Wile E (Jan 11, 2008)

Hmmm, I hope Palit will send me a couple of these for testing. I'd like to see what they can do, and whether CrossFireX can be enabled with 2 of them on an X38 board.


----------



## imperialreign (Jan 11, 2008)

EastCoasthandle said:


> Nice observation!  A lot of people (especially those who are hooked on faster frame rates) simply don't know that most games show no difference between 33 FPS and 38 FPS when that game requires a high frame rate in order to play as intended.  In this case the game gets no faster, performs no better, and is no smoother than at the lower frame rate (in my experience).  Another thing people forget is that each game requires a certain frame rate in order to be played properly.  If you don't achieve that frame rate consistently, it's pointless to argue about which card is faster.
> 
> If both achieve the frame rate needed, then the faster card doesn't change the immersion of the game.  For example, let's say WiC needs 50 FPS to play.  If one of the competing cards plays the game at 53 FPS and the other at 58 FPS, the faster one provides no greater advantage than at 53 FPS.  The immersion of how the game plays and how it reacts to you doesn't change.  All you did was pay more to justify why you run Fraps!
> 
> IMO it's gaming benchmarks' greatest illusion if you don't properly understand what the numbers mean.  In most cases, it means nothing if you don't know what's generally required for each game benchmarked. Is it safe to generalize (take a guess) that all games benchmarked require somewhere between 45-50 FPS in order to run properly?  It's a tough call...




Sadly, though, the whole market at this point has been built around a numbers war.  There are too many consumers out there who will buy a product because it "averages" 5 FPS more than its competitor, based solely off 3rd-party testing of the products on differing 'high-end' systems... speaking of which, how often do you see everyone's 3rd-party testing producing the same numbers as the next guy's?  For the most part, you don't.  While card A fares exceptionally well on review site X, card B instead beats out card A in the same tests on review site Y.

Poorly informed consumers, IMO, are what keep the green vs red rivalry going at this point. Granted, there are a lot of loyalists to one camp or the other, but we don't go around bogarting threads until someone else flames it up...


----------



## Makaveli (Jan 11, 2008)

Couldn't have said it any better myself!

Agree with you 100%


----------



## EastCoasthandle (Jan 11, 2008)

imperialreign said:


> Sadly, though, the whole market at this point has been built around a numbers war.  There are too many consumers out there who will buy a product because it "averages" 5 FPS more than its competitor, based solely off 3rd-party testing of the products on differing 'high-end' systems... speaking of which, how often do you see everyone's 3rd-party testing producing the same numbers as the next guy's?  For the most part, you don't.  While card A fares exceptionally well on review site X, card B instead beats out card A in the same tests on review site Y.
> 
> Poorly informed consumers, IMO, are what keep the green vs red rivalry going at this point. Granted, there are a lot of loyalists to one camp or the other, but we don't go around bogarting threads until someone else flames it up...



Sad but true; in the end, some pay more and get the same gaming experience.  Some know it and are just loyalists, while others just don't know better.  That's why I consider some of these benchmark results an illusion: some just don't know that a higher number doesn't always mean better performance.  You have to know what's required for that game first in order for the benchmark to make sense.


----------



## Ravenas (Jan 11, 2008)

Tatty_One said:


> I forgot to ask.....does speculation suggest these will be cheaper than two HD3870's??



Eh, that would be great, but kinda hard to believe without seeing it.


----------



## TonyStark (Jan 11, 2008)

8800GTX SLI 3DMark06 score (@2560x1600) = ~9.9k

CPU used: Core 2 Extreme X6800 2.93GHz


----------



## TonyStark (Jan 11, 2008)

8800GTX vs HD3870 X2 (3DMark06 @ 2560x1600)


*8800GTX + Core2 Quad (Q6700 @ 3.5GHz):*
SM2.0: 3244
SM3.0: 2828

*3870 X2 + Core2 Duo (E6600 @ 2.4GHz):*
SM2.0: 4494  +38%
SM3.0: 4476 +58%


source: http://www.xtremesystems.org/forums/showpost.php?p=2690580&postcount=281
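The +38%/+58% figures follow directly from the raw scores; here is a quick sanity check (truncating to whole percent, which appears to be how the post rounded):

```python
# Recompute the percentage gains of the 3870 X2 over the 8800GTX
# from the SM2.0/SM3.0 scores quoted above.
scores = {
    "SM2.0": (3244, 4494),  # (8800GTX, 3870 X2)
    "SM3.0": (2828, 4476),
}

for test, (gtx, x2) in scores.items():
    gain = (x2 - gtx) * 100 // gtx  # integer percent, truncated
    print(f"{test}: +{gain}%")
# SM2.0: +38%, SM3.0: +58%
```

Worth keeping in mind when comparing: the GTX number came from a quad at 3.5 GHz while the X2 ran on a stock E6600, so the gap isn't down to the CPU.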


----------



## EastCoasthandle (Jan 11, 2008)

TonyStark said:


> 8800GTX vs HD3870 X2 (3DMark06 @ 2560x1600)
> 
> 
> *8800GTX + Core2 Quad (Q6700 @ 3.5GHz):*
> ...



Oh yeah, nearly forgot about that:


----------



## erocker (Jan 11, 2008)

It's starting to be leaked that the card is doing 17000+ in 3DMark06.  I'm assuming it's with a Q6600.


----------



## Tatty_One (Jan 11, 2008)

Not quite sure of the message at post 44?..... 2 different generations of cards, one three times the price of the other, etc. If the point is to say that 2 HD3870s are faster than two 8800GTXs in SLI/XFire, then fine, but go find a similar bench with the 8800GTS G92; that will be closer..... has anyone said they are not, and is that relevant to the thread topic? My point earlier was that 97% of the population don't buy 2 cards, they buy one; if the only way you can stack up some positives for a card is to emphasize what 3% of the population can get out of 2 cards, then IMO there is little point.  My other point, if I remember correctly, was that one 8800GTS G92 was close to 2 HD3870s' score, which again, if I remember correctly... was true. So again, what's the point? Unless you are saying it is better to go out and buy 2 cards costing more to get one card's worth of performance.

I agree with everything you have said about FPS, gaming experience, etc., along with the benefits of better dual-card scalability, but as I said, so few people in the world buy 2 cards at a time (including most people who run SLI/XFire, as the whole point of that really is to buy one and, when you can afford another, buy another) that the majority of people do not judge a card's performance by how it performs in XFire..... that was all.


----------



## Mussels (Jan 11, 2008)

Hell, even on this forum, a large bunch of techie nutcases, it's maybe 1 in 100 who have dual cards, and almost every one of them has two high-end cards in a very expensive system (better than mine, some by a huge amount).

Also, a very big reason why you see such big benchmarks with ATI? Because the best quad-core platforms only support CrossFire and not SLI...


----------



## xfire (Jan 11, 2008)

^^ You have Nvidia to thank for that, with the whole "you can't support SLI without our chips" thing.


----------



## Mussels (Jan 11, 2008)

Well, it's not like CrossFire works on every board either...

Both companies need to get over it.


----------



## xfire (Jan 11, 2008)

Yeah, like a super merger, Intel-AMD-ATI-Nvidia, where Intel & Nvidia make the stuff and AMD/ATI set the price.


----------



## Scrizz (Jan 11, 2008)

xfire said:


> Yeah, like a super merger, Intel-AMD-ATI-Nvidia, where Intel & Nvidia make the stuff and AMD/ATI set the price.



uhhh, no


----------



## TonyStark (Jan 11, 2008)

Tatty_One said:


> Not quite sure of the message at post 44?..... 2 different generations of cards, one three times the price of the other, etc. If the point is to say that 2 HD3870s are faster than two 8800GTXs in SLI/XFire, then fine, but go find a similar bench with the 8800GTS G92; that will be closer..... has anyone said they are not, and is that relevant to the thread topic? My point earlier was that 97% of the population don't buy 2 cards, they buy one; if the only way you can stack up some positives for a card is to emphasize what 3% of the population can get out of 2 cards, then IMO there is little point.  My other point, if I remember correctly, was that one 8800GTS G92 was close to 2 HD3870s' score, which again, if I remember correctly... was true. So again, what's the point? Unless you are saying it is better to go out and buy 2 cards costing more to get one card's worth of performance.
> 
> I agree with everything you have said about FPS, gaming experience, etc., along with the benefits of better dual-card scalability, but as I said, so few people in the world buy 2 cards at a time (including most people who run SLI/XFire, as the whole point of that really is to buy one and, when you can afford another, buy another) that the majority of people do not judge a card's performance by how it performs in XFire..... that was all.



People asked for the scores, so I found them and posted them. You are over-analyzing....


----------



## Ati Addictive (Jan 11, 2008)

I do not think that ATI and Nvidia should work together, because there wouldn't be much competition in benchmarking between ATI and Nvidia lovers, so keep it this way.


----------



## InnocentCriminal (Jan 11, 2008)

The OC Workbench link doesn't work for me.


----------



## Ati Addictive (Jan 11, 2008)

InnocentCriminal said:


> The OC Workbench link doesn't work for me.



Send me the link, I'll try too.


----------



## asb2106 (Jan 11, 2008)

Tatty_One said:


> Not quite sure of the message at post 44?..... 2 different generations of cards, one three times the price of the other, etc. If the point is to say that 2 HD3870s are faster than two 8800GTXs in SLI/XFire, then fine, but go find a similar bench with the 8800GTS G92; that will be closer..... has anyone said they are not, and is that relevant to the thread topic? My point earlier was that 97% of the population don't buy 2 cards, they buy one; if the only way you can stack up some positives for a card is to emphasize what 3% of the population can get out of 2 cards, then IMO there is little point.  My other point, if I remember correctly, was that one 8800GTS G92 was close to 2 HD3870s' score, which again, if I remember correctly... was true. So again, what's the point? Unless you are saying it is better to go out and buy 2 cards costing more to get one card's worth of performance.
> 
> I agree with everything you have said about FPS, gaming experience, etc., along with the benefits of better dual-card scalability, but as I said, so few people in the world buy 2 cards at a time (including most people who run SLI/XFire, as the whole point of that really is to buy one and, when you can afford another, buy another) that the majority of people do not judge a card's performance by how it performs in XFire..... that was all.



Only a small percentage of people even buy 1 video card, an even smaller number of people play games on their computer, and very few people run a 2560x1600 res. With that in mind, anyone who does all of the above would want 2 cards.


----------



## Tatty_One (Jan 11, 2008)

TonyStark said:


> People asked for the scores, so I found them and posted them. You are over-analyzing....



Ahhhh, right, sorry; I didn't see any requests, so I was kind of confused as to its relevance.


----------



## Tatty_One (Jan 11, 2008)

asb2106 said:


> Only a small percentage of people even buy 1 video card, an even smaller number of people play games on their computer, and very few people run a 2560x1600 res. With that in mind, anyone who does all of the above would want 2 cards.



There is some logic there, but seeing as several single cards can still perform better at high res than many mid-range dual-card setups........ one card can fit all.   Ohhhhhh, and you are not just talking about 3% there, more like 0.5%!


----------



## asb2106 (Jan 11, 2008)

Yah, I agree with you on two mid-range cards.  Any time I hear a debate between (for example) 1 3870 or 2 3850s, I think it's not a debate: always get the top end, so when the next gen comes out you can get another one at a lower price and extend the life of the cards a little, at a decent cost.  I used a buddy's 3870 to CrossFire 2, and I noticed a great improvement when I went to high res; at 14x9 and lower it wasn't that great, but when I got up to 1920x1200, Crysis went up from 18 FPS to 26, making it much more playable.

And that was before the new patch and hotfix.


----------



## Tatty_One (Jan 11, 2008)

imperialreign said:


> @EastCoast - thanks for that post, man!  It's nice being able to actually see the comparative numbers side by side for once.
> 
> I find it really interesting, based on those charts, that the 3870 setups appear to completely oust the 8800 GTs at a "gaming" resolution, while the 8800s seem to do better at a higher res.




May I add an end to your sentence, seeing as there are some useful numbers being presented here?

"I find it really interesting, based on those charts, that the 3870 setups appear to completely oust the 8800GTs at a "gaming resolution", provided that you don't turn on ANY AA at ANY resolution; otherwise it's almost completely the opposite."  Just wanted to add that from what I can see in those charts.

On a serious note........do you game without AA enabled?


----------



## Tatty_One (Jan 11, 2008)

asb2106 said:


> Yah, I agree with you on two mid-range cards.  Any time I hear a debate between (for example) 1 3870 or 2 3850s, I think it's not a debate: always get the top end, so when the next gen comes out you can get another one at a lower price and extend the life of the cards a little, at a decent cost.  I used a buddy's 3870 to CrossFire 2, and I noticed a great improvement when I went to high res; at 14x9 and lower it wasn't that great, but when I got up to 1920x1200, Crysis went up from 18 FPS to 26, making it much more playable.
> 
> And that was before the new patch and hotfix.




Yeah, TBH....... I admit that I am biased, not towards NVidia or ATI (I have owned my fair share of both), but I am biased against XFire/SLI; it's just a personal thing, though I see their use. Now, a dual card as posted in this thread may be more interesting to me....... it's all good!


----------



## InnocentCriminal (Jan 11, 2008)

Ati Addictive said:


> Send me the link, I'll try too.



It was the link in your founding post to the thread.


----------



## asb2106 (Jan 11, 2008)

Tatty_One said:


> Yeah, TBH....... I admit that I am biased, not towards NVidia or ATI (I have owned my fair share of both), but I am biased against XFire/SLI; it's just a personal thing, though I see their use. Now, a dual card as posted in this thread may be more interesting to me....... it's all good!



It is all good!!!  I wish the cards could perform well enough that I wouldn't need two, but then that would just make it all too easy, right!  And Nvidia would no longer make chipsets, because people would have no need to buy them! (Or they could just open up SLI on Intel; that would make me an Nvidia GPU user.)


----------



## erocker (Jan 11, 2008)

Ati Addictive said:


> I do not think that ATI and Nvidia should work together, because there wouldn´t be much competitions in Benchmarking between Ati-Nvida lovers so keep it this way



I think they SHOULD work together to get games working correctly on the software end.  I think this would also spur more competition on the hardware side.


----------



## erocker (Jan 11, 2008)

3870X2 3DMark06 default settings:  Click thumbnail.


----------



## DaMulta (Jan 11, 2008)

erocker said:


> 3870X2 3DMark06 default settings:  Click thumbnail.



two x2s


----------



## erocker (Jan 11, 2008)

DaMulta said:


> two x2s



Are you sure?


----------



## DaMulta (Jan 11, 2008)

No I'm not sure, but that score is pretty damn high with the Quad not OC'd.


----------



## yogurt_21 (Jan 11, 2008)

It is OC'd, it's at 3.3GHz, and the score makes sense for a 3.3GHz quad with dual 3870s, so it should be good for one 3870X2.


----------



## erocker (Jan 11, 2008)

yogurt_21 said:


> It is OC'd, it's at 3.3GHz, and the score makes sense for a 3.3GHz quad with dual 3870s, so it should be good for one 3870X2.



Well in that case...  ATi ownage time!!!


----------



## asb2106 (Jan 11, 2008)

You know, here's a good one.  

It's said that you can CrossFire a 3870 and a 3850; I wonder if you will be able to CrossFire a 3870X2 with a 3870?

Any thoughts, anyone?


----------



## Tatty_One (Jan 11, 2008)

asb2106 said:


> You know, here's a good one.
> 
> It's said that you can CrossFire a 3870 and a 3850; I wonder if you will be able to CrossFire a 3870X2 with a 3870?
> 
> Any thoughts, anyone?



You can, or will be able to. I think it will come in a forthcoming driver release; I read an article on it, but it is motherboard dependent I think. I am sure the X38 will do it; it might be only boards that support 16 x 16 PCI-E lanes. I will try and dig it out, I stumbled across it when I was doing some research into which board I should get a few weeks ago, when I was undecided between P35 and X38.


----------



## Tatty_One (Jan 11, 2008)

Yep, you will be able to. It's called "Hybrid Xfire" and comes as part of the new CrossfireX concept that ATI have developed; you can read a little about it here, the smaller text box at the bottom specifically mentions it........

http://www.techzonept.com/showthread.php?t=200748


----------



## asb2106 (Jan 11, 2008)

Awesome, well that will make my next upgrade much better value then!!


----------



## Dr. Spankenstein (Jan 11, 2008)

(Me likey the SM 2.0 and SM 3.0 scores!)


----------



## erocker (Jan 11, 2008)

I wonder if crossfireX uses all available memory?


----------



## imperialreign (Jan 11, 2008)

Tatty_One said:
			
		

> May I add an end to your sentence, seeing as there are some useful numbers being presented here?
> 
> "I find it really interesting, based on those charts, that the 3870 setups appear to completely oust the 8800GTs at a "gaming resolution" provided that you don't turn on ANY AA at ANY resolution, otherwise it's almost completely the opposite". Just wanted to add that from what I can see in those charts.
> 
> On a serious note........do you game without AA enabled?



Older games, like FEAR, Doom3, Thief, yeah, I'll run AA/AF maxed, but only because my GPU(s) pick up the slack in regards to performance, and because they're older titles I get really great performance.  My P4 chokes the 1950s so friggin hard, though, it ain't right.  Newer games, though, I don't even think about it. Crysis, for example, chokes down to 1 FPS with everything set on low if I turn on AA, even at 800x600.

As to the whole xFire/SLI bit, it's definitely a niche audience.  TBH, if I hadn't been able to secure my two 1950s for as cheap as I did, I never would've gotten hold of a second GPU (total spent between the two was about $235).  Granted though, most people that do go the dual-card route tend to go with the 1337 graphics cards, too, which makes the average dual setup appear even more expensive.

I think, though, if ATI is able to offer a card with two GPUs on one PCB, which gives you the median performance between xFire and a single card, and to offer it at a price that is cheaper than two cards, you've got the makings of a great product.

Even if nVidia was first to market with something like this, I give them acknowledgement for that.


----------



## trog100 (Jan 11, 2008)

erocker said:


> I wonder if crossfireX uses all available memory?



i doubt it.. card memory seems to be the catch.. to get a usable 512 it would need two gig total with four cards.. 

so to make real use of four cards it would need 4 gig.. 1 gig per card.. 

a bit of a memory waste.. still they might pull some magic off.. dunno..

four 256 cards being a waste of space.. 

trog
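trog's arithmetic here assumes classic CrossFire/SLI rendering, where every GPU keeps a full copy of the textures and frame data, so usable memory stays at one card's worth while installed memory scales with the card count. A minimal sketch of that assumption (the `pool` helper is just illustrative, not any real API):

```python
# Sketch of the mirrored-memory assumption behind trog's numbers:
# in classic CrossFire/SLI every GPU mirrors the working set, so
# usable memory = one card's memory, while installed memory adds up.
def pool(per_card_mb: int, cards: int) -> dict:
    return {
        "installed_mb": per_card_mb * cards,  # what you paid for
        "usable_mb": per_card_mb,             # mirrored, not additive
    }

print(pool(512, 4))   # four 512 MB cards: 2048 MB installed, 512 MB usable
print(pool(1024, 4))  # four 1 GB cards: 4096 MB installed, 1024 MB usable
```

That is the "memory waste" trog is pointing at: a four-card rig with 256 MB cards still only has 256 MB of effectively usable video memory under this scheme.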


----------



## Tatty_One (Jan 11, 2008)

imperialreign said:


> Older games, like FEAR, Doom3, Thief, yeah, I'll run AA/AF maxed, but only because my GPU(s) pick up the slack in regards to performance, and because they're older titles I get really great performance.  My P4 chokes the 1950s so friggin hard, though, it ain't right.  Newer games, though, I don't even think about it. Crysis, for example, chokes down to 1 FPS with everything set on low if I turn on AA, even at 800x600.
> 
> As to the whole xFire/SLI bit, it's definitely a niche audience.  TBH, if I hadn't been able to secure my two 1950s for as cheap as I did, I never would've gotten hold of a second GPU (total spent between the two was about $235).  Granted though, most people that do go the dual-card route tend to go with the 1337 graphics cards, too, which makes the average dual setup appear even more expensive.
> 
> ...



Yeah, TBH that would seriously interest me also.  The only reason I don't personally like Xfire/SLI (although, as I said, I recognise some benefits) is because I hate to think I am paying for 2 cards but only getting 1.4-1.6 cards' worth of performance!    If they are mid or mid/high cards then I just feel that investing in the top of the range at a similar price point is better value; if the 2 cards are top of the range....well wow!  They are, as we say in the UK, "the dog's gonads"


----------



## trog100 (Jan 12, 2008)

and they used to be at "the dogs gonads" prices tatty.. all thats changed is things have gotten cheaper.. there aint gonna be any more new super chip single cards for a while.. one or more of these x2 things will be "top end"..

if it follows simple logic one x2 effort will cost the same as yesterdays single card top end did.. 

trog


----------



## Tatty_One (Jan 12, 2008)

trog100 said:


> and they used to be at "the dogs gonads" prices tatty.. all thats changed is things have gotten cheaper.. there aint gonna be any more new super chip single cards for a while.. one or more of these x2 things will be "top end"..
> 
> if it follows simple logic one x2 effort will cost the same as yesterdays single card top end did..
> 
> trog



You may well be right there!.....Let's see when the 9800GTX is revealed, supposedly in about 7 weeks.


----------



## Wile E (Jan 12, 2008)

I don't mind this multi-GPU-on-a-single-card thing we have going here. CPUs followed the same path, and now we have powerhouse multi-core CPUs and the programs to use them. I feel GPUs should follow the same path and go multi-core.


----------



## erocker (Jan 12, 2008)

I'm thinking and hoping that R700 is going to be a beast.  I have no idea where Nvidia is going with their cards though.


----------



## trog100 (Jan 12, 2008)

Tatty_One said:


> You may well be right there!.....Let's see when the 9800GTX is revealed, supposedly in about 7 weeks.



interesting thought tatty.. if it follows the same path as the 9600 thing it wont go much better than the old one.. so if it appears.. where will it sit.. 

will it sit below the x2 and be upper mid range or will it be the super whooper card every nvidia fanboy is dreaming of.. he he he

somehow i feel the fanboys will be disappointed.. i could be wrong..

trog


----------



## imperialreign (Jan 12, 2008)

erocker said:


> I'm thinking and hoping that R700 is going to be a beast.  I have no idea where Nvidia is going with their cards though.



I'm looking forward to the rumored R700, also.  If any rumor from ATI has offered any real potential for reclaiming part of the market, it's this.

Considering AMD's background with dual-core CPUs, I'm sure they've lent a lot of R&D to a dual-core GPU.  nVidia will have their work cut out for them with this tech, as they don't design multi-core processors, nor have they . . . unless Intel comes to the rescue and decides to help them out with R&D.


----------



## EastCoasthandle (Jan 12, 2008)

*Remember!*

With 2 3870s in CF using a Q6600 you get 15800.










source

With the 3870X2 with a Q6600 you get 17414.





That's a substantial boost using 2 dies on one PCB vs 2 video cards bridged together, both using Windows Vista.  I suspect more will be gained as you go from 2 cards to a dual-die solution; the more efficient the design, the more performance is had.  Rumor has it that the RV670 was updated since its release, but there is no confirmation of that.

The CPU score is of interest to me.  Even though one is at 3.33 and the other at 3.0, it still appears that the X2 is slightly faster than CF, but there's no way to know until it's been released.
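To put a rough number on the gap between the two leaked scores quoted above (with the caveat, noted in the same post, that the two runs used differently clocked Q6600s, so some of the difference is CPU rather than card):

```python
# Rough size of the leaked 3DMark06 gap between the two setups.
# Caveat: the runs used Q6600s at different clocks (3.0 vs 3.33 GHz),
# so this overstates the X2's advantage over the CF pair.
cf_score = 15800   # two 3870s in CrossFire
x2_score = 17414   # one 3870X2

gain = (x2_score - cf_score) / cf_score
print(f"X2 over CF: {gain:.1%}")  # ~10.2%
```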


----------



## EastCoasthandle (Jan 12, 2008)

Post your thoughts?


----------



## DaMulta (Jan 12, 2008)

I wonder if they use a 512-bit instead of a 256-bit bus on the X2s.


----------



## xfire (Jan 12, 2008)

Now it would be awesome if AMD used the Mobility version of the 3850 for their IGPs. It would sell like hot cakes, and with some cooling it would OC well. (I know IGPs ain't for gaming, but 50-60% of users are on IGPs, AFAIK.)


----------



## Mussels (Jan 12, 2008)

Tatty_One said:


> On a serious note........do you game without AA enabled?



Most games, i do. I'd rather have max graphics with no AA than medium graphics with med/high AA.

The Witcher is an example: i can run 4xAA, but i get the odd lag with magic effects, and nothing detracts from a game experience more than lag.

edit: oh, and lag for me is anything below 60 FPS. i am quite the nut for responsiveness.


----------



## yogurt_21 (Jan 12, 2008)

EastCoasthandle said:


> With 2 3870s in CF using a Q6600 you get 15800.
> 
> 
> 
> ...



ummm.... you do realize the 3870X2 was benched with a Q6600 at 3.*3*GHz while the dual 3870s in CrossFire were benched at 3.*0*GHz, right?
On bottlenecked hardware you'd be surprised how much of a difference CPU performance makes.  I'll grant you the X2 is faster, but by how much we can't say until the playing field is level. lol
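A naive back-of-the-envelope check of this objection: 3DMark06 does not scale linearly with CPU clock, so the following is only an upper bound on what the faster clock could contribute, but even this crude scaling nearly covers the whole gap between the leaked scores:

```python
# Naive sanity check of the CPU-clock objection: scale the CrossFire
# score by the clock ratio and compare to the leaked X2 score.
# 3DMark06 is NOT linear in CPU clock, so this is an upper bound.
cf_score = 15800          # dual 3870 CrossFire @ 3.0 GHz Q6600
x2_score = 17414          # 3870X2 @ 3.3 GHz Q6600

scaled_cf = cf_score * (3.3 / 3.0)
print(f"clock-scaled CF estimate: {scaled_cf:.0f}")  # 17380, within ~0.2% of 17414
```

Which is why yogurt_21's "level playing field" point matters: at equal clocks the two setups could plausibly land almost on top of each other.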


----------



## EastCoasthandle (Jan 12, 2008)

yogurt_21 said:


> ummm.... you do realize the 3870X2 was benched with a Q6600 at 3.*3*GHz while the dual 3870s in CrossFire were benched at 3.*0*GHz, right?
> On bottlenecked hardware you'd be surprised how much of a difference CPU performance makes.  I'll grant you the X2 is faster, but by how much we can't say until the playing field is level. lol



Yes, I already know this, LOL.  If you read my post, I didn't include the clock rate of the CPU but concentrated on the scores.  I already made the correction (I thought I had done it earlier, before submitting).  Even with the different OCs (as there is nothing closer to compare so far), it's still worth noting.


----------



## Tatty_One (Jan 12, 2008)

trog100 said:


> interesting thought tatty.. if it follows the same path as the 9600 thing it wont go much better than the old one.. so if it appears.. where will it sit..
> 
> will it sit below the x2 and be upper mid range or will it be the super whooper card every nvidia fanboy is dreaming of.. he he he
> 
> ...



The 9800GTX is the one that is supposed to be at least twice as fast as the 8800GTX....I'll believe that when I see it, but it is supposed to sit at the very top of the pile, and if that is the case, expect 2nd mortgages I think.


----------



## Mussels (Jan 13, 2008)

Tatty_One said:


> The 9800GTX is the one that is supposed to be at least twice as fast as the 8800GTX....I'll believe that when I see it, but it is supposed to sit at the very top of the pile, and if that is the case, expect 2nd mortgages I think.



With Nvidia, the 8800GTX *WAS* twice as fast as the 7900GTX. While normally i'd never believe these claims, Nv has done it with the 8800, so i'll give them the benefit of the doubt on the 9800.


----------



## asb2106 (Jan 13, 2008)

Tatty_One said:


> The 9800GTX is the one that is supposed to be at least twice as fast as the 8800GTX....I'll believe that when I see it, but it is supposed to sit at the very top of the pile, and if that is the case, expect 2nd mortgages I think.




No kiddin!! The 8800GTX is really expensive; I can't even imagine the price of the 9800 series!  And I'm hoping this is the time that Nvidia will finally give SLI drivers to Intel chipsets!  I can only hope!


----------



## Mussels (Jan 13, 2008)

asb2106 said:


> No kiddin!! The 8800GTX is really expensive; I can't even imagine the price of the 9800 series!  And I'm hoping this is the time that Nvidia will finally give SLI drivers to Intel chipsets!  I can only hope!



You must be new at this 

8800GTX/Ultra = current top two cards, expensive.
The 9800GTX will come out, slightly more expensive than the 8800s, and then the 8800GTX/Ultra will stop being made.

After 3 or so months, the 9800 will cost the same as the 8800 did, because it's filling the same price niche.

And Nvidia won't give a 'driver' to Intel for SLI; it doesn't work that way. Nvidia has deemed it necessary to have an SLI chipset on the motherboard, so you have to buy an Nvidia mobo to run SLI.


----------



## asb2106 (Jan 13, 2008)

Mussels said:


> You must be new at this
> 
> 8800GTX/Ultra = current top two cards, expensive.
> The 9800GTX will come out, slightly more expensive than the 8800s, and then the 8800GTX/Ultra will stop being made.
> ...



hey, i must be, eh.  
I'm sure it will come out at a premium over the Ultras, and I would never spend that kind of money on a video card; you'd have to have too much money or be stupid.

And Intel chipsets can run SLI; they did it with the 6800s, but Nvidia locked the drivers down to not allow it.  Intel Skulltrail boards have SLI support, but only because they have an Nvidia chipset enabling the SLI.  I know it's simply a driver standpoint that won't allow SLI; I have used a pair of 7800GTXs on my P965 board with modded drivers.  

You're right, I must be new to this

EDIT** not to be rude, I'm sorry about that


----------



## Mussels (Jan 13, 2008)

well, you don't know about the pricing - why would it be a premium? EVERY generation ends up with the high end at a similar price. New replaces old, in the same price range. Happens every time.


You don't appreciate high-end hardware, and just called me stupid for buying a GTX.

Yes, you COULD run SLI with modded drivers on 965 chipsets - try reading my post where i said "Nvidia has deemed it necessary to have an SLI chipset on the motherboard".
Skulltrail has an Nvidia chipset on the board, so it has SLI support.


----------



## asb2106 (Jan 13, 2008)

Mussels said:


> well, you don't know about the pricing - why would it be a premium? EVERY generation ends up with the high end at a similar price. New replaces old, in the same price range. Happens every time.
> 
> 
> You don't appreciate high-end hardware, and just called me stupid for buying a GTX.
> ...



well, i'm not gonna argue, so i'm sorry i insulted you


----------



## Mussels (Jan 13, 2008)

fair enough, i respect that.


----------



## asb2106 (Jan 13, 2008)

Mussels said:


> fair enough, i respect that.



hey, unrelated question, how does your GT compare to your GTX?


----------



## xfire (Jan 13, 2008)

Anyone used the Omega drivers on their HD3850/70?
Those drivers used to bring out some extra performance from my old IGP, the X200.


----------



## Ati Addictive (Jan 13, 2008)

That's pretty interesting of course, but Omega drivers are only better if you just benchmark and don't game. The reason for this is light corruption in games like Crysis (I tested it).
Furthermore, everybody who has a version of Vista running should never use Omega; it really gets depressing, that's all I will say about it.


----------



## xfire (Jan 13, 2008)

I checked the speed difference in games. I never tried Crysis, so I can't say about that,
but on my X200, Omega did improve gaming performance.


----------



## Mussels (Jan 13, 2008)

xfire said:


> I checked the speed difference in games. I never tried Crysis, so I can't say about that,
> but on my X200, Omega did improve gaming performance.



Modded drivers can't really change that much, except turn things on that were disabled for a reason. Let's say at one time there was a bug they couldn't solve, so they decided to turn some setting off across the board to make things work, and forgot to turn it back on - that's where these driver mods came from, as people turned those settings back on before ATI/Nvidia did. The odd program had issues, but 99% of people didn't care.

Since then, these driver 'modders' compete really fiercely, so they turn anything and everything on and tweak things they don't understand - while, say, 50% of people get a speed boost, 30% get graphical problems/errors, and the other 20% get totally non-working drivers.

Most people just stay away from the modded ones, as few of them undergo any real quality testing these days.


----------



## mlutag (Jan 13, 2008)

Hey, I know you guys aren't talking about this, but I need help....what do you think I should buy: another 2600XT for CrossFire gaming, or an Nvidia 8800GT? My mobo is an MSI P35 Neo2, and one of the PCI-E slots runs at 4x speed, so will this affect performance? I don't have a lot of cash, so I can't buy two 8800GTs for SLI.....I think my board supports both SLI and CrossFire? My CPU is a Core 2 6550. I have 2GB of RAM; would an additional 1GB help me in Crysis?


----------



## Tatty_One (Jan 13, 2008)

mlutag said:


> Hey, I know you guys aren't talking about this, but I need help....what do you think I should buy: another 2600XT for CrossFire gaming, or an Nvidia 8800GT? My mobo is an MSI P35 Neo2, and one of the PCI-E slots runs at 4x speed, so will this affect performance? I don't have a lot of cash, so I can't buy two 8800GTs for SLI.....I think my board supports both SLI and CrossFire? My CPU is a Core 2 6550. I have 2GB of RAM; would an additional 1GB help me in Crysis?



Allegedly (I say that from what I have read, as opposed to personal experience)......16 x 4 will translate across the board to around a 15% performance hit in comparison to 16 x 16 on an ATI XFire setup.


----------



## Mussels (Jan 13, 2008)

Tatty_One said:


> Allegedly (I say that from what I have read, as opposed to personal experience)......16 x 4 will translate across the board to around a 15% performance hit in comparison to 16 x 16 on an ATI XFire setup.



the two ATI cards would run a little faster, but i prefer single cards (less heat/power used), so i'd suggest buying the 8800GT and selling the 2600XT to lessen the cost.


----------



## Tatty_One (Jan 13, 2008)

Mussels said:


> the two ATI cards will run a little faster, i prefer single cards (less heat/power used) so i'd suggest buying the 8800GT and selling the 2600XT to lessen the cost.



Same here, but I was answering his question about XFire differences between 16 x 4 and 16 x 16. I too would prefer the single-card solution, but if it was up against the two HD3870s I would go not for the 8800GT but the 8800GTS 512MB G92, still cheaper than the 2 3870s.


----------



## mlutag (Jan 13, 2008)

Thanks for your replies, guys....I'm just trying to build a system that will last me for some time and play my games at decent resolutions. My monitor has a native resolution of 1440x900, so I just want to be able to play Crysis at this resolution, on at least the High level.
So, because I cannot run SLI on my board, is it better to buy an ATI card so I can buy another in the future, even at 4x speed? I think ATI rock and have always been a fan all my gaming life, but of late I have to say I'm pretty disappointed with them.


----------



## Mussels (Jan 13, 2008)

i can't run crysis on full with my system. Crysis is too heavy on requirements, and runs like crap for everyone - people who say they run it maxed out are generally lying, mistaken, or at very low resolutions.

a single 8800GT/GTS will let you run a combination of medium and high with a good 60+ framerate.


----------



## snuif09 (Jan 13, 2008)




----------



## trog100 (Jan 13, 2008)

Mussels said:


> i cant run crysis on full with my system. Crysis is too heavy on requirements, and runs like crap for everyone - people who say they run it maxed out, are generally lying, mistaken, or at very low resolutions.
> 
> a single 8800GT/GTS will let you run a combination of medium and high with a good 60+ framerate.



nicely put.. he he he

trog


----------



## mlutag (Jan 13, 2008)




----------



## asb2106 (Jan 13, 2008)

Mussels said:


> i cant run crysis on full with my system. Crysis is too heavy on requirements, and runs like crap for everyone - people who say they run it maxed out, are generally lying, mistaken, or at very low resolutions.
> 
> a single 8800GT/GTS will let you run a combination of medium and high with a good 60+ framerate.



Not at good resolutions, at least!

I play at 1680x1050, most settings on high, a few on med, and it looks just fine to me!


----------



## Ati Addictive (Jan 13, 2008)

a few days ago i was at a buddy of mine's who got the new AMD Phenom 9600; after we installed all the drivers and applied a little OC to the card, it ran pretty well on High. sometimes there was a very little lag, but it was soon over.

But you have to remember that Crysis doesn't only render the immediate gaming area like most games, but also the environment (nature) all around; that takes a lot out of your PC, so if you want to play it better, go quad-core, it works!


----------



## vaperstylz (Jan 13, 2008)

Mussels said:


> i cant run crysis on full with my system. Crysis is too heavy on requirements, and runs like crap for everyone - people who say they run it maxed out, are generally lying, mistaken, or at very low resolutions.
> 
> a single 8800GT/GTS will let you run a combination of medium and high with a good 60+ framerate.



Are you talking about before or after the Crysis performance patch?


----------



## asb2106 (Jan 13, 2008)

vaperstylz said:


> Are you talking about before or after the Crysis performance patch?



are you talking about the 1.1 patch that was just released, or some other patch??


----------



## vaperstylz (Jan 13, 2008)

1.1 is the only Crysis patch out right now.


----------



## wolf (Jan 13, 2008)

i get very playable fps @ 1440x900, no AA: min 19, max 55, avg 28.

and that's using the XP hack for Very High; as to whether it's real DX10 or just imitation, it looks daaaaaammmn nice. imo it looks and plays better than my mate's does on his stock GTX on Vista with full DX10, who can complain.

oh, and that's v1.0; i'm getting the patch tonight. if i can crack 30fps avg and get rid of the semi-random slowdowns i'll be stoked.


----------



## Tatty_One (Jan 13, 2008)

I do not have the full game of Crysis.....just the demo, so I appreciate that particular level may or may not be the most graphically demanding; I'm not even sure if the demo is DX10? Not sure where it would sit in the greater scheme of the game, but it is outside with an awful lot of grass.....trees etc, and at 16xx x 10xx I can max everything and play smoothly. That's with my Quad at 3.8gig (although I don't think the game is CPU dependent, my graphics card IS at the speeds it's running) and my MSI 8800GTS 512MB at 825MHz core, 2100MHz shaders etc etc.


----------



## Solaris17 (Jan 13, 2008)

Mussels said:


> i cant run crysis on full with my system. Crysis is too heavy on requirements, and runs like crap for everyone - people who say they run it maxed out, are generally lying, mistaken, or at very low resolutions.
> 
> a single 8800GT/GTS will let you run a combination of medium and high with a good 60+ framerate.



not necessarily true, i run mine all on high, no AF or AA, at 1440x900, and it runs at like 25-30fps....it isn't 60, but it's more than the human eye can pick up, so it's playably smooth for me. am i lying?


----------



## Tatty_One (Jan 13, 2008)

Solaris17 said:


> not necessarily true, i run mine all on high, no AF or AA, at 1440x900, and it runs at like 25-30fps....it isn't 60, but it's more than the human eye can pick up, so it's playably smooth for me. am i lying?



No, you're not   (lying).......cause you have not maxed everything.....you said yourself, no AA.   I was including maxed AA in my comments about the demo.


----------



## Solaris17 (Jan 13, 2008)

o i see, ya, no AA or AF, lol, that would bring my system to its knees.


----------



## ShadowFold (Jan 13, 2008)

I play Crysis with everything on High and 2x AA at 1440x900, getting 18-34fps.


----------



## trog100 (Jan 13, 2008)

ShadowFold said:


> I play Crysis with all on High and 2x AA 1440x900 18-34fps



so.. u prefer laggy frame rates with maximum pretties to non-laggy frame rates with fewer pretties.. most would say your frame rates are way too low for good playability.. but its your choice..

trog


----------



## ShadowFold (Jan 13, 2008)

I've always used midrange cards till now. Unplayable for me is under 25fps.


----------



## Mussels (Jan 14, 2008)

asb2106 said:


> hey, unrelated question, how does your GT compare to your GTX?



because it's modded to 740 core and runs passively, in some ways it's better. it's above GTS speeds, and gets around 15k in 3DMark05 compared to the 18k of my GTX.

Can't compare 06, because the HDTV it's on can't run at 1280x1024.



			
				everyone elses stuff said:
			
		

> ...



Crysis barely uses dual cores, so a quad doesn't help at all. Overclocking to higher MHz is what will help this game.

people saying 'it runs great with an average of xx FPS' - i'm sorry, if your average is below 30, that's crap. Below 30 is laggy, and having laggy as your average is just not fun.
Some people may not see it; it's just like refresh rates - i can see flickering below 90Hz refresh, and when gaming i CAN feel a difference below 60 FPS.

I'm a high-speed gamer; a tiny spike of lag throws your aim/movements off, and you can lose health/die while you're in the middle of a lag spike.


P.S. - what performance patch? Crysis was updated for bugs, but there was no performance patch. And for those getting confused, Crysis has HIGH and VERY HIGH (DX10), and if you start talking maxed out, you'd better be talking DX10 Very High, because THAT is max graphics in Crysis (under Vista, there's no real way to set it to DX9).


----------



## Solaris17 (Jan 14, 2008)

trog100 said:


> so.. u prefer laggy frame rates with maximum pretties to non-laggy frame rates with fewer pretties.. most would say your frame rates are way too low for good playability.. but its your choice..
> 
> trog



well, when oblivion came out and i got my 1600XT, i put all settings on crazy high with my Athlon XP @ 2.3; it was sooo ridiculously slow, 5fps FTW, but i did it for screenshots. might as well have the pretties all the way up if you want a good background.


----------



## Dr. Spankenstein (Jan 14, 2008)

Not too hard to right-click and select "Play DX9", or add the -dx9 switch after the .exe in the game's shortcut, is it?!?
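For anyone hunting for where the -dx9 switch goes: it's appended to the end of the shortcut's Target line, after the quoted .exe path. The install path below is just a typical default, so substitute your own:

```
"C:\Program Files\Electronic Arts\Crytek\Crysis\Bin32\Crysis.exe" -dx9
```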


----------



## asb2106 (Jan 14, 2008)

*Help*

I need help that's quite unrelated to the topic: I'm looking for 3870 drivers for OS X Leopard.  I'm a PC guy all day, but all the buzz has piqued my interest and I wanted to test it.  I have it up (in fact I'm on it now) and I cannot give it a fair assessment without proper video drivers.  On my 2 monitors (24" & 19") I'm running 1024x768, and it's horrible; if anyone has anything, please let me know.

And I'm sorry about posting here!


----------



## ShadowFold (Jan 14, 2008)

asb2106 said:


> I need help that's quite unrelated to the topic: I'm looking for 3870 drivers for OS X Leopard.  I'm a PC guy all day, but all the buzz has piqued my interest and I wanted to test it.  I have it up (in fact I'm on it now) and I cannot give it a fair assessment without proper video drivers.  On my 2 monitors (24" & 19") I'm running 1024x768, and it's horrible; if anyone has anything, please let me know.
> 
> And I'm sorry about posting here!



lol, mac

anyways, I don't think ATi makes drivers for Macs because no one needs them. and how did you get it running without a Mac mobo?


----------



## asb2106 (Jan 14, 2008)

ShadowFold said:


> lol, mac
> 
> anyways, I don't think ATi makes drivers for Macs because no one needs them. and how did you get it running without a Mac mobo?



I'm actually using the computer in my sig; it's up and running: audio, chipset, USB, everything but video!  It's not really that fun, but i figure i need to use it for a little while to get used to it. I want to see what all the rave is about with the video editing software they have, and the photo stuff.  Haven't tried them yet, cause i'm working on the video drivers.

**It even recognized my RAIDs!


----------



## ShadowFold (Jan 14, 2008)

My grandma is a Mac fangrandma(?) who has OSX, and it's not that great.


----------



## asb2106 (Jan 14, 2008)

ShadowFold said:


> My grandma is a mac fangrandma(?) that has OSX and its not that great.



I have this jacka*s at work who tells me his Mac is super fast, and he's pissing me off, so I told him I'd figure out how to get it on my computer and we'll run some benches and I'll kick his ass.  I hate Macs, so this is more about making a point.


----------



## Mussels (Jan 14, 2008)

Dr. Spankenstein said:


> Not too hard to right-click and select "Play DX9" or add the -dx9 switch after the .exe in thg game's shortcut, is it?!?



right-click what exactly? the shortcut to the game?

I didn't know about the DX9 switch; that'll help.


----------



## asb2106 (Jan 14, 2008)

Mussels said:


> right click what exactly? the shortcut to the game?
> 
> I didnt know about the DX 9 switch, that'll help.



In the Vista Games folder you can just right-click the game exe and click Play as DX9.


----------



## Mussels (Jan 14, 2008)

oh, the Games Explorer?

1. i don't use that. me likey shortcuts!
2. i don't use it cause half my games don't show up in it. I don't know why, but it's broken; Crysis and CoH didn't appear in there OR give me normal shortcuts.


----------



## asb2106 (Jan 14, 2008)

Mussels said:


> oh the games explorer?
> 
> 1. i dont use that. me likey shortcuts!
> 2. i dont use it cause half my games dont show up in it. I dont know why but its broken, crysis and CoH didnt appear in there OR give me normal shortcuts.



I drag my shortcuts in there, and it automatically cleans 'em up!


----------



## trog100 (Jan 14, 2008)

my 1900xtx, which was the best card there was when oblivion first came out, had to be tweaked very carefully to make half a job of playing that game.. i can't play crysis now on dx9 high at a reasonable resolution.. 

as for dx10 very high.. well, forget it..


----------



## Ati Addictive (Jan 15, 2008)

Crysis is a very heavy game to play, everybody knows that! i mean, even with my system i have trouble keeping lag down, even with an OC on the processor and GPU


----------



## Monkeywoman (Jan 15, 2008)

i play the demo at 800x600 on high. i love it, what's the point of having a high res?....HAHAHAHHAHAHAAHHA


----------



## phanbuey (Jan 15, 2008)

Yeah, Oblivion used to be what Crysis is now... it would tear up all cards when outdoors with the goodies turned on (with maybe the exception of the 1900 and 1950 XTXs, but those were some ridiculous cards)... but in all fairness, the Crytek engine is absolute shit compared to Unreal Tournament's or even the HL2 engine; if you took a tweaked HL2 and made it DX10 with better textures, it would look almost the same and run 200% faster...


----------



## Mussels (Jan 15, 2008)

Crysis seems more of an engine than a game. So many features (several vehicles, etc.) aren't even in the game itself and only show up/can be used after messing with the editor.


----------



## a_ump (Jan 16, 2008)

Crysis is crazy; I haven't seen a system that can run it on Very High with 4xAA/16xAF at even 1680x1050, let alone 1920x1200. I just hope the system I ordered can run it at 1280x1024, Very High, 4xAA/16xAF, though I doubt it :-(


----------



## trog100 (Jan 16, 2008)

It's a game engine.. made for the future.. in truth we ain't supposed to be able to max it out now..

as for Oblivion.. I couldn't max that out even with my overclocked 1900XTX.. it had to be carefully tweaked, just like Crysis.. shadows were a big hit.. I ran it just for a test the other week.. I can just about max it out now at my 1680 x 1050 resolution.. there ain't a lot to spare though..

it still looks pretty now though..

trog


----------



## candle_86 (Jan 16, 2008)

yogurt_21 said:


> yeah, I've no problem with that, I'd much rather run single than mess with dual. the dual-chip card war will really come down to how well they OC: crossfire scales better, which would give the dual 3870 an initial lead, but since the G92-based GTSs overclock better, it might come out as the better value. trouble is, neither side has launched a successful dual card previously, so.... I don't really think in either game support/drivers or cost that these will be a better buy than just getting two separate cards, or better yet one single card and a good watercooler.



The 7950GX2 single-handedly took the cake from ATI; it was very successful in 2006.

Crysis is just eye candy really anyway; I played it and sold it. I'll stick to games with good SP from now on, not a FarCry redux.


----------



## ShadowFold (Jan 16, 2008)

candle_86 said:


> The 7950GX2 single-handedly took the cake from ATI; it was very successful in 2006.


----------



## erocker (Jan 16, 2008)

I haven't benched my system with Crysis, but ever since the patch I've been able to OC my card with good results in the game.  I currently run my card at 700/1100/1650 on XP with the DX10 mod and high physics, sound, and game effects.  It plays rather nicely.


----------



## erocker (Jan 16, 2008)

It WILL be launched in one week's time! http://www.tomshardware.com/2008/01/15/amd_moving_up_launch_of_dual_gpu_graphics_card/

I may have to get my new motherboard and quad core early?!


----------



## trog100 (Jan 16, 2008)

Not in the UK it won't..

trog


----------



## Ati Addictive (Jan 16, 2008)

Yay, great news! Saw it when I was at school and was soooo happy  now just to wait for confirmation from AMD. I just know this isn't a false post, I can feel it.


----------



## asb2106 (Jan 17, 2008)

I'm really hoping it comes out on the 23rd!!! That would be awesome! 

I would guess it will work, but do you think the GPU cooler mount will be the same as the 3870's (the four holes around the GPU)?  I have a couple of MCW60s around from my 1950 Pro Crossfire, and I've seen those fit on the 3870.  It would have to be confirmed, but I don't see why they would change that, right?


----------

