# Dual 9600 GTs or Single 8800 GTS (G92)



## commandercup (Jun 5, 2008)

The title says it all!

I currently own a single PNY 9600 GT... should I buy another one to supplement it for the arrival of my new 22" monitor? Or should I sell it/trade it for a single 8800 GTS which will then leave a better upgrade path in the future?


----------



## ktr (Jun 5, 2008)

9600GTs in SLI are pretty good. They have some of the best scaling thus far.


----------



## cdawall (Jun 5, 2008)

just grab another 9600GT and wait for GT200


----------



## farlex85 (Jun 5, 2008)

Since you're already almost there you might as well get the extra GT. Then yeah, sell the pair for a G200.


----------



## wolf2009 (Jun 5, 2008)

if u want to buy a 9600GT, take a look at this one.


----------



## commandercup (Jun 5, 2008)

meh, nice deal there wolf, but I'll probably buy used to save money

and farlex, isn't GT200 supposed to be a letdown? I read about that on here and in MaxPC


----------



## Solaris17 (Jun 5, 2008)

I have dual 9600s and I get close to, equal to, or above the performance and 3DMark score of an 8800 Ultra, and as ktr said they scale REALLY well


----------



## farlex85 (Jun 5, 2008)

commandercup said:


> meh, nice deal there wolf, but I'll probably buy used to save money
> 
> and farlex, isn't GT200 supposed to be a letdown? I read about that on here and in MaxPC



Why would it be a letdown? It's going to be expensive at the outset. I don't know, maybe it will be disappointing. Wait and see I guess.


----------



## commandercup (Jun 5, 2008)

well I heard that the GT200 will cost way too much because Nvidia can't manufacture it reliably... and because the current graphics technology being used by Nvidia and AMD has reached the point of diminishing returns (where you pay a lot more for minimal gains), rasterization or w/e

intel + larrabee?


----------



## largon (Jun 5, 2008)

Don't bother with SLi. 
Or _any multi-GPU_ solution for the matter. They're only good for benchmarking, not for gamers. 
AFR (alternate frame rendering) will kill your brain.


----------



## cdawall (Jun 5, 2008)

largon said:


> Don't bother with SLi.
> Or _any multi-GPU_ solution for the matter. They're only good for benchmarking, not for gamers.
> AFR (alternate frame rendering) will kill your brain.



umm wrong, it works fine on my stuff


----------



## theonetruewill (Jun 5, 2008)

largon said:


> Don't bother with SLi.
> Or _any multi-GPU_ solution for the matter. They're only good for benchmarking, not for gamers.
> AFR (alternate frame rendering) will kill your brain.



I would usually agree with you completely largon, however I thought this article was really good. It was part of a massive compilation of dual card configurations, and the overall analysis of SLI and Xfire in general was that it was bugged to hell and shoddy value. The 9600GTs, however, stood out as being pretty damn good. Just something to consider.
I have also added another card's results to convey the main article's criticisms.


----------



## largon (Jun 5, 2008)

*cdawall*,
Good for you. 
But then again, those boatloads of people that are disgusted by AFR microstuttering certainly agree with me. 

*theonetruewill*,
The problem is that the framerates gained from multi-GPU systems are kind of "bloated". 
The methods that allow SLi/CF to work efficiently enough to be worth it cause the frames to become unsynchronized. Sure, the frame rates do often increase considerably, but the actual user experience is clearly degraded.
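largon's "unsynchronized frames" point can be sketched with made-up numbers: both traces below report the same average FPS, but the AFR-style trace alternates short and long frame gaps, which is exactly what shows up as microstutter. (A toy illustration in plain Python, not a real measurement.)

```python
# Two frame-time traces (ms per frame). Both average 50 FPS over 0.2 s,
# but the AFR-style trace alternates short/long gaps and feels stuttery.
single_gpu_ms = [20.0, 20.0, 20.0, 20.0, 20.0, 20.0, 20.0, 20.0, 20.0, 20.0]
afr_ms        = [ 8.0, 32.0,  8.0, 32.0,  8.0, 32.0,  8.0, 32.0,  8.0, 32.0]

def avg_fps(frame_times_ms):
    """Average FPS over the trace: frames rendered / total wall time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def pacing_spread(frame_times_ms):
    """Worst-case difference between consecutive frame times (ms).
    A crude microstutter metric: 0 means perfectly paced frames."""
    return max(abs(a - b) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

print(avg_fps(single_gpu_ms), pacing_spread(single_gpu_ms))  # 50.0 FPS, 0.0 ms
print(avg_fps(afr_ms), pacing_spread(afr_ms))                # 50.0 FPS, 24.0 ms
```

Same headline FPS, very different pacing: that gap between adjacent frame times is the "degraded user experience" the number alone doesn't show.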


----------



## commandercup (Jun 5, 2008)

lol that's actually quite interesting!

nice that they tested the SLI config on 1680 x 1050 which is what I'll be using

I guess you've sold me

9600 GT it is!

edit:

what's AFR? I think I've heard how SLI/Xfire is handled... although it does sound stupid, apparently it works with the 9600 GT


----------



## ktr (Jun 5, 2008)

here is what it is: http://en.wikipedia.org/wiki/Alternate_Frame_Rendering


----------



## largon (Jun 5, 2008)

Basically SLi/CF will get you higher numerical FPS but visually the stream of images is less pleasant than similar - or even lower - framerates rendered with a single GPU.


----------



## DaMulta (Jun 6, 2008)

largon said:


> Don't bother with SLi.
> Or _any multi-GPU_ solution for the matter. They're only good for benchmarking, not for gamers.
> AFR (alternate frame rendering) will kill your brain.



YOU LIE, or are misinformed hehe

Need for Speed: ProStreet.

On my system STOCK:

QX9650
DDR3 1333
8800GT
it can run choppy at a high res.

But if I install another 8800GT for SLI those issues go away.


----------



## wolf (Jun 6, 2008)

largon said:


> Basically SLi/CF will get you higher numerical FPS but visually the stream of images is less pleasant than similar - or even lower - framerates rendered with a single GPU.



+1, FRAPS will tell you your FPS is higher, and your 3DMark score will go up, however my experience of SLi/CF is as largon states.

my user experience with these technologies was not nearly as good as any experience I've had just banging in 1 new GPU and getting more FPS.

there's a boatload more frame stutter, frame sync issues, and generally an unreliable playing experience.

also don't be fooled by the numbers benchies and FRAPS give you. the average and minimum FPS are what matter the most, and my experience of multi-GPU shows that those figures go mostly unchanged, perhaps even lower, at the gain of a high max FPS, which is, for all intents and purposes, superfluous. 

-Wolf
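Wolf's min/avg/max point can be shown with invented numbers: a trace with a few big peaks wins on average and maximum FPS while the minimum, the part you actually feel, doesn't move at all. (Hypothetical figures, just to illustrate the statistic.)

```python
# Made-up per-second FPS samples for the same scene.
single_gpu_fps = [45, 50, 48, 42, 47, 49, 44, 46]
sli_fps        = [45, 95, 48, 42, 90, 49, 44, 88]   # big peaks, same lows

def summarize(fps):
    """The three numbers benchmarks usually report."""
    return {"min": min(fps), "avg": sum(fps) / len(fps), "max": max(fps)}

print(summarize(single_gpu_fps))
print(summarize(sli_fps))
```

The second trace's average and maximum look much better, but its minimum is identical: judged by the figure that matters most in play, nothing improved.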


----------



## niko084 (Jun 6, 2008)

I would hold out, but if you want the kick get the second 9600GT, or sell that 9600GT and buy a 9800gx2 or something lol..


----------



## wolf (Jun 6, 2008)

nah, the 9800GTX is the best card on the market today. dual GPU configs just don't count.

and it's great value too. still, in any case, wait. get the GTX260 if you don't want to spend that much, it should still quite easily beat an 8800GTX/Ultra


----------



## niko084 (Jun 6, 2008)

wolf said:


> nah 9800GTX is the best card on the market today. dual gpu configs just don't count.
> 
> and its great value too. still in any case wait. get the GTX260 if you dont want to spend that much, it should still quite easily beat an 8800GTX/Ultra



Or see what the HD4850/4870 do, from the looks of it they will be smashing Nvidia's new cards, but only time will tell.


----------



## largon (Jun 6, 2008)

*DaMulta*,
I'm not saying SLi/CF wouldn't eliminate low framerates, it's a given it does just that. But there's a nasty drawback in having 2 separate GPUs drawing _alternate frames_. I guess the problem could be countered by syncing the frames via drivers, or by using both GPUs to co-op render the same frame (SFR), or by tiling - the 2 latter aren't popular as they severely decrease the gain from using 2 GPUs. Dunno why the driver sync hasn't been done, maybe it causes major performance issues?
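The two work-splitting schemes largon contrasts can be sketched in a few lines (my own toy illustration, not driver code): AFR hands whole frames to alternating GPUs, while SFR has both GPUs co-render every frame, which keeps pacing even but adds per-frame splitting overhead.

```python
def afr_assignment(num_frames):
    """AFR: whole frames alternate between the 2 GPUs.
    Each GPU finishes on its own schedule, so frames can drift out of sync."""
    return {frame: frame % 2 for frame in range(num_frames)}   # frame -> GPU id

def sfr_assignment(num_frames):
    """SFR: both GPUs co-render each frame (e.g. top/bottom halves).
    Pacing stays even, but splitting and merging every frame costs speedup."""
    return {frame: (0, 1) for frame in range(num_frames)}      # frame -> both GPUs

print(afr_assignment(4))   # {0: 0, 1: 1, 2: 0, 3: 1}
print(sfr_assignment(2))   # {0: (0, 1), 1: (0, 1)}
```

The trade-off in the post falls straight out of the mapping: AFR's independent per-GPU frame queues are what scale well and what desynchronize; SFR's shared frames are what stay smooth and what scale worse.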


----------



## intel igent (Jun 6, 2008)

single card FTW!


----------



## wolf (Jun 6, 2008)

niko084 said:


> Or see what the HD4850/4870 do, from the looks of it they will be smashing Nvidia's new cards, but only time will tell.



do you honestly think that an R600 with 50% more shaders, texture units and some extra memory bandwidth will really be able to take down an Nvidian beast with 240 SPs, 32 ROPs and 1024MB of 512-bit memory?

even with this newfound power, R700 still won't have as much texturing ability as the G92 8800, let alone GTX2xx, and will still be severely lacking in pixel fillrate compared to GTX2xx

you're right in saying that only time will tell, however from what we've seen of R600, G80 and G92, my bet is on Nvidia staying on top.

and please nobody jump up and say how the 4870X2 will chop it, that's a really stupid argument. if you want to use a dual GPU config to compare, then so do I, and I really doubt that a 4870X2 will beat GTX280 SLi. ANYONE who thinks otherwise, please indulge me.

also I've included a very relevant quote on the dual ATI GPU vs single/dual NV GPU argument:



> The G92 GPU's sheer potency creates a problem for Nvidia, though, when it becomes the building block for three- and four-way multi-GPU solutions. We saw iffy scaling with these configs in much of our testing, but I don't really blame Nvidia or its technology. The truth is that today's games, displays, and CPUs aren't yet ready to take advantage of the GPU power they're offering in these ultra-exclusive high-end configurations. For the most part, we tested with quality settings about as good as they get. (I suppose we could have tested with 16X CSAA enabled or the like, but we know from experience that brings a fairly modest increase in visual fidelity along with a modest performance penalty.) In nearly every case, dual G92s proved to be more than adequate at 2560x1600. We didn't have this same problem when we tested CrossFire X. AMD's work on performance optimizations deserves to be lauded, but one of the reasons CrossFire X scales relatively well is that *the RV670 GPU is a slower building block*. *Two G92 GPUs consistently perform as well as three or four RV670s*, and they therefore run into a whole different set of scaling problems as the GPU count rises.



see, I suspect this will be much the same with the next gen. ATi may continue to scale better, but that will only be because Nvidia's cards are considerably better to start with.

my 2 cents.
  -Wolf


----------



## farlex85 (Jun 6, 2008)

I actually think the new ones may scale better. The G94 in the 9600gt scales at roughly 90%, far better than any Nvidia card that came before. And I agree w/ you wolf, from where I'm sitting nothing new is coming to the table w/ the next round of video cards, just improvements, and Nvidia seems like it will remain king of the hill, although ATI can offer some solid price/performance.
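"Scales at roughly 90%" has a precise meaning: the second card contributes 90% of a single card's performance, not 100%. A quick sketch with hypothetical FPS numbers (not benchmark results):

```python
def scaling_efficiency(single_fps, dual_fps):
    """Fraction of a second card's theoretical contribution that is realized.
    1.0 would be perfect (doubled) scaling; 0.0 means no gain at all."""
    return (dual_fps - single_fps) / single_fps

# Hypothetical example: one card at 50 FPS, the pair at 95 FPS.
print(scaling_efficiency(50.0, 95.0))  # 0.9 -> 90% scaling
```

So a game running at 50 FPS on one 9600GT would land around 95 FPS on two at 90% scaling, versus the 100 FPS that perfect scaling would promise.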


----------



## wolf (Jun 6, 2008)

+1 farlex, it's all just adding more of what they both know works.

but actually I have had 2 Nvidia cards scale 95% in FEAR (not 9600GTs). 10 fake dollars to the first to guess what cards they are


----------



## farlex85 (Jun 6, 2008)

8600gt maybe?


----------



## v-zero (Jun 6, 2008)

wolf said:


> and please nobody jump up and say how the 4870X2 will chop it, thats a really stupid argument, if you want to use a dual gpu config to compare, then so do i, and i really doubt that a 4870x2 will beat GTX280 SLi, ANYONE who thinks otherwise, please indulge me.



Well, since they are coming in at virtually the same price point, they are in direct competition - so I say the 4870X2 will beat the GTX280.
Consider this: it's not as if you complain that quad-cores aren't in competition with dual-cores because they have more cores and so it isn't a fair comparison... The method of Crossfire on the new cards is more advanced than anything before.
Lastly, texture fillrate is far from as important as shader ability in modern games, and that is where ATi wins out - transistor for transistor they will have a faster card.


----------



## mrw1986 (Jun 6, 2008)

largon said:


> Don't bother with SLi.
> Or _any multi-GPU_ solution for the matter. They're only good for benchmarking, not for gamers.
> AFR (alternate frame rendering) will kill your brain.



Not true at all. I have tested SLI with my 8800GTs and CrossfireX with my 3870X2 + 3870, and gaming performance increased a lot. There was no visible difference in image quality or anything. If you can play the game at acceptable framerates with a dual card setup, who cares about the "problems" AFR has... problems which neither I nor a boatload of my friends with multi-card setups have EVER experienced.


----------



## wolf (Jun 6, 2008)

v-zero said:


> Well, since they are coming in at virtually the same price point, they are in direct competition - so I say the 4870X2 will beat the GTX280.
> Consider this: it's not as if you complain that quad-cores aren't in competition with dual-cores because they have more cores and so it isn't a fair comparison... The method of Crossfire on the new cards is more advanced than anything before.
> Lastly, texture fillrate is far from as important as shader ability in modern games, and that is where ATi wins out - transistor for transistor they will have a faster card.




sure enough it may cost the same, but the people out there who only settle for the best will no doubt choose Nvidia. price may be a relevant argument for the general populace, however for the niche market out there who want the best, a 4870X2 won't touch GTX280 SLi, and dual 4870X2s probably won't fare any better than dual 3870s.

not to mention you cannot guarantee dual GPU performance in 100% of games. whereas, for example, taking an 8800Ultra vs a 3870X2, sure enough there are situations where the X2 wins, however it is still very close anyway, AND I guarantee you there are thousands of games out there with no multi-GPU support where the X2 will flounder, leaving the Ultra with its awesome sheer grunt to cut through any pixels.

imo you should never buy a dual GPU solution on 1 card. the day either company can guarantee 100% load splitting between GPUs for a 100% guaranteed gain is the day you should buy that solution, and not before.

price is not an argument that should be used when we are trying to decide who is the gfx king. like I've said before, sure enough the X2 is ONE solution, but it is still 2 GPUs that do NOT scale perfectly.

to decide who is the king, either compare 1 GPU to 1 GPU, or 2 vs 2, etc. price comes into play later, when we can actually get our greasy mitts on either card.

and well done farlex, it was indeed dual 8600GTs


----------



## largon (Jun 6, 2008)

mrw1986 said:


> Not true at all.


You can't disqualify the experiences of others based on yours. Lots of people see microstuttering as a problem that renders multi-GPU worthless _in their opinion_.





> I have tested SLI with my 8800GT's and CrossfireX with my 3870X2 + 3870 and gaming performance increased a lot. There was no visible difference in image quality or anything.


Performance increase is not a matter of debate. It's a given. Why else would you use multi-GPU if not for FPS? 
Image quality? Nobody mentioned that. 





> If you can play the game at acceptable framerates with a dual card setup who cares about the "problems" AFR has...problems which I or a boatload of my friends with multi-card setups NEVER experience.


Good for you and your friends. 
But it is clear "AFR microstuttering" does exist. Your trouble-free experience doesn't prove the problem doesn't exist. For example, some people are more sensitive to refresh rates; clearly this is a similar matter.


----------



## niko084 (Jun 6, 2008)

wolf said:


> do you honestly think that an R600 with 50% more shaders, texture units and some extra memory bandwidth will really be able to take down an Nvidian beast with 240 SPs, 32 ROPs and 1024MB of 512-bit memory?
> 
> even with this newfound power, R700 still won't have as much texturing ability as the G92 8800, let alone GTX2xx, and will still be severely lacking in pixel fillrate compared to GTX2xx
> 
> ...



Considering the HD4850 on pre-release drivers is beating the 3870X2 by a decent amount already... then add the fact that it's only a $200-range card. Lots of people are saying Nvidia's new release is not going to stand up, lots of Nvidia fanboys even.


----------



## mrw1986 (Jun 6, 2008)

largon said:


> You can't disqualify the experiences of others based on yours. Lots of people see microstuttering as a problem that renders multi-GPU worthless _in their opinion_. Performance increase is not a matter of debate. It's a given. Why else would you use multi-GPU if not for FPS?
> Image quality? Nobody mentioned that. Good for you and your friends.
> But it is clear "AFR microstuttering" does exist. Your trouble-free experience doesn't prove the problem doesn't exist. For example, some people are more sensitive to refresh rates; clearly this is a similar matter.



You may not have said the FPS matter directly, but you hinted at it with "They're only good for benchmarking, not for gamers." It's great you want to state your opinion, but you also need to hear the facts. Fact of the matter is, a dual GPU setup will always be superior, even if there is an AFR problem, which I've never heard of or seen. I've never ONCE seen anyone complain of it.


----------



## Solaris17 (Jun 6, 2008)

I've never seen or had an AFR prob


----------



## largon (Jun 6, 2008)

That's funny cause it's quite widely known (at least on other sites) and both AMD and nVIDIA publicly admit its existence. 
For example, UT3 is unplayable with CF.


----------



## Solaris17 (Jun 6, 2008)

huh, that's odd, I don't get it with UT3 or any other game.. do you know if it's AFR itself or only a certain mode, largon?


----------



## DaMulta (Jun 6, 2008)

There are some games that do not like it. 


With CF I had to run some games at 16x AA and the normal 6x AA for others. With an SLi setup there are issues too. In some games you're forced to run AA whether you want it or not. 

Overall I like running more than one card. I can run more AA on older games, and it really does keep your frame rates up over low spots in a lot of games.


----------



## theonetruewill (Jun 6, 2008)

mrw1986 said:


> Fact of the matter is, a dual GPU setup will always be superior even if there is an AFR problem which I've never heard of or seen.


Sorry, are you saying dual cards are always better than one? If you really believe that, prepare to be crushed, my friend. That's just plain rubbish - do not generalize.


----------



## xu^ (Jun 7, 2008)

for me personally SLI/Xfire doesn't offer enough of a performance increase to justify the cost of another card.

in theory you'd think you'd get at least a 90% increase due to 2 cards being used, but this is not so. when they can guarantee 70%-80% in virtually all games then I'd consider it; until then I see it as a gimmick tbh.

back in the Voodoo 2 days it was the same problem and ppl lost interest, now they're flogging it to a whole new generation with almost the exact same problems lol.


----------



## mrw1986 (Jun 7, 2008)

theonetruewill said:


> Sorry, are you saying dual cards are always better than one? If you really believe that prepare to be crushed my friend. That's just plain rubbish- do not generalize.



How are you telling me 2 aren't better than 1? If you have say a 9600GT, adding a second would increase performance. As long as you are comparing apples to apples. Obviously 2 9600GTs aren't going to perform as well as say a 3870X2. Don't tell me it's rubbish, it's simple knowledge.


----------



## PaulieG (Jun 7, 2008)

mrw1986 said:


> How are you telling me 2 aren't better than 1? If you have say a 9600GT, adding a second would increase performance. As long as you are comparing apples to apples. Obviously 2 9600GTs aren't going to perform as well as say a 3870X2. Don't tell me it's rubbish, it's simple knowledge.



It's more about diminishing returns. As we all know, running 2 9600GTs is not going to double performance. I think 2 9600GTs are about equal to an overclocked 8800GTS 512. An 8800GTS can now be had for $200, and 2 9600s are around $275. So you're paying an extra $75 for the same performance. Not worth it.
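PaulieG's cost argument reduces to simple arithmetic, using the prices quoted above (2008 USD) and his assumption that the two setups perform about the same:

```python
# Prices as quoted in the thread; performance parity is PaulieG's assumption.
gts_512_price   = 200   # one 8800 GTS 512
dual_9600_price = 275   # two 9600 GTs

premium = dual_9600_price - gts_512_price
premium_pct = 100.0 * premium / gts_512_price

print(f"extra cost for roughly equal performance: ${premium} ({premium_pct:.1f}% more)")
```

A 37.5% price premium for no performance gain is the "not worth it" in numbers, though it ignores the resale value of the 9600 GT the poster already owns.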


----------



## commandercup (Jun 7, 2008)

well there is the fact that two 9600GTs = 1 gig of memory? thus it'll help out more with higher resolutions... and I already have a 9600GT...


----------



## theonetruewill (Jun 7, 2008)

mrw1986 said:


> How are you telling me 2 aren't better than 1? If you have say a 9600GT adding a second would increase performance. As long as you are comparing apples to apples. Obviously 2 9600GT's aren't going to perform as good as say a 3870X2. Don't tell me its rubbish, its simple knowledge.



In the 9600GT's case 2 seems to be better than one. But SLI and Crossfire in general are bugged quite a bit. I guarantee you there are plenty of games where dual graphics configurations cause SLOWDOWNS due to poor drivers and compatibility. OK, so let's play the evidence game, shall we? I'll dredge up my evidence and you try to disprove it. Exhibit A: the 8800GTS 320MB. Exhibit B: the HD 3850. As I said, do not generalize.


----------



## ntdouglas (Jun 7, 2008)

Two of them crank.


----------



## largon (Jun 7, 2008)

Solaris17 said:


> huh, that's odd, I don't get it with UT3 or any other game.. do you know if it's AFR itself or only a certain mode, largon?


I'm not sure, but it's possible SLi uses a different rendering method for UT3. AFR is the method that causes stuttering; maybe SLi uses split-frame rendering (SFR) for UT3. 





commandercup said:


> well there is the fact that two 9600 gt's = 1 gig memory? thus it'll help out more with higher resolutions... and I already have a 9600 GT...


No, SLi/CF _does not_ double the memory. Each GPU can only use its own local RAM.


----------



## paulo7 (Jun 7, 2008)

The main problem with SLI is the DRIVERS. This may seem obvious to most, but it is CRUCIAL at the moment: with the new releases, chances are Nvidia will move on to developing drivers specific to them and leave the older cards lacking in performance when doubled (8800 GTS 320 for example).

My advice would be to wait and go for the GTX 260 or AMD's offering, and have guaranteed performance that will last, albeit at a price! Which will no doubt beat any current SLI config..


----------



## theonetruewill (Jun 7, 2008)

largon said:


> No, SLi/CF _does not_ double the memory. Each GPU can only use its own local RAM.


Just in case anyone thinks he's wrong, he damn well isn't - this is absolutely correct, and to think otherwise is a common misconception.


----------

