Wednesday, November 14th 2007

3-way SLI Tested on nForce 790i SLI Board

As NVIDIA prepares to launch 3-way SLI technology soon, it's time for the first "underground" scores to emerge. Initial results from a 3-way SLI configuration made up of a C73 reference board (nForce 790i SLI), three GeForce 8800 Ultra cards and a Core 2 Duo E6750 show that the performance improvement from 2-way to 3-way SLI in 3DMark06 at 1920x1440 with 8x AA (16x AA quality mode) and 16x AF is 26%. Without AA and AF, the boost is only 5.6%. Note that ForceWare 167.10 Vista beta drivers were used for 3-way SLI; NVIDIA is still working on final 3-way SLI drivers, which should improve performance further.
Source: VR-Zone
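For context, the quoted gains are simply ratios of the 3-way and 2-way 3DMark06 scores. The article does not list the raw scores, so the numbers in this minimal sketch are made up purely to illustrate the arithmetic:

# Hypothetical 3DMark06 scores, purely illustrative; the article only reports percentages.
def sli_gain(two_way: float, three_way: float) -> float:
    """Percentage improvement of 3-way SLI over 2-way SLI."""
    return (three_way / two_way - 1) * 100

print(f"AA/AF on:  {sli_gain(10000, 12600):.1f}%")   # ~26% scaling
print(f"AA/AF off: {sli_gain(14000, 14784):.1f}%")   # ~5.6% scaling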

23 Comments on 3-way SLI Tested on nForce 790i SLI Board

#1
newtekie1
Semi-Retired Folder
No surprise; regular SLI gives more of a performance boost with AA and AF and at higher resolutions, so it is logical that triple-SLI would be similar.
Posted on Reply
#2
patton45
And it's also no surprise that the gain is just another 30% or so. A super powerful card with two GPUs on it is where I would spend my money.
Posted on Reply
#3
Atnevon
Good to see an improvement. Shows you there is still room to grow. However, it's like sports cars: you start paying $20k more just for 0.1 seconds faster in your 0-60. Even SLI is heading down this line. Why pay $500 for two 8600s when one GTX will work better standing alone?

I guess benchmarking is a sport of its own. Buy, tune, and compete. But I guess I'll never understand it.

I'm happy with ungodly FPS in CS:Source, and that's what my GTX is for. Leave the tri-SLI for those with $1k to blow on the benchies.
Posted on Reply
#4
SK-1
Atnevon said: Good to see an improvement. Shows you there is still room to grow. However, it's like sports cars: you start paying $20k more just for 0.1 seconds faster in your 0-60. Even SLI is heading down this line. Why pay $500 for two 8600s when one GTX will work better standing alone?

I guess benchmarking is a sport of its own. Buy, tune, and compete. But I guess I'll never understand it.

I'm happy with ungodly FPS in CS:Source, and that's what my GTX is for. Leave the tri-SLI for those with $1k to blow on the benchies.
Great analogy with the cars and cards, so true these days.
Posted on Reply
#5
yogurt_21
Probably an extreme CPU bottleneck at default res; seems to be the norm these days. Even at 5.6 GHz on Penryn these monsters still seem to be bottlenecked. lol
Posted on Reply
#6
jydie
I really do enjoy reading about the top-of-the-line products and benchmarks... but I have no desire (or extra cash) to experience it personally. The power supply needed to run that setup would have to be very pricey... let alone buying three 8800 Ultras... plus a display that supports a huge resolution to take advantage of the power these three cards provide. Oh, and you will need the best CPU(s) available to keep the bottlenecks to a minimum.

I would have to agree with Atnevon; it does not make sense for the average PC gamer to pay so much for minor improvements... but I sure am glad people strive for the highest scores, because it is fun to see just how far modern technology can be pushed. Hmmm... using the car analogy, would that make 3-way SLI similar to putting three Hummer engines into one vehicle? :laugh: (That would be some SCARY power... but you would not be able to travel far from the gas station.)
Posted on Reply
#8
WarEagleAU
Bird of Prey
I didn't know NVIDIA was going with 790i as their new chipset name. With ATI releasing the RD790, that is going to confuse the hell out of a lot of consumers.
Posted on Reply
#9
Atnevon
jydie said: Hmmm... using the car analogy, would that make 3-way SLI similar to putting three Hummer engines into one vehicle? :laugh: (That would be some SCARY power... but you would not be able to travel far from the gas station.)
It's like comparing a Lambo to a Corvette: so damn near close in performance, but then you look at the price tag. Sure, you could go to a LAN party and say, "Check out what I got. I can score like 125,768 in 3DMark06," but we would think that person is just a cocky bitch for doing so.

Don't get me wrong about SLI. It's a way to get a bit more juice out of your existing card. For example, you get one, and ONLY one, 8800 GT; then in like a year and a half or so, when the price drops down to like $200 or lower, you can pop one in instead of getting the top-of-the-line _______ (insert new GPU name here). I appreciate the technology for being able to give users more control over their hardware capabilities. However, outside of that, it just seems too impractical. Hell, I only got an 8800GTX because insurance covered it. Otherwise, I would have gotten a 7800 or so.

Technology advancement = cool, but (practicality > budget) is how it really turns out.
Posted on Reply
#10
Atnevon
WarEagleAU said: I didn't know NVIDIA was going with 790i as their new chipset name. With ATI releasing the RD790, that is going to confuse the hell out of a lot of consumers.
They could call it the B.O.B Board. Why Bob? Because it's a cool name. You could say, "Ah man, those BoB boards are so sweet. BOBs rock!"
Posted on Reply
#11
Mussels
Freshwater Moderator
Tri-SLI should be for lower-end/mid-range cards: 3x 7600GT, 3x 8800GT, etc.
Posted on Reply
#12
[I.R.A]_FBi
Mussels said: Tri-SLI should be for lower-end/mid-range cards: 3x 7600GT, 3x 8800GT, etc.
No, they won't allow that.

They want you to come buy their shit.
Posted on Reply
#13
a_ump
I agree with Mussels, though I'm not sure why he said it should happen with LOW-END cards like the 8800GT; it's a high-end card currently, IMO. I'm also curious to see how well 3-way SLI does against 4-way CrossFire, if ATI can even get it working well enough that people will pay for four GPUs.
Posted on Reply
#14
Unregistered
More SLI/CF, 3-way SLI/XCF, Quad SLI/CF... I've had enough of this marketing nonsense. Rather than coming up with new graphics cards with significant performance boosts (remember those?), they try to sell you more of the same crap.

Can't even run Crysis at high resolution at a constant 30+ FPS (never mind 60 FPS) because your brand new £200 card sucks?

Solution: add another crap card, or two even!

The market for SLI/CF is extremely limited to <1% of gamers.

People want a real next-generation graphics card, not rehashes of the same 8800GTX/2900XT crap.
Posted on Edit | Reply
#15
a_ump
I agree with you, man. Come Dec 3rd there's supposed to be an 8800GTS 512MB/1GB from NVIDIA with 128 stream processors; it may be as good as or better than the Ultra. Either way, for next gen I think we have to wait till Q1 of 2008 :( Though you'd think, since it's Christmas, that they'd want to release the good GPUs before it rather than after, but some things never do make sense.
Posted on Reply
#16
Unregistered
Another thing: nVidia boards suck big time compared to Intel boards. The SLI feature is what sells their boards, but people will never use SLI; their motherboard chipsets are simply second-rate, performance- and stability-wise, compared to the Intel boards.

Best to wait and see what nVidia's 9800 and ATI's next-gen cards can do rather than buying into this marketing nonsense.

SeriouslyLameIdiots/CrapFramerates, or whatever they stand for; not much difference really.
Posted on Edit | Reply
#17
DrunkenMafia
lol....

I wonder how many of the above opinions would change if they, say, won a tri-SLI system... haahaaa.

Did I say it was shit??? Oh, I meant it's the shit... TRI SLI FTW... haahaa :p

I think multi-GPU is the way technology is going... they can only shrink the die size so much. Fair enough, it will shrink over time, but maybe they just can't do it fast enough (e.g. Crysis)... And with the GTX's, and even more so the 2900XT's, power requirements going through the roof, it can only make sense to run more processors in parallel...

This, I believe, is the same reason we have quad-core CPUs rather than a 15 GHz P4...

I for one loved having a CrossFire setup, just because; no reason really, just because I could look down in there and see two graphics cards... :)

What would you spend your $700 on: an Ultra or two GTs... a 2900XT or two 3870s...

just my opinion...
Posted on Reply
#18
hat
Enthusiast
Triple SLI is just crazy. Regular SLI is OK in my eyes if you use two midrange cards, but even that would be a dumb decision IMO. I'd rather take a new 8800GTS 512MB with 128 shaders than two 8600GTS cards.
Posted on Reply
#19
a_ump
I have mixed opinions about SLI and CrossFire. Yes, they're good for those that can't buy the best damn thing out there, and good for an upgrade if you already have a card, but instead of coming out with differently named cards with the same performance as everything else (8800GT) they should put out a new-gen card as well; the only new thing about the 8800GT is that its price-per-performance ratio is 10x better than the GTX's.

You say the best we can really do is shrink the die size; well, I think NVIDIA should try to bring out a card with, say, a 512-bit bus, 1GB or 512MB of memory, and maybe 192 stream processors. Just an idea.
Posted on Reply
#20
hat
Enthusiast
The 512-bit bus wasn't all that great. The 2900XT has it, and it still got pwned by the 8 series with its 320/384-bit buses. Now the 8800GT is 256-bit and it performs pretty close to the 8800GTX with its 384-bit bus.
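For reference, a rough back-of-the-envelope sketch of the raw memory bandwidth behind those bus widths; the effective memory clocks here are assumed from the commonly quoted reference specs of these cards, not figures from this thread:

# Peak memory bandwidth ~= bus width in bytes * effective memory clock.
# Clock figures are assumptions (typical reference specs), not from the article.
cards = {
    "8800GT  (256-bit)": (256, 1800),   # bus width in bits, effective MHz
    "8800GTX (384-bit)": (384, 1800),
    "2900XT  (512-bit)": (512, 1650),
}
for name, (bits, mhz) in cards.items():
    print(f"{name}: {bits / 8 * mhz / 1000:.1f} GB/s")   # ~57.6, 86.4, 105.6 GB/s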
Posted on Reply
#21
Unregistered
The reason nVidia (as well as Intel) can afford to rehash their existing product lines, from 8800GTX to 8800GT (C2D to Penryn for Intel), is that the competition (obviously AMD/ATI) is so darn far behind.

The likes of Intel are still milking their revamped C2D architecture in the form of the Penryn Wolfdale/Yorkfield chips literally only a few months before Nehalem is released, the new architecture with Intel's integrated memory controller!

nVidia is basically adopting the same strategy Intel used: milk the consumers for all their cash this Christmas with their 8800GTS/GTX revamps, then release their proper next-gen card almost immediately, a month or two after.
Posted on Edit | Reply
#22
a_ump
hat said: The 512-bit bus wasn't all that great. The 2900XT has it, and it still got pwned by the 8 series with its 320/384-bit buses. Now the 8800GT is 256-bit and it performs pretty close to the 8800GTX with its 384-bit bus.
But the problem with your statement is that it was ATI who incorporated the 512-bit bus, and their cards have been looking good on paper and doing the opposite of what's expected in benchmarks, so I don't think you can fairly say that NVIDIA couldn't make use of 512-bit. You used the comparison of the 256-bit 8800GT to the GTX: smaller bus, yet the 8800GT is as good or only slightly worse. I think if NVIDIA made a 512-bit card it would do what's expected, because they almost always live up to expectations these days.
Another example is the 8800GT versus the HD3870: they both have a 256-bit bus, yet the 8800GT does better.
Posted on Reply
#23
hat
Enthusiast
ATi video cards are like P4s. Yeah, they've got high clocks, but they still aren't as fast as the lower-clocked NVIDIA cards. I think they purposely do that to lead most of the herd towards the higher-clocked cards, because people think higher clocks mean better performance.
Posted on Reply