Tuesday, February 19th 2008

Confirmed: 9800GT Will Support 3-Way SLI

Well, really, the title says it all. Everyone buying a 9800GT will get the great and wonderful 3-way SLI technology, should they desire to run three graphics cards in one PC. While the 9800GT does run on the G92 core, what sets it apart to allow three-way SLI is the second SLI connector on each card, along with quite a few modifications to the G92 chip. The entire GeForce 9x00 series should be released by the middle of March.
Source: Nordic Hardware

69 Comments on Confirmed: 9800GT Will Support 3-Way SLI

#51
trog100
i could never make sense of the new 8800gt card tatty when it first popped up.. then went into non-existent availability.. it pretty much wrecked the entire nvidia price line up..

till the new ati 3870 arrived then it all made sense.. a small batch especially commissioned made by foxconn rushed out prematurely just to wreck the new ati card.. it worked..

i also put more store into the heat factor.. to me heat and power draw relative to performance is the key at the high end.. ati are setting the trend.. nvidia are reacting to what ati do.. producing the equivalent of the power guzzling 2900 card at half the power and heat i see as a big leap forward..

if multi gpus is what the future is about.. first has to come the low heat chip.. ati have it nvidia dont.. which is the only reason i think ati have nvidia worried..

the other reason is price.. amd/ati seem to have made the decision that they cant compete in the out and out power race so have decided to go for low prices.. they dont have a hope in hell with the cpu side.. but with the grafix card its different.. intel can chuck out whatever it needs to at whatever price it needs to.. i dont think nvidia can or want to..

but reading crystal balls is fun but not always right.. he he

trog

ps.. even if multi gpus aint the future heat is still the key.. before u can make a new super chip that dont need a power station to run it.. perhaps first u have to produce a current (less than super) chip that runs cooler.. ati have done this nvidia havent..
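the "heat and power draw relative to performance" argument above boils down to a perf-per-watt metric. A toy sketch (every FPS and wattage figure below is hypothetical, chosen only to illustrate the shape of the claim about a power-guzzling part versus a die-shrunk refresh, not measured data):

```python
# Toy perf-per-watt metric -- all numbers below are hypothetical,
# purely to illustrate the "performance relative to heat/power" point.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

# roughly "2900-class performance at about half the power draw"
old_card = perf_per_watt(50, 215)   # hypothetical power-hungry part
new_card = perf_per_watt(52, 105)   # hypothetical cooler refresh

print(new_card > old_card)  # near-identical speed, far better perf-per-watt
```

The point being that a card can win this metric decisively while barely moving in raw frame rate.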
Posted on Reply
#52
SK-1
phanbuey: True... but the top-end G80 core (the old one) beats the top-end G92 (the supposedly new one), which is a bit like Vista, you know?
I am confused by the comment. The 8800GTS 512MB (G92) pretty much beats up the GTX and even the Ultra in games. I know this because I have both.
Posted on Reply
#53
Tatty_Two
Gone Fishing
trog100: i could never make sense of the new 8800gt card tatty when it first popped up.. then went into non-existent availability.. it pretty much wrecked the entire nvidia price line up.. [...]
I agree with most of what you have said there, but two points if I may. In your previous post you mentioned heat/power and G92: firstly, it's not the chip in the 8800GT, it's the crappy cooler, and if you look around today in Feb 2008, almost half of all the 8800GT cards available now come with stock 3rd-party cooling. The GTS with its dual-slot cooler does not have heat issues; I run mine at 825MHz on the core with shaders exceeding 2000MHz, and with the fan spinning quietly at 60% it doesn't even get out of the sixties (°C).

Lastly, I agree that ATi seem to be going for the more cost-effective solution and not trying to compete "top end", but the problem here is.................it's not working. This policy is slowly bankrupting them; they are working at a loss, and yes, that's not just the CPU division. So my point is, they need to get back to the old ATi that even I love and start producing/developing high-end power-guzzling GPUs that cost a bomb but beat NVidia, AS WELL as the really good mid-range power/heat-efficient, pretty fast cards that many of us currently enjoy. Only then will they really snatch back some of those sales from the Green camp, because as I said, the masses do buy the mid/lower-end cards, but if there is no profit margin in them then the high end is the way to go also.
Posted on Reply
#54
Mussels
Freshwater Moderator
It is a good point newtekie made - compare G80 to G92. It was 'just' a die shrink, yet look at the heat/power gains. Small changes can make a big difference to the availability and value of a video card.
Posted on Reply
#55
Bluefox1115
Everyone just shut up, stop spreading rumors, and wait for the official release to judge. :rockout:
Posted on Reply
#56
candle_86
ChillyMyst: WRONG, that's like saying the 3800 is a totally new design, when it's just a tweaked and shrunk 2900 core.......

they aren't anything new, nvidia just die-shrank the chip and renamed it.....wow that's original........same crap nvidiots were giving amd over the 3800 line, because it's basically just a slightly reworked/slightly updated shrunk 2900.......

probably slap a larger cooler on it and use the same core as the 8800gt/gts with all its pipes unlocked/working; very likely they just up the vcore and use better cooling on the gpu, this has been nvidia's tactic in the past.

need i remind everybody of the 6800ultra/7800ultra/7900ultra/etc? they just push the chip to its limit or nearly so, and call it a new more powerful card, a common nvidia tactic. i'm sure the "tweaks" are just some more unlocked pipes........meh, i don't really care, i get tired of this every few months, them trying to sell me a newer "better" card with "go faster stripes"

screw them, my 8800gt is working fine, and when i replace it, i'll probably get whatever amd has to offer......i know nvidia isn't gonna fix the driver bugs i and BFG have reported.
Last I checked the G70 and G71 cores were new and supported features the NV4x did not, and believe it or not, the G92 supports things the G80 does not. Also, it's not the same core; the G92 is 754 million transistors, and you can't tell me that PCIe 2.0 logic uses that many, lol. There is likely more locked stuff, maybe say 28 ROPs and maybe say 160 shaders.
Posted on Reply
#57
candle_86
Tatty_One: I would have to at least slightly disagree. Firstly, how could ATi have NVidia worried? NVidia in the last 18 months have outsold ATi 4-fold. Secondly, they could have just left the G80 lineup alone and dropped prices (and still saved on what some would "perceive" as hasty last-minute development), and up until a few days ago when the X2 came out would STILL have at least topped the tables with the 8800GTX and Ultra; in fact, in some games the old 8800GTS 640MB is still a match for the HD3870. ATi's fantastic trump card was never going to be the 3870, even without G92......why? Because its performance in many things is too close to the 2900XT; it is its price/performance ratio that makes it a good card. The real trump is the 3850, which took the low/mid sector by storm, and in that price category NVidia could not compete, so perhaps they did release either the 8800GS or the 9600GT quicker to counteract that. But to say that it is NVidia reacting to ATi's movements all the time IMO is shallow; otherwise, as I said, would ATi not have waited to get a better card out than the 3870 to compete better with NVidia's offerings? Contrary to popular belief, the majority of consumers are not concerned that the 8800GT draws an extra 35W at load or whatever; they are concerned with two things only: which is fastest and what is the price.

Now I happen to think it's a crap time for buying a gfx card, because there is far too much hybrid messed-up stuff on offer with too many overlaps in performance. My point here is that we SHOULD NOT just look at performance increases as the way forward but instead at performance-to-price ratio improvements; this is why the HD3870 is so appealing over the 2900XT, and why the 8800GT/GTS is so appealing over the 8800GTS G80 640MB and 8800GTX G80.

If a next-gen card can give us equal performance in a sector at a reduced price, that has to be good; otherwise we should always expect to pay more for more, if you get my meaning. As I have always said, it's about choices.

I happen to think that ATi at the moment has gotten it right and NVidia's approach is wrong, that much I agree with you; far too much cr*p out there from NVidia, and it is putting even NVidia fanbois off. Why buy one of their cards today, when next week you might get one with the same performance for cheaper?

As for the top-end 9800 series.....you may well be right with that one. However, as I said, the 9800GTX is going to have a new core I believe, and that will probably be significantly faster than anything around ATM; also please remember that the GX2 was never going to be the top of the range, the 9800GTX was always going to fill that niche and therefore be faster, as I understand it.
I disagree; the 8800GTS 640 was around a year before the GT reared its head, and to be frank that's a good run for it. How is that "next week" BS working there? Also, the 8800GS and the 9600GT are price-competitive, and from tests between me and soliars they tie in 3DMark, and there is no idea how they do in the real world without official drivers and whatnot. It's likely, as I have said, that the 8800GS is a way to get rid of the G92 cores that didn't make the cut, and it likely won't be a big card, but I'm very happy to have one nonetheless. Sure, it's Feb 08, but it performs close to an older 8800GTS 640MB or a little faster, so what's the problem here really? Is the Geforce 8 too slow for today's games, in all honesty?
Posted on Reply
#58
ChillyMyst
candle, the 6 and 7 series were damn near identical; neither could do FP32 HDR and AA at the same time, the 7 cards were just a variation on a theme. Same with the G80 vs G92: they changed a few things, but didn't really do anything major to the base design; it's not a really new chip, just as the 3800/RV670 isn't a new chip vs the 2900/R600, it's an update to the core. Mind you, in both cases power use was lowered and that's a good thing, but neither really brought anything "new" to the table. (dx10.1 support isn't a big change to the 2900 core....)
Posted on Reply
#59
Mussels
Freshwater Moderator
the 'new chip' deal is what was so exciting anyway... it takes them 5-6 years to make an entirely new core, and the only ones from Nv that come to mind were GF FX (failure), GF 6800 series (competitive in speed, if not quality), and 8800 (G80: hot, but awesome and win)

all the others were rehashes of something (i dont know about GF1/2/3 so i left them out of it)
Posted on Reply
#60
candle_86
agreed there


Look at this if you will


1999 NV10 arrives with Geforce 256
2000 NV15 arrives with Geforce2
2001 NV20 arrives with Geforce3
2002 NV25 arrives with Geforce4
2002 NV30 arrives with GeforceFX
2004 NV40 arrives with Geforce6
2005 NV47/G70 arrives with Geforce 7
2006 G80 arrives with Geforce8
2007 G92 arrives with Geforce 8 G92
2008 has just arrived, and remember kids, the Geforce 100 should arrive soon.
Posted on Reply
#61
ChillyMyst
so they are due for another FX-series-class card? last time it was 4 good, one bad, so it's now been 4 good again; another bad one is due :P

oh and you forgot the tnt/tnt2 cards :P
Posted on Reply
#62
Mussels
Freshwater Moderator
ChillyMyst: so they are due for another FX-series-class card? last time it was 4 good, one bad, so it's now been 4 good again; another bad one is due :P

oh and you forgot the tnt/tnt2 cards :P
Riva 128 came before those. TnT was a 'riva TnT'
Posted on Reply
#63
ChillyMyst
i wasn't gonna go back that far, and actually you missed the first nvidia card, the NV1, which was made for sega and then the same chip was used for a PC video card! (didn't work out well, it was only good for console ports)
Posted on Reply
#64
candle_86
oh sorry

NV1 "Unused Saturn GPU"
NV2 1995
NV3 Riva 128 1996
NV4 Riva TNT 1997
NV5 Riva TNT2 1998
NV10 Geforce 256 1999
NV15 Geforce2 2000
NV20 Geforce3 2001
NV25 Geforce4 2002
NV30 GeforceFX 2002
NV40 Geforce6 2004
NV47/G70 Geforce7 2005
G80 Geforce8 2006
G92 Geforce8/9 2007
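the timeline above can be dropped into a few lines of code to show the release cadence it implies (years copied from the list as posted, counting only the entries that carry a year; purely illustrative):

```python
# NVIDIA core timeline as listed in the post above (years as posted)
timeline = [
    ("NV3 Riva 128", 1996), ("NV4 Riva TNT", 1997), ("NV5 Riva TNT2", 1998),
    ("NV10 Geforce 256", 1999), ("NV15 Geforce2", 2000), ("NV20 Geforce3", 2001),
    ("NV25 Geforce4", 2002), ("NV30 GeforceFX", 2002), ("NV40 Geforce6", 2004),
    ("NV47/G70 Geforce7", 2005), ("G80 Geforce8", 2006), ("G92 Geforce8/9", 2007),
]

# gap in years between each listed core and its successor
gaps = [b - a for (_, a), (_, b) in zip(timeline, timeline[1:])]
print(max(gaps))  # longest wait between listed cores: 2 years (NV30 -> NV40)
```

In other words, by this list the cadence is roughly yearly, with NV30 the only same-year double release and NV40 the only two-year gap.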
Posted on Reply
#65
candle_86
ChillyMyst: i wasn't gonna go back that far, and actually you missed the first nvidia card, the NV1, which was made for sega and then the same chip was used for a PC video card! (didn't work out well, it was only good for console ports)
no, the NV1 and NV2 were different; both shared the same rendering tech but different GPUs
Posted on Reply
#67
candle_86
well, for all intents and purposes they are different, i just got it backwards: the NV2 was the unused chip. But i promise they are different
Posted on Reply
#68
Tatty_Two
Gone Fishing
candle_86: I disagree; the 8800GTS 640 was around a year before the GT reared its head, and to be frank that's a good run for it. How is that "next week" BS working there? [...]
Firstly, the 1-week comment was on a different subject from the G80 640MB comment; they were not related. My point about the one-week thing is simple: 8800GT >>> 8800GTS, yes a performance improvement, but outside of synthetic benching a minimal one, and then 8800GS >>>>>>> 9600GT. My point in saying that I thought ATi had it right was that their nomenclature seems more straightforward and less confusing at the moment, is all; too much stuff coming out of the NVidia camp at almost the same time with similar performance levels at similar prices................or is that good? I am not suggesting it is good or bad, merely pointing out that ATi fanbois like to use this as a weakness........is choice a weakness?
Posted on Reply
#69
largon
Tatty_One: (...) is choice a weakness?
Only for the feebleminded.
:D :p
Posted on Reply