Monday, February 25th 2008

NVIDIA GeForce 9800 GTX Card Specs and Pictures

VR-Zone has obtained some details and photos of the first single-GPU high-end GeForce 9 series card. The 65nm G92-based GeForce 9800 GTX (G92-P392) will come with a 12-layer PCB. It will be clocked at 673MHz for the core and 1683MHz for the shaders, while the memory clock speed is yet to be determined. The memory interface is 256-bit, with 512MB of 136-pin BGA GDDR3 memory onboard. The card will come with two DVI-I ports and one HDTV-out. As mentioned earlier, all GeForce 9800 GTX cards will have two SLI connectors (3-way SLI ready) and two 6-pin PCIe power connectors. The card will be cooled by the CoolerMaster TM67 cooler, whose fan is rated at 0.34A, 4.08W, 2900rpm, 34dBA. At 100% load the card will consume around 168W. The GeForce 9800 GTX is set to be released around April.
Source: VR-Zone

51 Comments on NVIDIA GeForce 9800 GTX Card Specs and Pictures

#1
devguy
So is this basically an overclocked 8800GTS (512mb) with dual sli connectors?
Posted on Reply
#2
niko084
devguy wrote:
"So is this basically an overclocked 8800GTS (512MB) with dual SLI connectors?"
That's about what it sounds like, same core so....
Posted on Reply
#3
Fitseries3
Eleet Hardware Junkie
it's nice they had to add 2-3" to the card to add the second SLI connector.
Posted on Reply
#4
broke
I find it interesting that they are using a memory interface of 256-bit with 512MB on these ultra high-end cards and still manage to get higher performance than "last gen" cards that shipped with a 512-bit memory interface and 768MB of VRAM. I mean, wouldn't this slimmer interface cause a bottleneck now more than ever? And if this is a cost-reduction measure, then that's an odd thing to do for a flagship card.

broke_s

edit: nvm, I didn't realize that they are using the same core as the 8800GT (which had the cost-reduction measures applied to remain competitive), so I guess we can just overclock an 8800GT and we'll know how these will perform?
Posted on Reply
#5
niko084
broke wrote:
"I find it interesting that they are using a memory interface of 256-bit with 512MB on these ultra high-end cards and still manage to get higher performance than 'last gen' cards that shipped with a 512-bit memory interface and 768MB of VRAM. I mean, wouldn't this slimmer interface cause a bottleneck now more than ever? And if this is a cost-reduction measure, then that's an odd thing to do for a flagship card."
They are actually losing some performance by sticking with 256-bit. It's not a lot, but it is noticeable at VERY large resolutions with high AA/AF.

The main reason for cutting back to 256-bit is that they found the price difference is really not worth the performance difference, and it costs a lot less to make a 256-bit card than a 512-bit card.
Posted on Reply
#6
broke
niko084 wrote:
"They are actually losing some performance by sticking with 256-bit. It's not a lot, but it is noticeable at VERY large resolutions with high AA/AF. The main reason for cutting back to 256-bit is that they found the price difference is really not worth the performance difference, and it costs a lot less to make a 256-bit card than a 512-bit card."
True, but couldn't they have realized that, what was it, 2 years ago, before the first 512-bit cards came out?
Posted on Reply
#7
niko084
broke wrote:
"True, but couldn't they have realized that, what was it, 2 years ago, before the first 512-bit cards came out?"
Well, they could have, but they are always being pushed to get cards out quickly, plus they have to build something fast enough to make people with the upper-level cards want to replace them. So taking an idea that is indeed faster and shipping it, versus completely re-working it, is the better option...

Now that they've learned their lesson, that won't be a mistake again.
Posted on Reply
#8
rodneyhchef
The old GTX/Ultras are 384bit IIRC, not 512. The old 8800GTS was 320bit as well.
Posted on Reply
#9
niko084
rodneyhchef wrote:
"The old GTX/Ultras are 384-bit IIRC, not 512. The old 8800GTS was 320-bit as well."
Yes, the 2900XT was 512-bit; 2900Pros came in both 256-bit and 512-bit versions.
Posted on Reply
#10
erocker
*
I think the 9800GTX is getting more ROPs and 12 more shaders than the G92 GTS. I think the GTX also uses a different core revision, which might explain the extra shaders and ROPs.
Posted on Reply
#11
snuif09
well i hope it's another revision.

256-bit also cuts down overclocking performance; why do you think they were always using 2900XTs
for WR 3DMark06 runs? i think it's time for the GPU business to make 512-bit a standard for high-end cards
Posted on Reply
#12
mab1376
the smart thing to do would be to wait until the 9800GTX comes out, then buy a new 8800GTS (G92) and an aftermarket cooler, or go with water cooling.
Posted on Reply
#13
calvary1980
revision to core 9 series has an update to purevideo engine. crisper hasselhoff and 2 more frames in crysis for msrp $399

- Christine
Posted on Reply
#14
mdm-adph
calvary1980 wrote:
"revision to core 9 series has an update to purevideo engine. crisper hasselhoff and 2 more frames in crysis for msrp $399

- Christine"
Aye, but crisper suave, cool, 80's Hasselhoff, or crisper drunken, overweight, hamburger-eating today's Hasselhoff?
Posted on Reply
#15
oli_ramsay
Is this thing longer than the 8800 gtx?
Posted on Reply
#16
erocker
*
calvary1980 wrote:
"revision to core 9 series has an update to purevideo engine. crisper hasselhoff"
I am all for THAT! :laugh:
Posted on Reply
#18
broke
[I.R.A]_FBi wrote:
"what a long turd of a card ..."
haha word!
Posted on Reply
#20
bcracer220
even tho this is bad news for most, i find it to be just the opposite, as i just purchased a g92 gts, and it will remain one of the top cards for another year =) im sure others with a g92 based card can relate
Posted on Reply
#21
farlex85
bcracer220 wrote:
"even tho this is bad news for most, i find it to be just the opposite, as i just purchased a g92 gts, and it will remain one of the top cards for another year =) im sure others with a g92 based card can relate"
I guess that's kinda true, although I was hoping to make a step-up to this card. However, it doesn't look like it's worth the trouble. :shadedshu
Posted on Reply
#22
Easy Rhino
Linux Advocate
man, nvidia and ati are just pooping out cards left and right. there are so many choices these days that it is hard to figure out which cards are legit and which are just marketing tools used to piss off the competition.
Posted on Reply
#23
trog100
times are changing.. folks used to want the card they had just bought to last a while.. now the majority seem to want the opposite.. a new toy every month.. i reckon the average "enthusiast" is getting younger.. he he

trog
Posted on Reply
#24
zOaib
seems like nvidia is out of fresh ideas for now, no new architecture, just revisions ......... a good time for ATi to deliver something like the X800 revolution .................. :nutkick:
Posted on Reply