Tuesday, February 26th 2008

NVIDIA GeForce 9800 GTX Scores 14K in 3DMark06

After taking some screenshots with a special version of our GPU-Z utility, the guys over at Expreview have taken their GeForce 9800 GTX sample and run it through Futuremark 3DMark06. Using an Intel Core 2 Extreme QX9650 @ 3 GHz, 2 GB of DDR2 memory, an ASUS Maximus Formula X38 motherboard, and a single GeForce 9800 GTX @ 675/1688/1100 MHz, the card scored 14,014 marks.
Source: Expreview

114 Comments on NVIDIA GeForce 9800 GTX Scores 14K in 3DMark06

#51
Psychoholic
WTF? My lowly 2900 Pro overclocked does 13,7xx

Maybe the GTX will have outstanding o/c abilities.
#52
newtekie1
Semi-Retired Folder
EastCoasthandle: This is funny: which is it for the G92, an 8 series or a 9 series? It was an 8 series when they released the 8800GT and 8800GTS, but now it's a 9 series when they release a 9600. Is the wool being pulled here or what? Regardless of what tweaks/revisions are used (I can't find any information showing how the G92 in a GTS differs from the G92 in a 9600).
You mean the 9800; the 9600 uses a G94 core. The tweaks aren't to the core, at least I don't think they are; I think they are more to the PCB, to allow tri-SLI and perhaps higher clocks.
JrRacinFan: I was thinking more along the lines of the HD3850 and 8800GS (sub-$150 cards).
The HD3850 isn't a sub-$150 card. They are sub-$170 cards at best right now; the cheapest one on newegg is $169.99+Shipping. At that price point, the 9600GT on nVidia's side also outperforms the HD3850 for only $10 more. Of course, if you really want a deal, the 8800GS is actually a sub-$150 card, going for $149.99+Free Shipping right now on newegg, and performance-wise the 8800GS is pretty close to equal to the HD3850. So I still don't see where you are going with this.
#53
Silverel
"Gale force winds are expected today throughout the southwest of the country as a massive sigh of relief is let out by AMD, as their rival nVidia has re-re-released yet another variant of the 8800GT that isn't really much better than the original...More news at 11!"
#54
jbunch07
Silverel: "Gale force winds are expected today throughout the southwest of the country as a massive sigh of relief is let out by AMD, as their rival nVidia has re-re-released yet another variant of the 8800GT that isn't really much better than the original...More news at 11!"
:roll:
#55
JrRacinFan
Served 5k and counting ...
newtekie1: The HD3850 isn't a sub-$150 card. They are sub-$170 cards at best right now; the cheapest one on newegg is $169.99+Shipping. At that price point, the 9600GT on nVidia's side also outperforms the HD3850 for only $10 more. Of course, if you really want a deal, the 8800GS is actually a sub-$150 card, going for $149.99+Free Shipping right now on newegg, and performance-wise the 8800GS is pretty close to equal to the HD3850. So I still don't see where you are going with this.
Meh, I am just saying I am surprised that they released a card that, in my eyes, doesn't perform. Oh, and ...www.newegg.com/Product/Product.aspx?Item=N82E16814161211
#56
Xolair
Oh no, Nvidia's starting to lose it...

... perhaps not, but that result still seems quite pathetic. Indeed, you could just get an 8800 GTS and juice it up to the same level. :shadedshu
#57
[I.R.A]_FBi
ATi, this is the moment you've been waiting for ....
#58
newtekie1
Semi-Retired Folder
JrRacinFan: Meh, I am just saying I am surprised that they released a card that, in my eyes, doesn't perform. Oh, and ...www.newegg.com/Product/Product.aspx?Item=N82E16814161211
Yes, technically $150, but it's also only 256MB and out of stock. The 256MB version doesn't even come close to competing with the 8800GS in performance. The 8800GS is overall about 7% faster than the HD3850 256MB, and is the same price. So I still don't see where you are coming from.

I think we can all agree the 9800GTX isn't exactly the card we were expecting. But saying that nVidia is beaten because of a single card is kind of a big leap, especially when that single card still outperforms the offerings of ATi.
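The value argument in this exchange can be sketched numerically. The prices and the ~7% performance edge come from the posts in this thread; normalizing performance to the HD3850, and treating the 256MB and 512MB HD3850s as performing roughly equally, are my own simplifying assumptions.

```python
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Performance per dollar, with performance normalized to the HD3850."""
    return relative_perf / price

# Prices quoted in this thread (newegg, Feb 2008); 8800GS ~7% faster.
cards = {
    "8800GS":       perf_per_dollar(1.07, 149.99),
    "HD3850 256MB": perf_per_dollar(1.00, 149.99),
    "HD3850 512MB": perf_per_dollar(1.00, 169.99),
}
best = max(cards, key=cards.get)
print(best)  # the 8800GS wins on value under these assumptions
```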
#60
trog100
Apart from green team supporters not having a new super toy to play with, I reckon it's all right. Green keeps winning the performance race, but not by too much; ATi keeps things cheap, which means green has to keep cheap. I can keep buying cheaper red, and green fans can keep buying cheaper green.

The high end is gonna be multiple GPUs and multiple cards: two or three x2 cards, red or green.

trog
#61
Tatty_Two
Gone Fishing
EastCoasthandle
Am I the only one that finds this a bit odd? Just look at the texture fillrate for the 9800GTX. How is it so high in comparison with the other two G92 cards? Same ROPs, similar clocks to the 8800GTS (albeit a bit higher memory), but how come three times the fillrate? Am I missing something here or what?
#62
JrRacinFan
Served 5k and counting ...
Look at the bus interface, Tatty. Do you think that would make the difference? **puzzled**
#63
Grings
That does look odd. However, the 9800's is the most realistic of the 3, given that my 'old nail' G80 manages 30 GTexels. Do GTs normally only get 9.6?
#64
Tatty_Two
Gone Fishing
JrRacinFan: Look at the bus interface, Tatty. Do you think that would make the difference? **puzzled**
No, I don't think so. I actually thought that was "odd" also. They are both PCI-E 2.0 cards with 16x enabled, so I cannot see that there is any possible factor there. I can find nothing that would give a texture fillrate that's more than three times the 8800GTS's: same ROPs, same amount of SPs, etc. As far as I was aware, SPs and ROPs coupled with shader engine speeds were the key to texture fillrate, and this does not make any sense to me, but I may well be missing something vital.

Don't get me wrong... I am not suggesting this is fake, I am just a little suspicious based on what I can see... until someone more knowledgeable than me in this field comes up with a sensible explanation of what I am missing, and then of course my mind will be at rest :confused:
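For reference, peak texture fillrate is just TMU count times core clock. A quick sketch shows all three G92-era cards should land in the same ballpark; the TMU counts and clocks below are the commonly cited specs for these cards, assumed here rather than taken from this thread.

```python
# Theoretical texture fillrate = TMUs x core clock (GTexel/s).
def texture_fillrate_gtexels(tmus: int, core_mhz: int) -> float:
    """Peak texture fillrate in GTexel/s."""
    return tmus * core_mhz / 1000.0

# Commonly cited specs (assumed, not from this thread).
cards = {
    "9800 GTX":     (64, 675),
    "8800 GTS 512": (64, 650),
    "8800 GT":      (56, 600),
}
for name, (tmus, clock) in cards.items():
    print(f"{name}: {texture_fillrate_gtexels(tmus, clock):.1f} GTexel/s")
```

With all three cards in roughly the 33-43 GTexel/s range under these assumptions, a reported figure several times higher or lower for one of them suggests a readout quirk rather than a real hardware difference.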
#65
newtekie1
Semi-Retired Folder
The older versions of GPU-z read the Texture Fillrate wrong.

#66
btarunr
Editor & Senior Moderator
Tatty_One: Am I the only one that finds this a bit odd? Just look at the texture fillrate for the 9800GTX. How is it so high in comparison with the other two G92 cards? Same ROPs, similar clocks to the 8800GTS (albeit a bit higher memory), but how come three times the fillrate? Am I missing something here or what?
The 8800 GTS is using just two PCI-E 2.0 lanes.
#67
Wile E
Power User
Well, in reference to the OP, my 8800GT scores higher in both SM2 and 3 than this card. I expected more. I thought they had a new beast coming, but this is more a mild tweak.
#68
Wile E
Power User
btarunr: The 8800 GTS is using just two PCI-E 2.0 lanes.
No, the high texture fillrate is just a GPU-Z bug.
#69
happita
This is weird. Isn't the unspoken standard supposed to be at least a 15% improvement whenever they go into a new gen of cards? What's going on at NV?
#71
JRMBelgium
happita: This is weird. Isn't the unspoken standard supposed to be at least a 15% improvement whenever they go into a new gen of cards? What's going on at NV?
Only 15%? The 8800GTS 640MB was as fast as the dual-GPU 7950GX2, so I think Nvidia gained more than 15% from GeForce 7 to 8 :D
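The generational-uplift arithmetic behind this back-and-forth is simple relative change; the scores below are hypothetical placeholders, not benchmark results.

```python
def uplift_percent(old_score: float, new_score: float) -> float:
    """Relative improvement of new_score over old_score, in percent."""
    return (new_score - old_score) / old_score * 100.0

# Hypothetical scores: this is what a 15% generational gain looks like.
print(uplift_percent(10000, 11500))  # prints 15.0
```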
#72
crow1001
IMO, the 06 score and GPU-Z shot are fake; for a start, the 9800GTX GPU is on the G94, not the G92 as shown by GPU-Z. :wtf:
#73
JRMBelgium
crow1001: IMO, the 06 score and GPU-Z shot are fake; for a start, the 9800GTX GPU is on the G94, not the G92 as shown by GPU-Z. :wtf:
It has to be fake... no way Nvidia would ruin their reputation by releasing such a crappy card...
#74
brian.ca
Kreij: What's special about their version of GPU-Z? Just wondering.
From the same site (en.expreview.com/2008/02/26/9800gtx-gpu-z-screen-shot/), "The GPU-Z 0.1.7 didn’t work with 9800GTX. After W1zzard@TPU make some change on his GPU-Z, the tool can finally read all the data of the card. Thanks, W1zzard."