Tuesday, February 26th 2008

NVIDIA GeForce 9800 GTX Scores 14K in 3DMark06

After taking some screenshots with a special version of our GPU-Z utility, the guys over at Expreview decided to run their GeForce 9800 GTX sample through Futuremark's 3DMark06. Using an Intel Core 2 Extreme QX9650 @ 3GHz, 2GB of DDR2 memory and an ASUS Maximus Formula X38 board, a single GeForce 9800 GTX @ 675/1688/1100MHz scored 14014 marks.
Source: Expreview
Add your own comment

114 Comments on NVIDIA GeForce 9800 GTX Scores 14K in 3DMark06

#76
brian.ca
crow1001IMO, the 06 score and GPUZ shot are fake, for a start the 9800GTX GPU is on the G94 process not the G92 as shown by GPUZ.:wtf:
That's not the case, see www.techpowerup.com/53358/GeForce_9_Series_Roadmap_Updated.html. A higher number doesn't mean a better chip: it's the 9600 that uses the G94 and the 9500 that uses the G96; the rest are all the same G92 used in the 8x00 series cards released last fall/winter.

Trog above pretty much nailed what happened. The G9x chip was Nvidia's next-gen chip, and ATI had two good cards coming out in the 38x0s. As a pre-emptive strike, Nvidia rushed out some versions of the G92 cards as 8x00 series cards (remember how stock supply was such a problem for them? That was probably because of the rush job). The GT and GTS should have been 9800 cards, but for one reason or another (most likely they didn't want to introduce them as the new generation when they may not have been completely ready and the other variants weren't ready yet) they were sold as 8800s. It's less that Nvidia is trying to rebrand an old card and more that they improperly branded a new card. One way or another, though, don't expect too much of a gain on any of the 9x00 series cards over what we saw of the G92 chips among the 8x00 series, because the changes will mostly be tweaks.
ATI pulled the same crap with their HD3xxx series...both companies are guilty of artificially increasing marketing numbers to make their graphics card look like it's a next-generation GPU. Thankfully us enthusiasts know them by their code names, and as long as you don't see a GT200 (formerly G100) or R7xx, we're still dealing with the "old" architecture.
While there's some truth to what you're saying, that's probably not completely fair. First of all, if you remember, ATI did originally plan on releasing those cards as 2x00 series cards. In the end, given all the changes (the die shrink, which lowered power consumption and fixed the considerable heat and noise issues of the original 2k series cards, plus other changes that let them sell the cards considerably cheaper), and the fact that those changes addressed a lot of the major problems with the 2k series, it's hard to blame them for the name change (if I remember correctly they stated that they specifically wanted to distance the new cards from the 2k series).
Posted on Reply
#77
DrunkenMafia
HAAHAAHAAAAHAAAA

That just made me laugh... I mean really really made me laugh. They have a top end rig with this new 9800 card and only get 14k!!!!

Why in God's name is the card so friggin huge if it can only grab 14k, and why is it called 9800!!!

I remember nvidia saying that the 9600GT was supposed to be twice the performance of the previous-gen 8600GT, and it pretty much is, so what the hell happened here!!!! It's barely 1.2 times the performance.

I will be a little surprised if there isn't some issue with drivers or something with this card atm... I was expecting 18k AT LEAST from this new nvidia monster, especially with that setup they are using @ expreview....
Posted on Reply
#78
Black Panther
malwareAfter taking some screenshots with a special version of our GPU-Z utility, the guys over at Expreview decided to run their GeForce 9800 GTX sample through Futuremark's 3DMark06. Using an Intel Core 2 Extreme QX9650 @ 3GHz, 2GB of DDR2 memory and an ASUS Maximus Formula X38 board, a single GeForce 9800 GTX @ 675/1688/1100MHz scored 14014 marks.
Being somewhat of an Nvidia fan-girl I'm sad to see this. With my weak E4300 overclocked to 3GHz, 2GB RAM and an Asus P5B, I get nearly 13K 3DMarks with my 8800GT at the clocks listed in my system specs under my avatar.

There wouldn't be much difference between "nearly 13K" as in my case and "barely 14K" as is the case with the 9800GTX on a more powerful processor and probably faster RAM. If that's supposed to be the next-gen high end, nvidia should have definitely put in much, much more effort. :ohwell:

I'm definitely disappointed. The 'new' 9800GTX apparently is on the same shelf as the old 8800GTX?

As a call to Nvidia, please don't release new stuff unless it can be qualified as better than the stuff already present on the market.

Drunkenmafia: I don't know what they mucked up with the 9600GT. All I can say is that the 8800GT is about 3 times better than the 8600GTS (I had and benchmarked both cards).

I'd have expected a 96XX series to be at least twice as powerful as the 86XX series. But then look at the history: is a 7600 twice as powerful as a 6600?

And what about the ATI counterparts?
Posted on Reply
#79
farlex85
I must say this is disappointing. If those scores truly are to be believed, then there isn't much reason at all to buy a 9800gtx. I guess nvidia is trying to throw out something new to avoid staying stagnant in the marketplace, but come on, most of the people who would spend money on a GTX are enthusiasts, and I don't really see any of the aforementioned running out to get these with scores and prices like that.
Posted on Reply
#80
brian.ca
DrunkenMafiaI remember nvidia saying that the 9600GT was supposed to be twice the performance of the previous-gen 8600GT, and it pretty much is, so what the hell happened here!!!! It's barely 1.2 times the performance.

I will be a little surprised if there isn't some issue with drivers or something with this card atm... I was expecting 18k AT LEAST from this new nvidia monster, especially with that setup they are using @ expreview....
Don't be too surprised... 9600 GT vs 8600 GT = new chip vs. old chip. 9800 GT/X vs. 8800 GT / GTS = new chip vs. same chip a few months later (probably with some small tweaks and revisions).
Posted on Reply
#81
phanbuey
Black PantherBeing somewhat of an Nvidia fan-girl I'm sad to see this. With my weak E4300 overclocked to 3GHz, 2GB RAM and an Asus P5B, I get nearly 13K 3DMarks with my 8800GT at the clocks listed in my system specs under my avatar.
Yeah i get about 13-14K in 3dmark with my OCd 8800GT as well... but hey, at least there is no need to upgrade right? :laugh:
Posted on Reply
#83
pentastar111
newtekie13 of them will destroy a 3870x2, but yeah, triple-SLI is pretty much the only thing this card has going for it. Not to mention that there probably isn't a whole lot of head room for overclocking with the 9800GTX. Maybe 1200MHz RAM? Or perhaps even the same RAM used on the 8800GTS512 but with loosened timings?

If nVidia was going to do this, they should have made the 8800GTS512 the 9800GTS instead. I don't know what nVidia is doing.

The funny thing is that with my 8800GTS 512 and my CPU only at 2.7GHz I just scored 14016 in 3DMark06...
Yea, BUT will 3 of them destroy two 3870X2's...hmmm....
Posted on Reply
#84
OnBoard
This is the best news ever: if the 9800GTX is so "weak", the 9800GT must be weaker and my under-a-week-old 8800GT isn't obsolete already =) 11k stock in Vista x64, have to sink it before I start OCing.

The 8800GT and 8800GTS should become very short-lived, kinda like the x1900 vs the x1950.
Posted on Reply
#85
hv43082
Guess I will not need to update my 8800GTX. I score roughly the same with my E6400 @ 3.2GHz. Then again, who cares about benchmarks? How does it perform in games at 2560x1600???
Posted on Reply
#87
Black Panther
phanbueyYeah i get about 13-14K in 3dmark with my OCd 8800GT as well... but hey, at least there is no need to upgrade right? :laugh:
Sure, we can feel smug about that. :laugh:

(Though I have to admit that I had been planning that by the end of this year I'd hand down my pc to my daughter and hence have a nice good excuse to get a quad core with a 64 bit OS, 4GB RAM, a solid state HDD if they get cheaper and a ... 9800GTX ...)
Dream got shot now. :ohwell:
Posted on Reply
#88
Tatty_Two
Gone Fishing
DrunkenMafiaHAAHAAHAAAAHAAAA

That just made me laugh... I mean really really made me laugh. They have a top end rig with this new 9800 card and only get 14k!!!!

Why in God's name is the card so friggin huge if it can only grab 14k, and why is it called 9800!!!

I remember nvidia saying that the 9600GT was supposed to be twice the performance of the previous-gen 8600GT, and it pretty much is, so what the hell happened here!!!! It's barely 1.2 times the performance.

I will be a little surprised if there isn't some issue with drivers or something with this card atm... I was expecting 18k AT LEAST from this new nvidia monster, especially with that setup they are using @ expreview....
Agreed, ffs I get 17,211 on a GTS. If this thing doesn't overclock as well as the GTS then there will be so little difference (same SPs etc); certainly at GTS overclocked speeds it will be a total waste of time.....and money!
Posted on Reply
#89
Tatty_Two
Gone Fishing
brian.caThat's not the case, see www.techpowerup.com/53358/GeForce_9_Series_Roadmap_Updated.html. A higher number doesn't mean a better chip: it's the 9600 that uses the G94 and the 9500 that uses the G96; the rest are all the same G92 used in the 8x00 series cards released last fall/winter.

Trog above pretty much nailed what happened. The G9x chip was Nvidia's next-gen chip, and ATI had two good cards coming out in the 38x0s. As a pre-emptive strike, Nvidia rushed out some versions of the G92 cards as 8x00 series cards (remember how stock supply was such a problem for them? That was probably because of the rush job). The GT and GTS should have been 9800 cards, but for one reason or another (most likely they didn't want to introduce them as the new generation when they may not have been completely ready and the other variants weren't ready yet) they were sold as 8800s. It's less that Nvidia is trying to rebrand an old card and more that they improperly branded a new card. One way or another, though, don't expect too much of a gain on any of the 9x00 series cards over what we saw of the G92 chips among the 8x00 series, because the changes will mostly be tweaks.



While there's some truth to what you're saying, that's probably not completely fair. First of all, if you remember, ATI did originally plan on releasing those cards as 2x00 series cards. In the end, given all the changes (the die shrink, which lowered power consumption and fixed the considerable heat and noise issues of the original 2k series cards, plus other changes that let them sell the cards considerably cheaper), and the fact that those changes addressed a lot of the major problems with the 2k series, it's hard to blame them for the name change (if I remember correctly they stated that they specifically wanted to distance the new cards from the 2k series).
A lot of sensible stuff there, pretty much close to the truth I would guess. I just can't understand (apart from the heat and power consumption issues) why ATi didn't wait a short while to bring out the HD3870 at least and make it more competitive; I mean, in one or two benches it still gets beaten by the 2900XT. If they had introduced the 3850 when they did.....fine, that's the star card IMO and it filled a niche that NVidia couldn't compete with. Then, given an extra bit of time to improve the 3870 even further, I think they would have had greater success against the likes of the 8800GT and GTS.
Posted on Reply
#90
warhammer
Well that sux for a GTX, but looking at the info in the JPEG it just looks sus for a CPU @ 3GHz. But I could be wrong.
Posted on Reply
#91
Unregistered
There is something definitely wrong here. I'm going to start off with that.

If this is the actual card then there are serious driver issues. I could also believe that this isn't the card and what expreview got a hold of was perhaps the 9800GT... Esp. because of those numbers: they are nearly the same as the 8800GTS 512MB's.

If this is the card and its specs and numbers are correct then shame on you, nvidia. That is pitiful. I understand this is simply a revision, but 14000 is barely an improvement over an 8800 Ultra/GTX (which this card is meant to replace). And 14000 is definitely no reason to get this over an 8800GTS 512MB, which is only about $280 by now.

I would laugh so hard if:
A) This is all BS or inaccurate and you all are giving nvidia a hard time over nothing
or
B) This card overclocks like a beast

If this card really is this bad then I'll prolly find myself continuing my tradition of wanting an nvidia card but then buying an ATI card in the end for whatever reason. If I don't go with an ATI card then I'll probably get a 9800GX2, just as long as I can get one for well under the MSRP ($600).

Let's just keep our fingers crossed...

-Indybird
Posted on Edit | Reply
#92
[I.R.A]_FBi
indybirdThere is something definitely wrong here. I'm going to start off with that.

If this is the actual card then there are serious driver issues. I could also believe that this isn't the card and what expreview got a hold of was perhaps the 9800GT... Esp. because of those numbers: they are nearly the same as the 8800GTS 512MB's.

If this is the card and its specs and numbers are correct then shame on you, nvidia. That is pitiful. I understand this is simply a revision, but 14000 is barely an improvement over an 8800 Ultra/GTX (which this card is meant to replace). And 14000 is definitely no reason to get this over an 8800GTS 512MB, which is only about $280 by now.

I would laugh so hard if:
A) This is all BS or inaccurate and you all are giving nvidia a hard time over nothing
or
B) This card overclocks like a beast

If this card really is this bad then I'll prolly find myself continuing my tradition of wanting an nvidia card but then buying an ATI card in the end for whatever reason. If I don't go with an ATI card then I'll probably get a 9800GX2, just as long as I can get one for well under the MSRP ($600).

Let's just keep our fingers crossed...

-Indybird
either way .. why get worked up ...
Posted on Reply
#93
DrunkenMafia
I wonder if they really are out of ideas. I mean, when you think about it, neither ATI nor Nvidia has released a card that can break 15k in 06, apart from the 3870x2, but that is a twin-core card so we will leave it out of the equation. And I don't mean OCing either, I mean straight out of the box, single-core, gfx card.....

Maybe both companies really haven't got anything faster atm. :)

BUT... At least ATI are not releasing newer-model cards with more or less the same performance, that is just ridiculous...

If I went out and bought a HD3870 and it performed only 5% better than a X1950 I would be pissed.... I think the same goes for nvidia..
Posted on Reply
#94
The Nemesis
Though the 3DMark score is not overly impressive, it won't be, considering the operating system was Vista. Everyone saying they can hit over 14,000 with a GT or GTS: how many have done so easily using Vista? I could, using a 640MB GTS and a quad core clocked @ 3.6GHz with the GPU @ 700MHz. The 9800GTX score was done @ stock. It will be nice to see what it will do overclocked on XP with a quad core @ 4GHz :)
Posted on Reply
#95
phanbuey
The NemesisThough the 3DMark score is not overly impressive, it won't be, considering the operating system was Vista. Everyone saying they can hit over 14,000 with a GT or GTS: how many have done so easily using Vista? I could, using a 640MB GTS and a quad core clocked @ 3.6GHz with the GPU @ 700MHz. The 9800GTX score was done @ stock. It will be nice to see what it will do overclocked on XP with a quad core @ 4GHz :)
probably not much more than an overclocked 8800GTS 512 with a quad core @ 4GHz in XP. ... now compare that difference with the 8800GTX vs 7900GTX, or 7900GTX vs 6800 Ultra, or 6800 Ultra vs the 5***FX...
Posted on Reply
#96
DarkMatter
When Tri-SLI was unveiled, I remember many people around the net (many here on TPU) were crying about the 8800 GT/GTS not being able to do it. Now Nvidia is going to release those same cards with Tri-SLI capability, plus some minor enhancements (around 7% faster clock for clock on 3DMark, ~13000 vs ~14000 *), and people are crying again.
Many people here say that they do 14000+ but either they have the CPU at 3.6GHz+ or the card at 750MHz+. Of course they will be faster that way!!

That being said, Nvidia screwed up with the naming again **. But remember this is not GT200, which will come in Q3 this year IIRC. These cards are no more than a Tri-SLI-capable refresh until GT200 comes out. I don't think anybody bought an X1950 if they already owned an X1900, right? And so?

*Indeed, if you take the CPU score out of the equation:

13000 - 4600 = 8400
14000 - 4600 = 9400

9400/8400 ≈ 1.12

So basically, we could say that the 9800GTX is about 12% faster than the 8800GTS, and that's without launch drivers. Granted, it's not 2x as powerful as the 8800, but I think that's not bad for a refresh. And like many refreshes in history, it's not meant to replace your beloved 8800 in your rig but on the market. You don't have to buy it if you don't want to, do you?
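The subtraction above can be sketched in a few lines. It is only a back-of-the-envelope estimate: the ~4600 CPU figure comes from the thread, and 3DMark06's overall score is not a simple sum of GPU and CPU parts, so subtracting the CPU score is just a rough approximation.

```python
# Back-of-the-envelope GPU-only comparison, following the post's method.
# Assumption: ~4600 is the CPU contribution (figure quoted in the thread);
# 3DMark06's real total is a weighted combination, so this is approximate.
cpu_score = 4600
gts_total = 13000         # approx. 8800 GTS system total
gtx_total = 14000         # approx. 9800 GTX system total

gts_gpu = gts_total - cpu_score   # 8400
gtx_gpu = gtx_total - cpu_score   # 9400
gain = gtx_gpu / gts_gpu

print(f"GPU-only gain: {gain:.2f}x")   # -> GPU-only gain: 1.12x
```

Stripping the CPU portion roughly doubles the apparent gap between the cards compared with the raw totals (~7.7% becomes ~12%), which is why the footnote isolates it.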

** Anyway, about the naming: it's possible that they didn't have any choice but to change it. I don't know, but I know many non-techie guys who think that Ati is a step ahead because they released the HD3000 series. And since the 8800 was in competition with the HD2000 series, they truly believe the HD3000 series is almost twice as fast as the HD2000, following the tradition. Until I corrected them, of course. Nvidia is just playing the same game. A game that nobody wants but...
Posted on Reply
#97
Wile E
Power User
The NemesisThough the 3DMark score is not overly impressive, it won't be, considering the operating system was Vista. Everyone saying they can hit over 14,000 with a GT or GTS: how many have done so easily using Vista? I could, using a 640MB GTS and a quad core clocked @ 3.6GHz with the GPU @ 700MHz. The 9800GTX score was done @ stock. It will be nice to see what it will do overclocked on XP with a quad core @ 4GHz :)
I'm willing to bet my OCed GT easily exceeds 14K in Vista.
Posted on Reply
#98
HaZe303
With a slight OC on my (8800) GTX I score somewhere around 14000, so I definitely won't buy a 9800gtx if these scores hold on final release cards. The only thing I'm looking forward to right now is AMD's 48x0 series. Booooo for Nvidia, trying to get away with these cheap tricks, trying to sell us G8x cards in a new package and name. :(
Posted on Reply
#99
warhammer
The NemesisThough the 3DMark score is not overly impressive, it won't be, considering the operating system was Vista. Everyone saying they can hit over 14,000 with a GT or GTS: how many have done so easily using Vista? I could, using a 640MB GTS and a quad core clocked @ 3.6GHz with the GPU @ 700MHz. The 9800GTX score was done @ stock. It will be nice to see what it will do overclocked on XP with a quad core @ 4GHz :)
I have done it in Vista with the GTX and GTS video cards, CPU @ 3.6GHz. The article shows their CPU @ 3GHz; that's not right, look at their CPU score of 4597.
Posted on Reply
#100
Wile E
Power User
warhammerI have done it in Vista with the GTX and GTS video cards, CPU @ 3.6GHz. The article shows their CPU @ 3GHz; that's not right, look at their CPU score of 4597.
A quad at 3.6GHz in XP does about 5500; mine @ 3.87GHz does about 6100. So a score of 4600 @ 3GHz sounds about right for Vista.
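That sanity check can be spelled out, assuming (as a simplification) that the 3DMark06 CPU score scales roughly linearly with clock speed; the 5500 figure is the one quoted in the post:

```python
# If a quad @ 3.6GHz scores ~5500 in the CPU test, a linear-scaling
# estimate for the same chip @ 3.0GHz (simplifying assumption):
score_at_3_6 = 5500
estimated_at_3_0 = score_at_3_6 * 3.0 / 3.6

print(round(estimated_at_3_0))   # -> 4583, close to the 4597 in the screenshot
```

So the 4597 CPU score in the Expreview shot is consistent with a quad at stock-ish 3GHz rather than evidence of a faked run.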
Posted on Reply