# NVIDIA GeForce 9800 GTX Scores 14K in 3DMark06



## malware (Feb 26, 2008)

After taking some screenshots with a special version of our GPU-Z utility, the guys over at Expreview have decided to take their GeForce 9800 GTX sample and give it a try in Futuremark 3DMark06. Using an Intel Core 2 Extreme QX9650 @ 3 GHz, 2 GB of DDR2 memory, an ASUS Maximus Formula X38 and a single GeForce 9800 GTX @ 675/1688/1100 MHz, the result is 14,014 marks.



 



*View at TechPowerUp Main Site*


----------



## choppy (Feb 26, 2008)

From that score and the price premium this card will probably carry, it's definitely not worth it IMO.


----------



## [I.R.A]_FBi (Feb 26, 2008)

meh, nvidia is toking ...


----------



## jagalino (Feb 26, 2008)

9800GTX = 8800GTS + overclocked memory, with triple SLI.
I got 14,700 with my 8800GTS at 760MHz.

GT200, please come soon!!!!!!!!


----------



## Aeon19 (Feb 26, 2008)

OMG!! This makes me fall in love with my 8800 GTS 512 (G92)!!


----------



## Aeon19 (Feb 26, 2008)

jagalino said:


> 9800GTX = 8800GTS + overclocked memory, with triple SLI.
> I got 14,700 with my 8800GTS at 760MHz.
> 
> GT200, please come soon!!!!!!!!



Yeah, that's right!! And that's with a QX9650. 

I did *13,756* with my E6750 @ 3.40 GHz and an 8800 GTS 512 @ 740/1850/1090.

OMG I LOVE MY CARD!!


----------



## vega22 (Feb 26, 2008)

lol i can beat that too.


----------



## crow1001 (Feb 26, 2008)

NV are having a laugh if this is the performance of the 9800GTX; you may as well buy a GTS or even a GT and clock it.


----------



## Aeon19 (Feb 26, 2008)

marsey99 said:


> lol i can beat that too.



Hey, that was with everything running in the background (MSN, Mozilla, Skype, TS, AVG antivirus + antispyware...) and drivers set to "quality"..

No excuses about 3DMark... I just wanted to see what I could do with my config at more or less default settings... 

And note that I have a 400W PSU (brand: "Trident"  )


----------



## Gambit_ZA (Feb 26, 2008)

So it's a weaker 8800GTX? Memory bandwidth seems to be about 20GB/s less.
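Rough back-of-the-envelope numbers bear that out. A quick sketch: the 256-bit/1100 MHz figures are from the GPU-Z shots, while the 8800 GTX's stock 900 MHz GDDR3 on a 384-bit bus is my recollection, so treat this as an estimate.

```python
# Peak theoretical memory bandwidth = bus width (bytes) x effective data rate.
# GDDR3 is double data rate, so a 1100 MHz memory clock moves 2200 MT/s.
def bandwidth_gbs(bus_bits: int, effective_mts: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_bits / 8 * effective_mts / 1000

gf9800gtx = bandwidth_gbs(256, 2 * 1100)  # 9800 GTX: 256-bit @ 1100 MHz
gf8800gtx = bandwidth_gbs(384, 2 * 900)   # 8800 GTX: 384-bit @ 900 MHz

print(gf9800gtx)                         # 70.4
print(gf8800gtx)                         # 86.4
print(round(gf8800gtx - gf9800gtx, 1))   # 16.0
```

So the on-paper deficit versus the old GTX works out to roughly 16 GB/s at these clocks, in the ballpark of the ~20 GB/s figure above.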


----------



## Aeon19 (Feb 26, 2008)

Gambit_ZA said:


> So it's a weaker 8800GTX? Memory bandwidth seems to be about 20GB/s less.



It seems...


----------



## wolf (Feb 26, 2008)

It's not very invigorating, is it chaps? Hell, my overclocked 8800GT beats that SM2 and SM3 score... just not the CPU score.

Heck, if I had a quad at 3GHz my score would be over 14k, I can tell you that! Considering I already beat the SM2/3 scores..


----------



## newtekie1 (Feb 26, 2008)

So basically it is a binned 8800GTS 512 clocked higher by default.  Why are you all surprised its performance is similar to other G92 cards?


----------



## Aeon19 (Feb 26, 2008)

newtekie1 said:


> So basically it is a binned 8800GTS 512 clocked higher by default.  Why are you all surprised its performance is similar to other G92 cards?



because of the number in the name *9*800


----------



## allen337 (Feb 26, 2008)

I'm surprised they have the balls to show off a score like that after making people wait so long. I would be embarrassed.  ALLEN


----------



## Aeon19 (Feb 26, 2008)

allen337 said:


> I'm surprised they have the balls to show off a score like that after making people wait so long. I would be embarrassed.  ALLEN



yeah ok...but it has got the 3 Way SLI!!!!!!!!! 

-_-


----------



## allen337 (Feb 26, 2008)

Aeon19 said:


> yeah ok...but it has got the 3 Way SLI!!!!!!!!!
> 
> -_-



Buy 3 of them and see if it's worth it. You've got people scoring 20,000 on a 3870X2 for less money. Not to mention the 8800GTS (G92).  ALLEN


----------



## tkpenalty (Feb 26, 2008)

9800GTX = 8800GTS 512MB with better memory and clock speeds.... DEFINITELY not worth it, as OC-edition 8800GTS 512MBs are the same thing. 3-way SLI... Cheh. Nvidia.... what on earth are you guys doing?!?!??! Stop chucking an AMD!


----------



## jbunch07 (Feb 26, 2008)

allen337 said:


> Buy 3 of them and see if it's worth it. You've got people scoring 20,000 on a 3870X2 for less money. Not to mention the 8800GTS (G92).  ALLEN



agreed


----------



## btarunr (Feb 26, 2008)

Extremely disappointed in NVidia. This turned out to be an 8800 GTS (G92) + slightly higher clocks + two gold fingers. :shadedshu

There's absolutely nothing in it to look up to. There's no word on a DX 10.1 implementation, and all those "256 SP; 1 GB GDDR5" rumours were pure BS.


----------



## philbrown23 (Feb 26, 2008)

To me this isn't news. I actually think it's kind of pathetic! "The 9800GTX gets a lower score than the 8800GTS 512" is what the title should be. The BFG GTS 512 in my sig gets 14,679 at stock settings and can break the 15k barrier with a small OC. So this, to me, is not impressive at all.


----------



## newtekie1 (Feb 26, 2008)

allen337 said:


> Buy 3 of them and see if it's worth it. You've got people scoring 20,000 on a 3870X2 for less money. Not to mention the 8800GTS (G92).  ALLEN



3 of them will destroy a 3870X2, but yeah, triple-SLI is pretty much the only thing this card has going for it.  Not to mention that there probably isn't a whole lot of headroom for overclocking with the 9800GTX.  Maybe 1200MHz RAM?  Or perhaps even the same RAM used on the 8800GTS 512 but with loosened timings?

If nVidia was going to do this, they should have made the 8800GTS512 the 9800GTS instead.  I don't know what nVidia is doing.

The funny thing is that with my 8800GTS 512 and my CPU only at 2.7GHz I just scored 14016 in 3DMark06...


----------



## Azazel (Feb 26, 2008)

Wow, this is bad for Nvidia... meh.


----------



## wolf (Feb 26, 2008)

They might have one saving grace, the price..... you had better get what you pay for with this card.... and hopefully some o/cing headroom. But seriously, this is pathetic; they could have EASILY clocked it at 800 core / 2000 shaders / 2500 memory and given us SOMETHING!!!


----------



## strick94u (Feb 26, 2008)

If I remember correctly, the 8800 Ultra at stock clocks scored about the same with a quad core, and those now cost 700 bucks. As long as these new cards cost less, they're better than or equal to an Ultra. What's wrong with that?


----------



## btarunr (Feb 26, 2008)

newtekie1 said:


> 3 of them will destroy a 3870x2



How about 3 of them against 2 HD3870 X2 cards?

Assuming you get these cards for $350 each, $1050 for three, whereas two HD3870-X2 units cost 2x $440 = $880. Hmmm.
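Parameterized, that comparison is just the following. (The $350 and $440 figures are the assumed prices above, not confirmed street prices.)

```python
# Cost of each hypothetical multi-GPU setup, using the assumed prices above.
def setup_cost(card_price: int, cards: int) -> int:
    return card_price * cards

tri_sli = setup_cost(350, 3)    # three 9800 GTX
quad_fire = setup_cost(440, 2)  # two HD3870 X2

print(tri_sli)              # 1050
print(quad_fire)            # 880
print(tri_sli - quad_fire)  # 170
```

So the green setup carries a $170 premium before it has even proven it can outrun the two X2s.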


----------



## [I.R.A]_FBi (Feb 26, 2008)

NV would get pwned


----------



## Azazel (Feb 26, 2008)

btarunr said:


> How about 3 of them against 2 HD3870 X2 cards?
> 
> Assuming you get these cards for $350 each, $1050 for three, whereas two HD3870-X2 units cost 2x $440 = $880. Hmmm.



That's what I was going to say.


----------



## Azazel (Feb 26, 2008)

ATI are going to have Nvidia beat for a couple of years...


----------



## tofu (Feb 26, 2008)

azazel said:


> ATI are going to have Nvidia beat for a couple of years...



Just a thought, not being superstitious or anything.

The number *9800*, as in the *Radeon 9800* , when ATI smacked nVidia around with their R300-series core.

Now that nVidia's using this nomenclature, it does not bode well for them.


----------



## strick94u (Feb 26, 2008)

[I.R.A]_FBi said:


> NV would get pwned



If you click the link in my sig you will notice I have not been pwned by an ATI X2. I have been beaten by some, but all in all I think 2 G92s in SLI hold their own. fits beats mine by about 1000 points, but the second-highest X2 score is lower, and that's on a quad core while I'm on a 2-core CPU (alcopone's 3DMark06 page). I don't see why y'all have a problem with 14,001 at stock speeds.


----------



## Kreij (Feb 26, 2008)

> After taking some screenshots with a *special version* of our GPU-Z utility



What's special about their version of GPU-Z ?
Just wondering.


----------



## mdm-adph (Feb 26, 2008)

So, it's about as fast as a fairly overclocked HD 3870.  Meh.

What I want to know is:  why wasn't the 8800 GT called the 9800 GT first?  Wasn't it a completely different core from the rest of the 8800's? (old GTS, GTX, etc.)


----------



## Gambit_ZA (Feb 26, 2008)

Just a quote I read, for people that might be wondering about the reason for going back 256bit interface:

"Don't bother getting upset with the 256-bit memory bus because it will not become any kind of bottleneck. The only time it might be even close to that is when you're pouring liquid nitrogen into the container and have the GPU running at +50% of stock frequency, which will happen though."


----------



## btarunr (Feb 26, 2008)

Kreij said:


> What's special about their version of GPU-Z ?
> Just wondering.



Maybe it's a pre-release given to them with the database entry for the 9800 GTX included. It's just the good ol' G92.


----------



## JRMBelgium (Feb 26, 2008)

This just can't be true.

Nvidia should name it 8900GTX if this is actually true...


----------



## JrRacinFan (Feb 26, 2008)

Meh, like all cards, initial driver release. I doubt that drivers would help improve it though. This is really bad news for the green camp. 

If they don't do something soon about future cards and releases, they might, no wait *will*, find themselves behind in competition with the red camp.

EDIT: Speaking of the 9800GX2 and theory, I wonder how these cards would scale in SLI? Better than previous cards maybe?


----------



## newtekie1 (Feb 26, 2008)

btarunr said:


> How about 3 of them against 2 HD3870 X2 cards?
> 
> Assuming you get these cards for $350 each, $1050 for three, whereas two HD3870-X2 units cost 2x $440 = $880. Hmmm.



True, but I guess that is why nVidia is releasing the 9800GX2 to compete with the HD3870 X2.

But who knows, 3 9800GTX cards might beat 2 HD3870X2 cards, until we see the actual card releases and what they do in Tri-SLI we won't know.



azazel said:


> ATI are going to have Nvidia beat for a couple of years...



How so?  Because ATi has a card out on the market that is in the top spot?  Remember, that card uses two cores to achieve that.  NVidia still has the fastest single core on the market, and will soon have the fastest double core also.  So I don't see why you are saying ATi will have nVidia beat.  I'll believe that when ATi can produce a single core that can outpace nVidia's single core.  Not a double cored card outpacing a single.


----------



## JrRacinFan (Feb 26, 2008)

Newtekie, in performance yes, nVidia does have upper hand. When it comes to budget, ATi comes to mind right now.


----------



## GSG-9 (Feb 26, 2008)

tofu said:


> Just a thought, not being superstitious or anything.
> 
> The number *9800*, as in the *Radeon 9800* , when ATI smacked nVidia around with their R300-series core.
> 
> Now that nVidia's using this nomenclature, it does not bode well for them.



I would buy an Nvidia 9800 just to pay tribute to my 9800 
I hope ATi pulls something out now, like that quad CrossFire they talked about for a while, or 3-way CrossFire on Nvidia chipsets or something. It's not going to happen, but it would be good to see AMD/ATi getting ahold of the market again, seeing as AMD is now worth less than what it paid for ATi.


----------



## mdm-adph (Feb 26, 2008)

Not to mention that most buyers couldn't care less how many cores are in their card as long as it's the fastest...


----------



## SK-1 (Feb 26, 2008)

Time for ATI to go for the Nvidia jugular vein!


----------



## newtekie1 (Feb 26, 2008)

JrRacinFan said:


> Newtekie, in performance yes, nVidia does have upper hand. When it comes to budget, ATi comes to mind right now.



How so?  With the 8800GT handily outperforming the 3870, I don't see where you are getting that. The 8800GT has a better price:performance ratio, a better performance-per-watt ratio, and handily outperforms the 3870 at ~$20 more.

The 9800GX2 will compete with the 3870X2; in fact it should hand the 3870X2 its ass, assuming the recently "leaked" specs are true.  It will probably cost more too, but performance comes at a price.  The 3870X2 costs more than the 8800GTS 512 too, and is the only offering from ATi that can outperform it.


----------



## GSG-9 (Feb 26, 2008)

newtekie1 said:


> The 8800GT has a better price:performance ratio, a better performance-per-watt ratio, and handily outperforms the 3870 at ~$20 more.
> 
> The 9800GX2 will compete with the 3870X2; in fact it should hand the 3870X2 its ass, assuming the recently "leaked" specs are true.  It will probably cost more too, but performance comes at a price.  The 3870X2 costs more than the 8800GTS 512 too, and is the only offering from ATi that can outperform it.




A 3870 is $180...

I see the GT is now $200; I'm glad Nvidia is bringing their (lower high-end) prices to a competitive point. I would think the 9800GX2 will perform 400-800 points better at stock than a 3870X2 (these numbers have no bearing, just a gut feeling), but its cost will be much higher. 

I still give ATi props for spearheading the price drop with the 38xx series


----------



## 1c3d0g (Feb 26, 2008)

Aeon19 said:


> because of the number in the name *9*800



ATI pulled the same crap with their HD*3*xxx series... both companies are guilty of artificially inflating marketing numbers to make their graphics cards look like next-generation GPUs. Thankfully we enthusiasts know them by their code names, and as long as you don't see a GT200 (formerly G100) or R7xx, we're still dealing with the "old" architecture.


----------



## trog100 (Feb 26, 2008)

what's happened is very simple.. to take the wind out of ati's 3xxx series cheapo launch.. the green team brought forward two 9xxx series cards at relatively low prices.. the 8800gt and the 8800gts...

this worked very well.. but now the two fake 8800 cards are being presented in their proper place it's not looking so good..

the 8800gt being re-presented as the new 9600 is clever but the 9800 cards are just a let down..  they have to be.. there is nothing left in the hat to pull out..

also ati didn't cheat.. producing the same performance for half the power draw and heat is a new generation chip.. with multiple gpus heat and power draw is the key factor..  

trog


----------



## GSG-9 (Feb 26, 2008)

1c3d0g said:


> ATI pulled the same crap with their HD*3*xxx series...both companies are guilty of artificially increasing marketing numbers to make their graphics card look like it's a next-generation GPU. Thankfully us enthusiasts know them by their code names, and as long as you don't see a GT200 (formerly G100) or R7xx, we're still dealing with the "old" architecture.



Its true, and its crap.


----------



## [I.R.A]_FBi (Feb 26, 2008)

1c3d0g said:


> ATI pulled the same crap with their HD*3*xxx series...both companies are guilty of artificially increasing marketing numbers to make their graphics card look like it's a next-generation GPU. Thankfully us enthusiasts know them by their code names, and as long as you don't see a GT200 (formerly G100) or R7xx, we're still dealing with the "old" architecture.



at least there was a die shrink ... and reduction in power use and heat output ...


----------



## EastCoasthandle (Feb 26, 2008)

This is funny: which is it for the G92, an 8 series or a 9 series?  It was an 8 series when they released the 8800GT and 8800GTS, but now it's a 9 series when they release a 9800.  Is the wool being pulled here or what? Regardless of what tweaks/revisions are used, I can't find any information showing how the G92 in a GTS differs from the G92 in a 9800.


----------



## GSG-9 (Feb 26, 2008)

[I.R.A]_FBi said:


> at least there was a die shrink ... and reduction in power use and heat output ...



That's true, but when they did that in the past it was the 7900 series, not the 8800 series.


----------



## JrRacinFan (Feb 26, 2008)

newtekie1 said:


> How so?  With the 8800GT handily outperforming the 3870 I don't see where you are getting that?  The 8800GT, has a better Priceerformance ratio, a beter Performance:Watt ratio, and handily outperforms the 3870 at ~$20 more.
> 
> The 9800GX2 will compete with the 3870X2, in fact it should hand the 38070X2 its ass, assuming the recent specs "leaked" are true.  It will probably cost more too, but performance comes at a price.  The 3870X2 costs more then the 8800GTS512 too, and is the only offering from ATi that can outperform it.



I was thinking more along the lines of the HD3850 and 8800GS (sub-$150 cards).


----------



## Psychoholic (Feb 26, 2008)

WTF?  My lowly 2900 Pro overclocked does 13,7xx

Maybe the gtx will have outstanding o/c abilities.


----------



## newtekie1 (Feb 26, 2008)

EastCoasthandle said:


> This is funny, which is it for the G92 an 8 series or 9 series?  It's an 8 series when they released the 8800GT and 8800GTS but now it's a 9 series when they release a 9600.  Is the wool being pull here or what? Regardless of what tweaks/revisions are used (I can't find any information to show how a G92 from a GTS is different from a G92 from a 9600).



You mean 9800, the 9600 uses a G94 core.  The tweaks aren't to the core, at least I don't think they are, I think they are more to the PCB to allow tri-SLI, and perhaps higher clocks.



JrRacinFan said:


> I was thinking more along the lines of the HD3850 and 8800GS (sub-$150 cards).



The HD3850 isn't a sub-$150 card.  They are sub-$170 cards at best right now, the cheapest one on newegg is $169.99+Shipping.  At that price point the 9600GT on nVidia's side also outperforms the HD3850 for only $10 more.  Of course if you really want a deal the 8800GS is actually a sub-$150 card, going for $149.99+Free Shipping right now on newegg and performance wise the 8800GS is pretty close to equal the HD3850.  So I still don't see where you are going with this.


----------



## Silverel (Feb 26, 2008)

"Gale force winds are expected today throughout the southwest of the country as a massive sigh of relief is let out by AMD, as their rival nVidia has re-re-released yet another variant of the 8800GT that isn't really much better than the original...More news at 11!"


----------



## jbunch07 (Feb 26, 2008)

Silverel said:


> "Gale force winds are expected today throughout the southwest of the country as a massive sigh of relief is let out by AMD, as their rival nVidia has re-re-released yet another variant of the 8800GT that isn't really much better than the original...More news at 11!"


----------



## JrRacinFan (Feb 26, 2008)

newtekie1 said:


> The HD3850 isn't a sub-$150 card.  They are sub-$170 cards at best right now, the cheapest one on newegg is $169.99+Shipping.  At that price point the 9600GT on nVidia's side also outperforms the HD3850 for only $10 more.  Of course if you really want a deal the 8800GS is actually a sub-$150 card, going for $149.99+Free Shipping right now on newegg and performance wise the 8800GS is pretty close to equal the HD3850.  So I still don't see where you are going with this.



Meh, I am just saying I am surprised they released a card that, in my eyes, doesn't perform. Oh, and... http://www.newegg.com/Product/Product.aspx?Item=N82E16814161211


----------



## Xolair (Feb 26, 2008)

Oh no, *Nvidia*'s starting to lose it...



... perhaps not, but that result still seems quite pathetic. Indeed, you could just get an *8800 GTS* and juice it up to the same level. :shadedshu


----------



## [I.R.A]_FBi (Feb 26, 2008)

ATi, this is the moment you've been waiting for ....


----------



## newtekie1 (Feb 26, 2008)

JrRacinFan said:


> Meh, I am just saying, I am surprised at how they released a card in my eyes that doesn't perform. Oh and ...http://www.newegg.com/Product/Product.aspx?Item=N82E16814161211



Yes, technically $150, but also only 256MB and out of stock.  The 256MB version doesn't even come close to competing with the 8800GS in performance.  The 8800GS is overall about 7% faster than the HD3850 256MB, and is the same price.  So I still don't see where you are coming from.

I think we can all agree the 9800GTX isn't exactly the card we were expecting.  But saying that nVidia is beat because of a single card is kind of a big leap.  Especially when that single card still outperforms the offerings of ATi.


----------



## EastCoasthandle (Feb 26, 2008)

*(attached image: GPU-Z spec comparison of the G92 cards)*
----------



## trog100 (Feb 26, 2008)

apart from green team supporters not having a new super toy to play with i reckon it's all right.. green keep winning the performance race but not by too much.. ati keep cheap which means green have to keep cheap.. i can keep buying cheaper red and green fans can keep buying cheaper green..

high end is gonna be multiple gpus and multiple cards.. two or three x2 cards red or green..

trog


----------



## Tatty_One (Feb 26, 2008)

EastCoasthandle said:


>



Am I the only one that finds this a bit odd?  Just look at the texture fillrate for the 9800GTX; how is it so high in comparison with the other 2 G92 cards? Same ROPs, similar clocks to the 8800GTS, albeit a bit higher memory, but how come three times the fillrate?  Am I missing something here or what?


----------



## JrRacinFan (Feb 26, 2008)

Look at the bus interface tatty. Do you think that would make the difference? **puzzled**


----------



## Grings (Feb 26, 2008)

That does look odd; however, the 9800's is the most realistic of the 3, given that my 'old nail' G80 manages 30 GTexels. Do GTs normally only get 9.6?


----------



## Tatty_One (Feb 26, 2008)

JrRacinFan said:


> Look at the bus interface tatty. Do you think that would make the difference? **puzzled**



No, I don't think so. I actually thought that was "odd" also; they are both PCI-E 2.0 cards with 16x enabled, so I cannot see any possible factor there. I can find nothing that would give a texture fillrate more than three times the 8800GTS's: same ROPs, same number of SPs, etc. As far as I was aware, SPs and ROPs coupled with shader engine speeds were the key to texture fillrate, and this does not make any sense to me, but I may well be missing something vital.

Don't get me wrong..... I am not suggesting this is fake, I am just a little suspicious based on what I can see.......... until someone more knowledgeable than me in this field comes up with a sensible explanation of what I am missing, and then of course my mind will be at rest
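For what it's worth, peak texture fillrate should just be texture units × core clock; the bus interface shouldn't enter into it at all. A quick sketch (the 64-TMU figure for a full G92 is my assumption, not something from the screenshot):

```python
# Theoretical peak bilinear texel fillrate = TMU count x core clock.
def texel_fillrate(tmus: int, core_mhz: int) -> float:
    """Peak texture fillrate in GTexel/s."""
    return tmus * core_mhz / 1000

print(texel_fillrate(64, 675))  # 9800 GTX @ 675 MHz     -> 43.2
print(texel_fillrate(64, 650))  # 8800 GTS 512 @ 650 MHz -> 41.6
```

At similar clocks, two full G92 cards should read within a few percent of each other, so a 3x gap between them looks like a readout artifact rather than real hardware.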


----------



## newtekie1 (Feb 26, 2008)

The older versions of GPU-Z read the texture fillrate wrong.


----------



## btarunr (Feb 26, 2008)

Tatty_One said:


> Am I the only one that finds this a bit odd?  Just look at the texture fillrate for the 9800GTX; how is it so high in comparison with the other 2 G92 cards? Same ROPs, similar clocks to the 8800GTS, albeit a bit higher memory, but how come three times the fillrate?  Am I missing something here or what?



The 8800 GTS is using just two PCI-E 2.0 lanes.


----------



## Wile E (Feb 26, 2008)

Well, in reference to the OP, my 8800GT scores higher in both SM2 and 3 than this card. I expected more. I thought they had a new beast coming, but this is more a mild tweak.


----------



## Wile E (Feb 26, 2008)

btarunr said:


> The 8800 GTS is using just two PCI-E 2.0 lanes.



No, the high texture fillrate is just a GPU-Z bug.


----------



## happita (Feb 26, 2008)

This is weird; isn't the unspoken standard supposed to be at least a 15% improvement whenever they move to a new gen of cards? What's going on at NV?


----------



## JRMBelgium (Feb 26, 2008)

I get more with my system lol:
http://i240.photobucket.com/albums/ff312/JelleMees/sshot-21.jpg


----------



## JRMBelgium (Feb 26, 2008)

happita said:


> This is weird, isn't the unspoken standard supposed to be at least 15% improvement whenever they go into the new gen of cards? What's going on at NV?



Only 15%? The 8800GTS 640MB was as fast as the dual-GPU 7950GX2, so I think Nvidia gained more than 15% from GeForce 7 to 8.


----------



## crow1001 (Feb 26, 2008)

IMO, the 06 score and GPUZ shot are fake, for a start the 9800GTX GPU is on the G94 process not the G92 as shown by GPUZ.


----------



## JRMBelgium (Feb 26, 2008)

crow1001 said:


> IMO, the 06 score and GPUZ shot are fake, for a start the 9800GTX GPU is on the G94 process not the G92 as shown by GPUZ.



It has to be fake...no way that Nvidia will ruin their reputation by releasing such a crappy card...


----------



## brian.ca (Feb 26, 2008)

Kreij said:


> What's special about their version of GPU-Z ?
> Just wondering.



From the same site (http://en.expreview.com/2008/02/26/9800gtx-gpu-z-screen-shot/), "*The GPU-Z 0.1.7 didn’t work with 9800GTX. After W1zzard@TPU make some change on his GPU-Z, the tool can finally read all the data of the card.* Thanks, W1zzard."


----------



## [I.R.A]_FBi (Feb 26, 2008)

SLi is going the way of the dinos 

http://www.xbitlabs.com/news/video/...o_License_Intel_s_Next_Gen_Processor_Bus.html


----------



## brian.ca (Feb 26, 2008)

crow1001 said:


> IMO, the 06 score and GPUZ shot are fake, for a start the 9800GTX GPU is on the G94 process not the G92 as shown by GPUZ.



That's not the case; see the GeForce 9 series roadmap table from http://www.techpowerup.com/53358/GeForce_9_Series_Roadmap_Updated.html.  A higher number doesn't mean the better chip: it's the 9600 that uses the G94, the 9500 uses the G96, and the rest are all the G92 used within the 8x00 series cards released last fall/winter.

Trog above pretty much nailed what happened.. the G9x chip was Nvidia's next-gen chip. ATI had two good cards coming out within the 38x0s. As a pre-emptive strike, Nvidia rushed out some versions of the G92 cards as 8x00 series cards (remember how stock supply was such a problem for them? That was probably b/c of the rush job). The GT and GTS should have been 9800 cards, but for one reason or another (most likely they didn't want to introduce them as the new generation when they may not have been completely ready and the other variants were also not ready yet) they were sold as 8800s. It's less that Nv is trying to rebrand an old card and more that they improperly branded a new card. One way or another, though, don't expect too much of a gain on any of the 9x00 series cards over what we saw of the G92 chips among the 8x00 series, b/c the changes will mostly be tweaks.



> ATI pulled the same crap with their HD3xxx series...both companies are guilty of artificially increasing marketing numbers to make their graphics card look like it's a next-generation GPU. Thankfully us enthusiasts know them by their code names, and as long as you don't see a GT200 (formerly G100) or R7xx, we're still dealing with the "old" architecture.



While there's some truth to what you're saying, that's probably not completely fair... first of all, if you remember, ATI did originally plan on releasing those cards as 2x00 series cards. In the end, given all the changes (a die shrink, consequently lower power consumption, consequently fixing the considerable heat and noise issues of the original 2k series cards, and other changes that let them sell the cards considerably cheaper), and the fact that those changes addressed a lot of the major problems with the 2k series, it's hard to blame them for the name change (if I remember correctly, they stated that they specifically wanted to distance the new cards from the 2k series).


----------



## DrunkenMafia (Feb 26, 2008)

HAAHAAHAAAAHAAAA  

That just made me laugh...   I mean really really made me laugh.  They have a top end rig with this new 9800 card and only get 14k!!!!   

Why in gods name is the card so friggin huge if it can only grab 14k, and why is it called 9800!!!

I remember nvidia saying that the 9600GT was supposed to be twice the performance of the previous gen 8600GT and it pretty much is so what the hell happened here!!!!  Its barely 1.2 times the performance.

I will be a little surprised if there isn;t some issue with drivers or something with this card atm...  I was expecting 18k AT LEAST on this new nvidia Monster..  especially with that setup they are using @ expreview....


----------



## Black Panther (Feb 26, 2008)

malware said:


> After taking some screenshots with a special version of our GPU-Z utility, the guys over at Expreview have decided to take their GeForce 9800 GTX sample and give it a try at Futuremark 3DMark06. Using Intel Core 2 Extreme QX9650 @ 3GHz, 2GB of DDR2 memory, ASUS Maximus Formula X38 and a single GeForce 9800 GTX @ 675/1688/1100MHz the result is 14014 marks.



Being somewhat of an Nvidia fan-girl, I'm sad to see this. With my weak E4300 overclocked to 3GHz, 2GB RAM and an Asus P5B, I get nearly 13K 3DMarks with my 8800GT at the clocks listed in my system specs under my avatar.

There wouldn't be much difference between "nearly 13K" as in my case and "barely 14K" as is the case with the 9800GTX on a more powerful proc and prolly faster RAM. If _that's_ supposed to be the next-gen high end, Nvidia should definitely have put in much, much more of an effort. 

I'm definitely disappointed. The 'new' 9800GTX apparently is on the same shelf as the old 8800GTX?

As a call to Nvidia: please don't release new stuff unless it can be qualified as better than the stuff already on the market.

Drunkenmafia: I don't know what they mucked up with the 9600GT. All I can say is that the 8800GT is about three times better than the 8600GTS (I had and benchmarked both cards).

I'd have expected a 96XX series to be at least twice as powerful as the 86XX series. But then look at the history: is a 7600 twice as powerful as a 6600?

And what about the ATI counterparts?


----------



## farlex85 (Feb 26, 2008)

I must say this is disappointing. If those scores truly are to be believed, then there isn't much reason at all to buy a 9800GTX. I guess Nvidia is trying to throw out something new to avoid staying stagnant in the marketplace, but come on, most of the people who would spend money on a GTX are enthusiasts, and I don't really see any of them running out to get these with scores and prices like that.


----------



## brian.ca (Feb 26, 2008)

DrunkenMafia said:


> I remember Nvidia saying that the 9600GT was supposed to be twice the performance of the previous-gen 8600GT, and it pretty much is, so what the hell happened here!!!!  It's barely 1.2 times the performance.
> 
> I will be a little surprised if there isn't some issue with drivers or something with this card atm...  I was expecting 18k AT LEAST from this new Nvidia monster..  especially with the setup they are using @ expreview....



Don't be too surprised... 9600 GT vs 8600 GT = new chip vs. old chip.  9800 GT/X vs. 8800 GT / GTS = new chip vs. same chip a few months later (probably with some small tweaks and revisions).


----------



## phanbuey (Feb 26, 2008)

Black Panther said:


> Being somewhat of a Nvidia fan-girl I'm sad to see this. With my weak E4300 overclocked at 3Ghz 2GB RAM and Asus P5B I get nearly 13K 3Dmarks with my 8800GT at the clock listed in my system specs under my avatar.



Yeah, I get about 13-14K in 3DMark with my OC'd 8800GT as well... but hey, at least there is no need to upgrade, right?


----------



## SK-1 (Feb 26, 2008)

[I.R.A]_FBi said:


> SLi is going the way of the dinos
> 
> http://www.xbitlabs.com/news/video/...o_License_Intel_s_Next_Gen_Processor_Bus.html



Well, the dinosaurs were the most prolific species to ever dominate the Earth. LOL
Something like 165 million years on the planet is no small feat!
I personally like a single-card solution, so I do not care one way or the other.


----------



## pentastar111 (Feb 26, 2008)

newtekie1 said:


> 3 of them will destroy a 3870X2, but yeah, triple-SLI is pretty much the only thing this card has going for it.  Not to mention that there probably isn't a whole lot of headroom for overclocking with the 9800GTX.  Maybe 1200MHz RAM?  Or perhaps even the same RAM used on the 8800GTS 512 but with loosened timings?
> 
> If nVidia was going to do this, they should have made the 8800GTS512 the 9800GTS instead.  I don't know what nVidia is doing.
> 
> The funny thing is that with my 8800GTS 512 and my CPU only at 2.7GHz I just scored 14016 in 3DMark06...


 Yea, BUT will 3 of them destroy two 3870X2's...hmmm....


----------



## OnBoard (Feb 26, 2008)

This is the best news ever: if the 9800GTX is so "weak", the 9800GT must be weaker, and my under-a-week-old 8800GT isn't obsolete already =) 11k stock in Vista x64; have to let that sink in before I start OCing.

The 8800GT and 8800GTS should end up very short-lived, kinda like the X1900 vs X1950.


----------



## hv43082 (Feb 26, 2008)

Guess I will not need to update my 8800GTX.  I score roughly the same with my E6400 @ 3.2GHz.  Then again, who cares about benchmarks.  How does it perform in games at 2560x1600???


----------



## fanik (Feb 26, 2008)

*8)*

lol? http://service.futuremark.com/compare?3dm06=5219305


----------



## Black Panther (Feb 26, 2008)

phanbuey said:


> Yeah, I get about 13-14K in 3DMark with my OC'd 8800GT as well... but hey, at least there's no need to upgrade, right?



Sure, we can feel smug about that. 

(Though I have to admit that I had been planning that by the end of this year I'd hand down my pc to my daughter and hence have a nice good excuse to get a quad core with a 64 bit OS, 4GB RAM, a solid state HDD if they get cheaper and a  ... 9800GTX ...)
Dream got shot now.


----------



## Tatty_One (Feb 26, 2008)

DrunkenMafia said:


> HAAHAAHAAAAHAAAA
> 
> That just made me laugh...   I mean really really made me laugh.  They have a top end rig with this new 9800 card and only get 14k!!!!
> 
> ...



Agreed, ffs I get 17,211 on a GTS. If this thing doesn't overclock as well as the GTS then there will be so little difference (same SPs etc); certainly at GTS overclocked speeds it will be a total waste of time.....and money!


----------



## Tatty_One (Feb 26, 2008)

brian.ca said:


> That's not the case, see
> 
> 
> 
> ...



A lot of sensible stuff there, pretty much close to the truth I would guess. I just can't understand (apart from the heat and power consumption issues) why ATi didn't wait a short while to bring out the HD3870, at least to make it more competitive; I mean, in one or two benches it still gets beat by the 2900XT. If they had introduced the 3850 when they did.....fine, that's the star card IMO and it filled a niche that NVidia couldn't compete with. Had they then given themselves an extra bit of time to improve the 3870 even further, I think they would have had greater success against the likes of the 8800GT and GTS.


----------



## warhammer (Feb 27, 2008)

Well that sux for a GTX, but looking at the info on the JPEG it just looks sus for a CPU @ 3GHz. I could be wrong, though.


----------



## indybird (Feb 27, 2008)

There is something definitely wrong here.  I'm going to start off with that.

_If_ this is the actual card then there are serious driver issues.  I could also believe that this isn't the card and what expreview got a hold of was perhaps the 9800GT...  Especially because of those numbers: they are nearly the same as the 8800GTS 512MB's.

If this is the card, and these are its correct specs and numbers, then *shame on you, nvidia.*  That is pitiful.  I understand this is simply a revision, but 14000 is barely an improvement over an 8800 Ultra/GTX (which this card is meant to replace).  And 14000 is definitely no reason to get this over an 8800GTS 512MB, which is only about $280 by now.

I would laugh so hard if:
A) This is all BS or inaccurate and you all are giving nvidia a hard time over nothing
or 
B) This card overclocks like a beast

If this card really is this bad then I'll prolly find myself continuing my tradition of wanting an nvidia card but then buying an ATI card in the end for whatever reason.  If I don't go with an ATI card then I'll probably get a 9800GX2, just as long as I can get one for well under the MSRP ($600).

Let's just keep our fingers crossed...

-Indybird


----------



## [I.R.A]_FBi (Feb 27, 2008)

indybird said:


> There is something definitely wrong here.  I'm going to start off with that.
> 
> _If_ this is the actual card then there are serious driver issues.  I could also believe that this isn't the card and what expreview got a hold of was perhaps the 9800GT...  Especially because of those numbers: they are nearly the same as the 8800GTS 512MB's.
> 
> ...



either way .. why get worked up ...


----------



## DrunkenMafia (Feb 27, 2008)

I wonder if they really are out of ideas.  I mean, when you think of it, neither ATI nor Nvidia have released a card that can break 15k in 06, apart from the 3870X2, but that is a twin-core card so we will leave that out of the equation.  And I don't mean OCing either, I mean straight out of the box, single-core gfx card.....

Maybe both companies really haven't got anything faster atm.  

BUT...  At least ATI are not releasing newer model cards with more or less the same performance; that is just ridiculous...  

If I went out and bought an HD3870 and it performed only 5% better than an X1950 I would be pissed....  I think the same thing goes for nvidia..


----------



## The Nemesis (Feb 27, 2008)

Though the 3DMark score is not overly impressive, it won't be, considering the operating system was Vista. Everyone saying they can hit over 14,000 with a GT or GTS: how many have done so easily using Vista? I could, using a 640MB GTS and a quad core clocked @ 3.6GHz with the GPU @ 700MHz.  The 9800GTX score was done @ stock. It will be nice to see what it will do overclocked on XP with a quad core @ 4GHz.


----------



## phanbuey (Feb 27, 2008)

The Nemesis said:


> Though the 3DMark score is not overly impressive, it won't be, considering the operating system was Vista. Everyone saying they can hit over 14,000 with a GT or GTS: how many have done so easily using Vista? I could, using a 640MB GTS and a quad core clocked @ 3.6GHz with the GPU @ 700MHz.  The 9800GTX score was done @ stock. It will be nice to see what it will do overclocked on XP with a quad core @ 4GHz.



probably not much more than an overclocked 8800GTS 512 with a quad core @ 4GHz in XP...  now compare that difference to the 8800GTX vs 7900GTX, or 7900GTX vs 6800 Ultra, or 6800 Ultra vs the 5***FX...


----------



## DarkMatter (Feb 27, 2008)

When Tri-SLI was unveiled, I remember many people around the net (many here on TPU) were crying about the 8800 GT/GTS not being able to do it. Now Nvidia is going to release those same cards with Tri-SLI capability, plus some minor enhancements (around 7% faster clock for clock on 3DMark, ~13000 vs ~14000 *), and people are crying again. 
Many people here say that they do 14000+, but either they have the CPU at 3.6GHz+ or the card at 750MHz+. Of course they will be faster that way!!

That being said, Nvidia screwed up with the naming again **. But remember this is not GT200; that one will come Q3 this year IIRC. These cards are no more than a Tri-SLI capable refresh until GT200 comes out. I don't think anybody bought an X1950 if they already owned an X1900, right? Same deal here.

*Indeed, if you take the CPU score out of the equation:

13000 - 4600 = 8400
14000 - 4600 = 9400

9400 / 8400 ≈ 1.12 

So basically, we could say that the 9800GTX is about 12% faster than the 8800GTS, and that's without launch drivers. Granted, it's not 2x as powerful as the 8800, but I think that's not bad for a refresh. And like many refreshes in history, it's not meant to replace your beloved 8800 in your rig, but in the market. You don't have to buy it if you don't want to, do you?
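The back-of-the-envelope method above (subtract an assumed flat CPU contribution from both totals, then compare what's left) can be sketched in a few lines. This is only a rough sketch of the poster's reasoning: the real 3DMark06 total is a weighted formula, not a simple sum, the ~4600-mark CPU share is an estimate from the thread, and the function name is made up for illustration.

```python
# Sketch of the "take the CPU score out of the equation" comparison.
# Assumption (the poster's simplification, not 3DMark06's real formula):
# the CPU contributes a flat ~4600 marks that can simply be subtracted.

def gpu_only_gain(old_total: int, new_total: int, cpu_part: int = 4600) -> float:
    """Fractional GPU-side improvement once the assumed CPU share is removed."""
    old_gpu = old_total - cpu_part  # ~8800GTS 512 share of the total
    new_gpu = new_total - cpu_part  # ~9800GTX share of the total
    return new_gpu / old_gpu - 1.0

gain = gpu_only_gain(13000, 14000)
print(f"{gain:.1%}")  # 11.9% -- about 12%
```

Stripping the shared CPU contribution widens the apparent gap to about 12%, versus the ~7.7% you get by comparing the 14000 and 13000 totals directly, which is the whole point of the comparison.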

** Anyway, about the naming: it's possible that they didn't have any choice but to change it. I don't know, but I know many non-techie guys who think that ATI is a step ahead because they released the HD3000 series. And since the 8800 was in competition with the HD2000 series, they truly believe the HD3000 series is almost twice as fast as the HD2000, following the tradition. Until I corrected them, of course. Nvidia is just playing the same game. A game that nobody wants, but...


----------



## Wile E (Feb 27, 2008)

The Nemesis said:


> Though the 3DMark score is not overly impressive, it won't be, considering the operating system was Vista. Everyone saying they can hit over 14,000 with a GT or GTS: how many have done so easily using Vista? I could, using a 640MB GTS and a quad core clocked @ 3.6GHz with the GPU @ 700MHz.  The 9800GTX score was done @ stock. It will be nice to see what it will do overclocked on XP with a quad core @ 4GHz.


I'm willing to bet my OC'd GT easily exceeds 14K in Vista.


----------



## HaZe303 (Feb 27, 2008)

With a slight OC on my GTX (8800) I score somewhere around 14000, so I definitely won't buy a 9800GTX if these scores hold for the final release cards. The only thing I'm looking forward to right now is AMD's 48x0 series. Booooo for Nvidia, trying to get away with these cheap tricks, trying to sell us 8-series cards in a new package and name.


----------



## warhammer (Feb 27, 2008)

The Nemesis said:


> Though the 3DMark score is not overly impressive, it won't be, considering the operating system was Vista. Everyone saying they can hit over 14,000 with a GT or GTS: how many have done so easily using Vista? I could, using a 640MB GTS and a quad core clocked @ 3.6GHz with the GPU @ 700MHz.  The 9800GTX score was done @ stock. It will be nice to see what it will do overclocked on XP with a quad core @ 4GHz.




I have done it in Vista with the GTX and GTS video cards, CPU @ 3.6GHz. The article shows their CPU @ 3GHz; that doesn't look right, look at their CPU score of 4597.


----------



## Wile E (Feb 27, 2008)

warhammer said:


> I have done it in Vista with the GTX and GTS video cards, CPU @ 3.6GHz. The article shows their CPU @ 3GHz; that doesn't look right, look at their CPU score of 4597.


A quad at 3.6GHz in XP does about 5500; mine @ 3.87 does about 6100. So a score of 4600 @ 3GHz sounds about right for Vista.
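The sanity check above can be worked through numerically. A rough sketch only: it assumes the 3DMark06 CPU score scales linearly with clock speed, which is only approximately true, and the function name is made up for illustration.

```python
# Assumption: 3DMark06 CPU score scales roughly linearly with core clock.

def scaled_cpu_score(known_score: float, known_ghz: float, target_ghz: float) -> float:
    """Estimate the CPU score at target_ghz from a known score/clock pair."""
    return known_score * (target_ghz / known_ghz)

# A quad scoring ~5500 at 3.6GHz would land near this at 3.0GHz:
estimate = scaled_cpu_score(5500, 3.6, 3.0)
print(round(estimate))  # 4583 -- close to the 4597 reported for the QX9650 @ 3GHz
```

That lands within a hair of the 4597 CPU marks shown in the screenshot, so the 3GHz figure is plausible.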


----------



## strick94u (Feb 27, 2008)

Jelle Mees said:


> This just can't be true.
> 
> Nvidia should name it 8900GTX if this is actually true...



See, I agree: they should have saved 9800 for later. I mean, 8900 would follow, like the 7900 did.


----------



## yogurt_21 (Feb 27, 2008)

strick94u said:


> See, I agree: they should have saved 9800 for later. I mean, 8900 would follow, like the 7900 did.



nvidia's attempting to assassinate the 9800 name, which has loomed over them for years as their worst thrashing ever.


----------



## phanbuey (Feb 27, 2008)

This is the graphics card equivalent of Vista.


----------



## Soulja (Feb 27, 2008)

I can't believe all the fuss about this particular piece of hardware.  Guys, remember it's just an inanimate object that will be useless in a couple of years


----------



## Wile E (Feb 27, 2008)

Soulja said:


> I can't believe all the fuss about this particular piece of hardware.  Guys, remember it's just an inanimate object that will be useless in a couple of years



Yeah, but some of us live, breathe, eat and sleep benchmarking. lol


----------



## yogurt_21 (Feb 27, 2008)

Soulja said:


> I can't believe all the fuss about this particular piece of hardware.  Guys, remember it's just an inanimate object that will be useless in a couple of years



Couple of years? It's useless now, lol. The G92 GTS is a few MHz away from beating its performance at stock, and to top it all off, the GTS is cheaper to boot and currently descending in price.


----------



## DarkMatter (Feb 27, 2008)

Hmm, I just thought of this: what if these cards have been optimized to run Ageia physics? With some sort of extra instructions and cache logic, and such. I mean, what if they are 8800s with better physics support and Tri-SLI capability?

Forget about the naming, they are just refreshes. Nvidia had to increase the number because ATI did it first, and because they are doing it again with the HD4000, which is going to be the same architecture with 480 SPs instead of 320 and faster clocks. And according to leaked info, 50% faster than the HD3000. Not a great increase either, methinks.

EDIT: With that I don't mean to put down ATI. I have nothing against adding pipelines and clocks to the same architecture and calling it "next gen". In quotes because it's just an increment in the number. Neither ATI nor Nvidia ever promised that "next gen" cards would be twice as fast as the "old gen"; it just happened that they followed that trend for some time lately. They don't "have" to deliver that kind of improvement, and if they don't deliver, it's not because they want to fool consumers. Anyway, anyone spending $300+ on a card without looking at benchmarks or knowing exactly what they're buying deserves to be fooled by the naming scheme IMO.


----------



## [I.R.A]_FBi (Feb 27, 2008)

DarkMatter said:


> Hmm, I just thought of this: what if these cards have been optimized to run Ageia physics? With some sort of extra instructions and cache logic, and such. I mean, what if they are 8800s with better physics support and Tri-SLI capability?
> 
> Forget about the naming, they are just refreshes. Nvidia had to increase the number because ATI did it first, and because they are doing it again with the HD4000, which is going to be the same architecture with 480 SPs instead of 320 and faster clocks. And according to leaked info, 50% faster than the HD3000. Not a great increase either, methinks.



PhysX is going to be put on the 8 series, so that's a non-factor

re 50%: a 50% gain from 3XXX to 4XXX is greater than the ~0% gain from the 8 to the 9 series ... the only product that seems to have an increase is the 8600 to 9600 ...


----------



## DarkMatter (Feb 27, 2008)

[I.R.A]_FBi said:


> PhysX is going to be put on the 8 series, so that's a non-factor
> 
> re 50%: a 50% gain from 3XXX to 4XXX is greater than the ~0% gain from the 8 to the 9 series ... the only product that seems to have an increase is the 8600 to 9600 ...



As I said above, if we do a clock-for-clock comparison the 9800GTX is about 12% faster than the 8800GTS 512. IMO either you believe in 3DMark's ability to compare graphics cards or you don't. If you do, the ~12% increase in SM2+SM3 is there. If you don't, you don't have to worry about this thread at all; just wait until gaming benchmarks come.

PhysX will run on the 8 series as an implementation in CUDA, but what I was saying is some kind of hardware support (which is not the same as a full hardware implementation, I'm not saying that). Think of something like the different SSE versions on CPUs. You won't notice the difference on the surface; the chip is the same in transistor count, power consumption, pipelining, etc. Even the ALUs are the same. Nor will you see any difference in old programs that don't use the new instructions, but new programs can greatly benefit.


----------



## InnocentCriminal (Feb 27, 2008)

Very good point, DarkMatter. However, wouldn't it have been too late for nVIDIA to implement that sort of support so soon after acquiring Ageia?


----------



## DarkMatter (Feb 27, 2008)

InnocentCriminal said:


> Very good point DarkMatter. However, wouldn't it have been too late for nVIDIA to implement that sort of support so soon after acquiring Ageia?



Maybe. But take into account that Nvidia could have been planning the purchase long before it was made public. And take into account that they were trying to do GPU physics along with Havok in the past. Definitely, what was good for Havok is good for Ageia.


----------



## trog100 (Feb 27, 2008)

benchmark freaks have to realize.. the future means single gpu is mid range.. mid range buyers are the winners here.. decent single (what used to be high end) gpu cards that will play games nicely at affordable prices.. 

the benchmark freaks.. (a genuine minority) are gonna have to dig deep in their pockets for multiple gpus and multiple cards.. 

the days of a new power guzzling super chip every so often are gone.. times have changed..

trog


----------



## Soulja (Feb 27, 2008)

Wile E said:


> Yeah, but some of us live, breathe, eat and sleep benchmarking. lol



Oh, I forgot I'm on a tech forum.  Well, I'm still pleased with my X1900XT and I don't see myself changing it anytime soon, so I presume I'm not that demanding when it comes to graphics cards. I just upgrade when I feel the need, not when a company releases something new that has a higher PR number and tells you it's "so much faster" than your card


----------



## DarkMatter (Feb 27, 2008)

trog100 said:


> benchmark freaks have to realize.. the future means single gpu is mid range.. mid range buyers are the winners here.. decent single (what used to be high end) gpu cards that will play games nicely at affordable prices..
> 
> the benchmark freaks.. (a genuine minority) are gonna have to dig deep in their pockets for multiple gpus and multiple cards..
> 
> ...



Couldn't have said it better.


----------

