Thursday, March 13th 2008

NVIDIA GeForce 9800 GTX Priced

According to NordicHardware, several leaked slides have revealed the price of NVIDIA's upcoming GeForce 9800 GTX card to be $349 (one even said $299-$349, but the higher value seems much more likely). The card should go on sale on March 25th, and the reference specifications have a G92-420 core running at 675MHz with 512MB of GDDR3 memory at 2000MHz.
Source: NordicHardware

47 Comments on NVIDIA GeForce 9800 GTX Priced

#26
JJ_Sky5000
ATI is leading in benchmarks on paper. They suck in real gaming. I keep laughing every time I see somebody saying they are outperforming NVIDIA. With the 3870X2 you get a 20% gain over a single 8800 GTS, lol, weak. Oh, you do get DirectX 10.1 for buggy Vista, maybe in 3 years you can use it... Which do you want, 3870X2 CrossFire or 3x 9800GTX for the next games that are coming? NVIDIA doesn't need to make a new card. ATI wants to sell to the middle market, not high end anyway. Ask them, they will tell you that..
Posted on Reply
#27
eidairaman1
The Exiled Airman
You keep on dreaming, kid; that was during the 2900XT series, btw. The 3870X2 is actually taking on the 8800 Ultra and then some, and the card's price is in a better spot than the Ultra's. NVIDIA dropped the price on the GTX to try to steal the X2's thunder, but for the price you get a lot of card out of the X2, although driver support seems to be subpar on both sides.
JJ_Sky5000ATI is leading in benchmarks on paper. They suck in real gaming. I keep laughing every time I see somebody saying they are outperforming NVIDIA. With the 3870X2 you get a 20% gain over a single 8800 GTS, lol, weak. Oh, you do get DirectX 10.1 for buggy Vista, maybe in 3 years you can use it... Which do you want, 3870X2 CrossFire or 3x 9800GTX for the next games that are coming? NVIDIA doesn't need to make a new card. ATI wants to sell to the middle market, not high end anyway. Ask them, they will tell you that..
Posted on Reply
#28
JJ_Sky5000
eidairaman1You keep on dreaming, kid; that was during the 2900XT series, btw. The 3870X2 is actually taking on the 8800 Ultra and then some, and the card's price is in a better spot than the Ultra's. NVIDIA dropped the price on the GTX to try to steal the X2's thunder, but for the price you get a lot of card out of the X2, although driver support seems to be subpar on both sides.
You don't have a clue. Do you own either one of these cards? I have. Maybe some hands-on will do you some good. You keep living in your ATI fantasy world, the buggy DirectX 10.1 one with no drivers.
Posted on Reply
#30
candle_86
DanishDevilI personally think the major difference between ATi and nVidia right now is their marketing strategy.

For the first time, people can buy ATi's best single GPU for under $200. IMO, that's awesome. nVidia hasn't changed their marketing strategy, but ever since AMD got a hold of ATi, they seem to be appealing to the crowd that doesn't have $2000 to blow on updated PC components every 3 months. And that's the crowd that I'm a part of.
Let me think: the G92 costs $119 after MIR, get the 8800GS. That's G92, or D9E.
Posted on Reply
#31
eidairaman1
The Exiled Airman
That's the problem with fanboys, always trying to downplay new technology. Beyond that, MS is working on the DX10.1 update anyhow, so it's practically inevitable now.

Explain why the 9800GTX is the same size as the 8800 Ultra. I thought video cards were supposed to be upgrades, not downgrades (remember ISA/EISA boards being full length).

At least ATi is finally making a stand and not sitting still.
Posted on Reply
#32
erocker
*
Blah blah, enough of the "fanboy" talk please, it gets nobody anywhere.:shadedshu The cards are what they are; let the reviews and benchmarks settle it.
Posted on Reply
#33
Darkmag
newtekie1ATi has yet to release a single core that can outperform nVidia. They haven't had a GPU that can outperform nVidia in well over a year, almost a year and a half. Their only chance at outperforming nVidia was to make a card with two of their highest end GPUs on it, and even then it is only roughly 5% better than nVidia's high end single GPU offering. Even still, it is only 8% faster overall than the 8800GTS 512MB, but costs about $180 more.
WOW dude, your argument is just about 2 GPUs < 1 GPU. So let me break this to you: it's irrelevant whether or not your GFX has 1/10/100/1000000 cores on it. It's irrelevant whether or not 1 core can beat 2/10/100/1000000 cores. What matters is that I saved $200 on a card with a million cores that's 5% slower than one with 1 core on it.

Everyone that says 1 GPU > 2 GPUs is a narrow-minded Neanderthal that should be expunged from the gene pool for being an ignorant simpleton.

Secondly, the 3870X2 is 35+% faster than the Ultra at 2560x1600, but I guess that's irrelevant because everyone with a high end GFX card plays at low end resolutions with 4xAA/16xAF. (The downside is that performance isn't consistent, and on every PC it performs differently.)

OK, now we've reached the end of my rant.
--------------------------------------------------------------------------------------

I do believe that the 9800GX2 will be faster than the HD3870X2.
www.pconline.com.cn/diy/graphics/reviews/0803/1241871.html
Some game benchies, I just thought I would add that.

I just wanna say that the 9800GTX might not end up being as crappy as everyone thinks. Sure, its 3DMark score is lackluster, but don't forget that the R600 had a higher 3DMark score than the 8800GTX, and we all know that in games (where it counts) it was a different story.

I don't think that it will have a better price/performance ratio than the 8800GT; however, I don't think you'll be an idiot if you do buy a 9800GTX (if you have an old GFX), since it would still have a rather good $/FPS ratio.

It all depends on what ATI's answer is, and this post is seriously getting f****ing long. But thanks to ATI/AMD, GFX prices have dropped considerably.
Posted on Reply
#34
3870x2
People are complaining about each side only making small jumps. Technology takes time to progress, yet the market right now is lightning fast, with a whole new series coming every 6 to 8 months. I don't expect a whole lot these days beyond the last card that came out; instead I recognize that each side is making its own valiant battle to stay in the market. GO ATI! But I just ordered an 8800GTS G92. EVGA Step-Up pwnz, in a month I will be rocking a GX2 for only $150 or so more.
Posted on Reply
#35
newtekie1
Semi-Retired Folder
DanishDevilI personally think the major difference between ATi and nVidia right now is their marketing strategy.

For the first time, people can buy ATi's best single GPU for under $200. IMO, that's awesome. nVidia hasn't changed their marketing strategy, but ever since AMD got a hold of ATi, they seem to be appealing to the crowd that doesn't have $2000 to blow on updated PC components every 3 months. And that's the crowd that I'm a part of.
I have to disagree with you. Yes, you can buy ATi's highest end single GPU card for under $200. However, that is simply because ATi can't sell their highest end single GPU card for any more than that; if they did, it wouldn't sell. Yes, nVidia is releasing very high end cards and charging a premium for them, but ATi would be doing the exact same thing IF they could; their highest margin cards are the high end cards. They make the most profit on the super high end cards. Just because you can buy ATi's best single GPU card available for under $200 doesn't mean anything. You can buy cards from nVidia that are equal, or even better, in performance for under $200. ATi's marketing isn't any better, they just can't manage to make a single GPU card that can outperform nVidia's, so they have to sell them cheap. The problem is nVidia has cards that are just as cheap and better.
flashstarThe R600 core is much more logically designed than the G92 or G80. Now, ATI just has to add more brute force and they will have a winner. It appears that the R700 will do just that. I doubt that Nvidia can achieve a 60% performance boost even on their G100 like ATI is supposedly going to with their R7xx really soon.
The G92 and G80 are wonderfully designed cores; they seem to be doing the job against R600 just fine. The problem with ATi's design is that it is incredibly hard to add the brute force that they need so dearly. The choice by nVidia to unlock the shader clock speed from the core clock speed was probably the best move possible: instead of just increasing the shader count to boost shader performance, they raised their shaders' clock speed. The problem ATi has is that it is hard to get so many shaders stable at any kind of high clock speed, which is why we have nVidia cards with less than half the shaders of ATi cards destroying ATi cards in performance. That is why nVidia has a card with 64 shaders outperforming ATi's cards with 320.

I don't see how anyone can say that ATi's design is more logical than nVidia's when nVidia is releasing mid-range cards with 64 shaders that are beating ATi's best offerings with 320.

And that takes me back to the "marketing" issue. Yes, the 3870 can be had for under $200. That is all good, but the 9600GT is overall 4% faster (it is 3% faster than even the 2900XT) AND it can be had for ~$25 less than the 3870. Personally, I think nVidia releasing a cheaper mid-range card that outperforms your competitor's best offering is damn good marketing.
DarkmagWOW dude, your argument is just about 2 GPUs < 1 GPU. So let me break this to you: it's irrelevant whether or not your GFX has 1/10/100/1000000 cores on it. It's irrelevant whether or not 1 core can beat 2/10/100/1000000 cores. What matters is that I saved $200 on a card with a million cores that's 5% slower than one with 1 core on it.

Everyone that says 1 GPU > 2 GPUs is a narrow-minded Neanderthal that should be expunged from the gene pool for being an ignorant simpleton.

Secondly, the 3870X2 is 35+% faster than the Ultra at 2560x1600, but I guess that's irrelevant because everyone with a high end GFX card plays at low end resolutions with 4xAA/16xAF. (The downside is that performance isn't consistent, and on every PC it performs differently.)
I never said 1 GPU is greater than 2. However, I was comparing GPU to GPU, apples to apples if I need to break it down into simpler terms for you. nVidia's dual GPU card is on the way, and when that comes we can compare 2 GPUs to 2 GPUs. My argument was about who is technologically ahead, and part of that argument is comparing apples to apples and seeing whose technology is in the lead. Yes, performance-wise ATi is in the lead, but my argument wasn't about performance leads, it was about technology. My point was that if ATi can't produce a GPU that can outperform nVidia's GPU, then how are they in the lead technologically? Because, as we already see, if you take a weaker GPU and stick two of them together on a single card, then the other side is going to do exactly the same thing with their stronger GPUs and end up with a stronger card. To get a fair gauge on who is in the lead technology-wise you need to look at apples to apples, or in this case GPU to GPU, not GPU to multiple GPUs.

Now to correct some of your misinformation about the 3870x2:

Personally, I don't even know anyone that owns a monitor that runs at that resolution, and if I did, I doubt they would be using a single 3870X2 to power it, because they would probably be rich enough to have more than that. You can make any card look good if you just look at one single aspect, but I prefer to look at overall performance, at multiple resolutions, on multiple different benchmarks. Let's take a look, shall we:


Some interesting things there. When you look at overall performance, a very different story is told compared to your single narrow bit of information. You see a story where the 3870x2 is only 5% faster than the 8800GTX, and only 8% faster than the 8800GTS 512MB. Not exactly worth the $180+ price premium IMO.
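For what it's worth, those relative figures can be turned into a quick price/performance check. A minimal sketch below: the street prices ($269 for the GTS 512MB, $449 for the cards $180 above it) are illustrative assumptions for the math, not figures from the thread.

```python
# Quick price/performance check using the relative numbers quoted above.
# Street prices are illustrative assumptions (GTS 512MB price + ~$180 premium).
cards = {
    "8800GTS 512MB": {"price": 269.0, "perf": 1.00},  # baseline
    "8800GTX":       {"price": 449.0, "perf": 1.03},  # 3870X2 is ~5% ahead of it
    "3870X2":        {"price": 449.0, "perf": 1.08},  # ~8% over the GTS 512MB
}

for name, card in cards.items():
    # Performance points per $1000, scaled for readability
    ratio = card["perf"] / card["price"] * 1000
    print(f"{name:14s} {ratio:.2f} perf per $1000")
```

Under those assumed prices, the GTS 512MB comes out well ahead on performance per dollar, which is the point being argued here.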
Posted on Reply
#36
Darkmag
newtekie1I don't see how anyone can say that ATi's design is more logical than nVidia's when nVidia is releasing mid-range cards with 64 shaders that are beating ATi's best offerings with 320.
The shaders work completely differently; what you're doing now is comparing apples to pears.
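For context on why the raw counts mislead: nVidia's scalar shaders run on a separate, much higher shader clock, while ATi's "320 shaders" are 64 five-wide VLIW units at core clock. A rough back-of-the-envelope sketch; the clocks and FLOPs-per-clock figures below are approximate reference specs used only to illustrate the point, not numbers from this thread.

```python
# Theoretical shader throughput in GFLOPS:
#   units * clock (GHz) * FLOPs per unit per clock.
# Figures are approximate reference specs, for illustration only.
def gflops(units, clock_ghz, flops_per_clock):
    return units * clock_ghz * flops_per_clock

# 9600GT: 64 scalar shaders at a ~1.625 GHz shader clock (MADD + MUL = 3 FLOPs)
geforce_9600gt = gflops(64, 1.625, 3)

# HD3870: 320 VLIW lanes at the ~0.775 GHz core clock (MADD = 2 FLOPs)
radeon_hd3870 = gflops(320, 0.775, 2)

print(f"9600GT: ~{geforce_9600gt:.0f} GFLOPS")
print(f"HD3870: ~{radeon_hd3870:.0f} GFLOPS")
```

The theoretical gap is far smaller than "64 vs 320" suggests, and on paper ATi still leads, which is exactly why real-world utilization of those five-wide units, not the raw shader count, decides game performance.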
Posted on Reply
#37
Rey17
well i wont bet my money on getting a 9XXX series because well like



Can the TechPowerUp forums please delete this post............ it's an accident!!!

my real post is the other one.........post 39 !!
Posted on Reply
#38
Rey17
Well, I won't bet all my money on the 9xxx series and say wooooow 9XXX series... how cool..??

I still think that the 8800 Ultra can still kick ass when OCed... no really... it might have some good shaders and nice MHz, but overall, if you look at the whole picture, it's nothing compared to the BFG 8800... lol

And just like flashstar said, it's just a bunny hop, not a building hop... but it could be OK for a person who does not like to OC his system, because it is not that bad, and with triple SLI as an option, that's pretty good... but still I stay with the 8800 series... they are just too gooooooood... I know a lot of people agree with me!!

But if you want to buy a 9xxx series card, then feel free to do so.

Well, this is just my opinion!! So don't take it seriously lol.....
Posted on Reply
#39
newtekie1
Semi-Retired Folder
DarkmagThe shaders work completely differently; what you're doing now is comparing apples to pears.
I know their shaders work completely differently; however, they achieve the same goal, which is why they are compared and why it can be considered a comparison between apples and apples. All the arguing in the world isn't going to change the fact that at the end of the day nVidia achieves better performance out of their cores than ATi does. It doesn't matter how either works; what matters is the performance that either gives. So I still don't see how the argument that ATi's core is more logical is valid; there is nothing to support that.
Rey17Well, I won't bet all my money on the 9xxx series and say wooooow 9XXX series... how cool..??

I still think that the 8800 Ultra can still kick ass when OCed... no really... it might have some good shaders and nice MHz, but overall, if you look at the whole picture, it's nothing compared to the BFG 8800... lol

And just like flashstar said, it's just a bunny hop, not a building hop... but it could be OK for a person who does not like to OC his system, because it is not that bad, and with triple SLI as an option, that's pretty good... but still I stay with the 8800 series... they are just too gooooooood... I know a lot of people agree with me!!

But if you want to buy a 9xxx series card, then feel free to do so.

Well, this is just my opinion!! So don't take it seriously lol.....
I don't know, the 8800 Ultra wasn't exactly a good overclocker. I do believe it can hang with the 9800GTX performance-wise, though, and I don't think anyone with an 8800GTX/Ultra should even consider replacing it with a 9800GTX. However, for some that have weaker cards and are looking for an upgrade, I do believe the 9800GTX is a decent option. I am one of the group that never buys cards that cost more than $300, so it wouldn't fit for me, but I know a lot of people that would consider it.
Posted on Reply
#40
DarkMatter
^^ +1. Both arguments. Hmm... +2 then. :D
Posted on Reply
#41
eidairaman1
The Exiled Airman
If I'm not mistaken, nVidia is also using higher transistor density.
newtekie1I have to disagree with you. Yes, you can buy ATi's highest end single GPU card for under $200. However, that is simply because ATi can't sell it for any more than that. ...
Posted on Reply
#42
newconroer
crow1001Doh, have you not seen the specs and '06 scores? Revised core or not, it's on par with an overclocked GTS. It may have more room for overclocking, but this release sucks big time. I would probably take the age-old GTX over this, with its bigger interface and more memory. A real embarrassment.
That's what the naysayers said about the new GTS, and look at how it cripples the old one; even an old 640 OCed to 620/2120 doesn't stand up to it (though I do say, the 640 was no laughing matter, even at stock).

The GTS and GTX survived as supreme leaders for almost a full year. That's quite an accomplishment. It also brought a new and noticeably evolved, almost revolutionary, architecture to the market. It seems people have been assuming that this is what Nvidia is going to do upon each new tech release, and I'm not sure why. It's not a trend of the past; such breakthroughs were not on a timed schedule.

Maybe they do have something revolutionary, and just do not wish to share it yet.

If the cards they sell play the games that exist, then who cares?
Posted on Reply
#43
DarkMatter
newconroerThat's what the naysayers said about the new GTS, and look at how it cripples the old one... If the cards they sell play the games that exist, then who cares?
Benchmark freaks care, but let them cry. :D
Posted on Reply
#44
newconroer
Yes, I was trying to avoid mentioning .. them...

And yes, let them cry. The day benchmarkers control the direction of the GPU market is the day I stop giving any concern to the development of new architecture.

Hobbyists are not economists and businessmen for a reason.
Posted on Reply
#45
phanbuey
Rey17Well, I won't bet all my money on the 9xxx series and say wooooow 9XXX series... how cool..??
.
My money says they call it the 9XXX series because they're gonna run out (retire) the entire GeForce brand name altogether. No hardware company will name their product the 10800GTX or the 10950GTX... It's always stuff like E6600, AMD 3000+, Voodoo 5500, X1900XTX, ATI 9550... They probably want to replace the GeForce brand with another brand name, and that brand name's product will be the real deal, whereas this is just a die shrink (Penryn, anyone?).

Nvidia's marketing peeps know what products are in the pipeline, and they choose current naming with some strategy in mind. They probably saw an opportunity to run out GeForce's naming scheme so that the next architecture/generation of GPUs is differentiated, and it will probably blow away anything 'GeForce'.

Intel will do the same thing after the Core 2 Q9xxx/E8xxx series; I highly doubt Nehalem will have a name like Core 3 N9XXX or N6XXX.
Posted on Reply
#46
Rey17
Are you sure that the GeForce range is gonna die out??

Well, I have to say that the name GeForce is good, I like it!!

But if they are making a better and newer version of their graphics cards, they had better be REALLY GOOD!!!! in performance, speed, quality and PRICES!!!....... I hope they call it something good and catchy like the Xtreme series, or the UltiMa8 series, or the whatever series.......
Posted on Reply
#47
phanbuey
Rey17Are you sure that the GeForce range is gonna die out?... I hope they call it something good and catchy like the Xtreme series, or the UltiMa8 series, or the whatever series.......
:roll: You should send that to them. I'm not sure that they're gonna retire the GEFORCE name, although that might happen. But I am 99% sure that they're gonna retire the model numbering scheme like the 6xxx, 7xxx, 8xxx... And they want the next product to have a unique name, as it's going to be a new "series" - oooOOOoo.
Posted on Reply