Thursday, February 14th 2008

No GeForce 9800 GX2 at CeBIT 2008

Some last-minute news from VR-Zone: according to their information, NVIDIA has canceled the GeForce 9800 GX2 introduction at CeBIT 2008. All NVIDIA partners have been ordered not to show the new card during the German computer show. The official launch date is set for March 11th, and partners will receive the reference cards about a week before that date, meaning working samples will already exist during CeBIT, but none of them will be shown off.
Source: VR-Zone

42 Comments on No GeForce 9800 GX2 at CeBIT 2008

#26
yogurt_21
Actually, if the 9800 GX2 dies it'll more likely be because the next GeForce is due out and will probably beat its performance and cost less to produce. Many cards have died this way. It's always about the good ol' dollar. If NVIDIA sees that current sales haven't been hit by the 3870 X2, then there's no reason to launch a counterpart; they can instead focus on the next series release and the recent implementation of Ageia physics.
Posted on Reply
#27
CrAsHnBuRnXp
btarunr: How can you say so?

Something like that happens and everyone's got their tin-hats on. Was the 9800 GTX there?
He's a fanboy. Just look at his sig.
Posted on Reply
#29
imperialreign
btarunr: Or maybe it's just all the suspenseful marketing. Make people die to own it when it comes.
if so, nVidia is playing a truly risky game with it right now - the 3870x2 has commanded everyone's attention, and with the GX2 not having any "real" benchmarks or even being visible to the public yet, many people will just assume the card(s) aren't up to snuff with ATI's offering.

To use the suspense tactic, one would need hardware that has proven itself in some small way to have a major impact - and the GX2 just comes across as a scrambled counter to ATI's 3870 X2.


. . . not saying that the GX2 is going to suck, just pointing out that nVidia aren't really playing their cards all that well right now.
Posted on Reply
#30
Lu(ky
imperialreign: if so, nVidia is playing a truly risky game with it right now - the 3870x2 has commanded everyone's attention, and with the GX2 not having any "real" benchmarks or even being visible to the public yet, many people will just assume the card(s) aren't up to snuff with ATI's offering.

To use the suspense tactic, one would need hardware that has proven itself in some small way to have a major impact - and the GX2 just comes across as a scrambled counter to ATI's 3870 X2.


. . . not saying that the GX2 is going to suck, just pointing out that nVidia aren't really playing their cards all that well right now.
I think AMD/ATI are holding back their drivers and CrossFire support till they see the 9800 GX2 scores. It's like a pissing contest to see who can piss the furthest right now. Unless the GX2 has better scores, the same price of $450-$500, and supports SLI, it will be a loss for NVIDIA. Come on now, stop freaking stalling and just come out with your products already. Delay, delay, delay...
Posted on Reply
#31
AddSub
same price of $450-$500
I doubt that. Add at least another $150-$200 to that. At least.
Posted on Reply
#32
Lu(ky
AddSub: I doubt that. Add at least another $150-$200 to that. At least.
But would you pay $550.00 for something like that when you can buy a 3870 X2 for $450.00 with the same performance? I do not think NVIDIA had planned for a 9800 GX2 SLI setup. So only time will tell on price vs. performance, etc...
Posted on Reply
#33
mandis
Maybe NVIDIA is paying the price for not planning ahead. If it were possible to hook up two 9800 GX2s, that would mean quad SLI. Is that even possible with the G92 core? And if so, how well does it scale? If the performance of the 7950 GX2 is anything to go by, the whole attempt would seem futile.

I think, based on the information published so far, ATI has been the better strategist. CrossFire truly shines in comparison to SLI; although it still has some way to go in terms of optimization, it scales much better. The 3870 X2 is now the most powerful graphics card on the market because of CrossFire alone.
Posted on Reply
#34
imperialreign
mandis: Maybe NVIDIA is paying the price for not planning ahead. If it were possible to hook up two 9800 GX2s, that would mean quad SLI. Is that even possible with the G92 core? And if so, how well does it scale? If the performance of the 7950 GX2 is anything to go by, the whole attempt would seem futile.

I think, based on the information published so far, ATI has been the better strategist. CrossFire truly shines in comparison to SLI; although it still has some way to go in terms of optimization, it scales much better. The 3870 X2 is now the most powerful graphics card on the market because of CrossFire alone.
I agree, and I'm interested to see how well two 3870 X2s work when paired with each other.
Posted on Reply
#35
phanbuey
mandis: Maybe NVIDIA is paying the price for not planning ahead. If it were possible to hook up two 9800 GX2s, that would mean quad SLI. Is that even possible with the G92 core? And if so, how well does it scale? If the performance of the 7950 GX2 is anything to go by, the whole attempt would seem futile.

I think, based on the information published so far, ATI has been the better strategist. CrossFire truly shines in comparison to SLI; although it still has some way to go in terms of optimization, it scales much better. The 3870 X2 is now the most powerful graphics card on the market because of CrossFire alone.
+1... as I have been reading some reviews lately, with the new drivers the HD 3870 has definitely caught up to within 5% of the 8800 GT's performance at stock... and with CrossFire scaling the way it does, this is gonna be fun to watch (since I can't afford either one).
Posted on Reply
#36
imperialreign
phanbuey: +1... as I have been reading some reviews lately, with the new drivers the HD 3870 has definitely caught up to within 5% of the 8800 GT's performance at stock... and with CrossFire scaling the way it does, this is gonna be fun to watch (since I can't afford either one).
Agreed. ATI is taking their mid-range, performance-oriented stance to all new levels. They're quickly starting to dominate the "bang for the buck" category quite nicely, and really putting the heat back on nVidia, too.
Posted on Reply
#37
candle_86
yogurt_21: Actually, if the 9800 GX2 dies it'll more likely be because the next GeForce is due out and will probably beat its performance and cost less to produce. Many cards have died this way. It's always about the good ol' dollar. If NVIDIA sees that current sales haven't been hit by the 3870 X2, then there's no reason to launch a counterpart; they can instead focus on the next series release and the recent implementation of Ageia physics.
Well, it hasn't hurt it at all. With the recent price drops and such, an 8800 GT is at MSRP now and a pair in SLI offers decent power over the 3870 X2: $449 for an X2, or $510-550 for SLI 8800 GTs. Need I say more? With what Crysis has shown us, NVIDIA right now is simply faster.
Posted on Reply
#38
imperialreign
candle_86: Well, it hasn't hurt it at all. With the recent price drops and such, an 8800 GT is at MSRP now and a pair in SLI offers decent power over the 3870 X2: $449 for an X2, or $510-550 for SLI 8800 GTs. Need I say more? With what Crysis has shown us, NVIDIA right now is simply faster.
TBH, I really think Crysis is poor grounds for comparison. There are way too many issues on ATI's side thanks to nVidia optimizations within Crysis that tilt the scale way too far in their favor - much more so than with most other "optimized" titles. I also think these massive optimizations perhaps have a little something to do with how hyped the game was for months and months before release.
Posted on Reply
#39
ChillyMyst
btarunr: How do you link the 7950 GX2 to this? That was very much an interim product before the GeForce 8, and this card will be one of the flagship GeForce 9 cards. Oh, applying that logic, the ATI Rage Fury MAXX (ATI's first attempt at dual-GPU cards) was an utter fiasco, yet they did come up with an excellent HD 3870 X2 in due course, didn't they?
Hate to tell you this, but the GX2 was a FLOP and was ULTRA FAIL from NVIDIA, a HUGE mistake. The drivers sucked total ass, it didn't work in SLI worth a damn, I personally have SEEN this, and it is quite disappointing to be very honest; the cards run HOT and perform like crap.

BTW, the rare X1950 Pro X2 would likely, with some minor tweaking, be able to walk all over the 7950 GX2. My friend has two of those cards, and they stand pretty much toe to toe with today's cards in most games.

And the Rage Fury MAXX was a huge flop because at the time ATI had a SHITTY driver team behind it; NVIDIA at the time had a stellar driver team, but since the FX line came out their driver team's been on crack or something... and this from an 8800 GT owner!!!!
Assimilator: The 7950 GX2 was a piece of crap? Tell that to the guy who got over 14,000 in 3DMark06 with said video card (check the ORB). Can any Radeon X1900 series card match that? Not bloody likely.

I'm guessing nVidia is just trying to clear as much "old" 8800 series stock as possible before they release this monster and kill off the 8800 GTX and Ultra for good.
No, it wasn't a piece of crap, it was a piece of utter shit. I don't care what the ORB says; I have seen people hack 3DMark and get scores that are impossible (100k+ in '06 with a 6600 GT... rofl). If you know how to hack/tweak/mess with programs, you can do a lot of things to give yourself a better score. 3DMark is meaningless; a SMALL change can give you 1k extra marks or lose you 3k, when in real games you would see NO DIFFERENCE AT ALL.

God, why do morons always cling to the "it gets good 3DMark scores" argument when, for most REAL GAMERS, 3DMark is just a good stability tester?
Posted on Reply
#40
ChillyMyst
imperialreign: TBH, I really think Crysis is poor grounds for comparison. There are way too many issues on ATI's side thanks to nVidia optimizations within Crysis that tilt the scale way too far in their favor - much more so than with most other "optimized" titles. I also think these massive optimizations perhaps have a little something to do with how hyped the game was for months and months before release.
Exactly what I have said a few hundred times to people. I have an NVIDIA card, an 8800 GT in fact, and guess what, Crysis runs like ass. The game needs optimizing, and I'm SURE Crytek is working on that, alongside fixing bugs in the MP and SP game as well. Eventually I'm sure it will become a good/great game; for now it's not a game to use to compare hardware. Maybe compare driver sets for quality/bugs and Crysis-only performance, but not to compare different cards/brands.

Do I need to bring up Doom 3 again?

When Doom 3 came out, it had this lovely little optimization in it that crippled ATI cards BUT had positive effects on NVIDIA cards. Now, id Software could have made Doom 3 detect "hey, this is an ATI card, I can run pure math shaders instead of constant texture-lookup comparisons," but instead they made it so that ATI users had to edit a file in the game to change the shader code to the math equivalent. Once you did that, ATI cards got the same or better performance in most cases as NVIDIA cards.

www.tweakguides.com/Doom3_10.html

The 1-2 fps thing he says is false; it was only true if you modded the game AFTER the drivers already had the tweak built in, though.

I saw about a 12-18 fps boost from the tweak on my X800 XT PE at 1280x1024 with everything maxed out (not Ultra mode, as the card I had only had 256 MB of texture RAM and 512 is required to use true Ultra mode).
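For anyone who never saw that tweak: stripped of the actual Doom 3 shader syntax, the idea was simply to swap a precomputed lookup table for the equivalent direct math. Below is a minimal C sketch of that idea only; the exponent, the table size, and the function names are made up for illustration and are not the real game's shader code.

#include <math.h>
#include <stdio.h>

/* Illustrative sketch only -- NOT the actual Doom 3 shader.
   It contrasts a precomputed lookup table (what the texture
   fetch effectively was) with the equivalent direct math
   (what the edited shader computed instead). */

#define LUT_SIZE 256
static float specular_lut[LUT_SIZE];

/* Build the table once; exponent 16 is an arbitrary example. */
static void build_lut(void)
{
    for (int i = 0; i < LUT_SIZE; i++) {
        float x = (float)i / (float)(LUT_SIZE - 1);
        specular_lut[i] = powf(x, 16.0f);
    }
}

/* "Texture lookup" path: fetch the precomputed value. */
static float specular_from_lut(float n_dot_h)
{
    int idx = (int)(n_dot_h * (LUT_SIZE - 1) + 0.5f);
    return specular_lut[idx];
}

/* "Pure math" path: compute the same falloff directly. */
static float specular_from_math(float n_dot_h)
{
    return powf(n_dot_h, 16.0f);
}

int main(void)
{
    build_lut();
    for (int i = 0; i <= 4; i++) {
        float x = i * 0.25f;
        printf("n.h = %.2f  lut = %.4f  math = %.4f\n",
               x, specular_from_lut(x), specular_from_math(x));
    }
    return 0;
}

Same result either way; which path runs faster just depends on whether the hardware is better at dependent texture fetches or at math instructions, which is the whole point being made here about ATI cards of that era.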

I wouldn't be surprised if a few months from now Crysis is running A LOT better even on older hardware. It just takes time to optimize a game, especially one as complex as Crysis that was, let's all be honest, rushed out due to demand caused by a lot of hype.
Posted on Reply
#41
btarunr
Editor & Senior Moderator
ChillyMyst: Hate to tell you this, but the GX2 was a FLOP and was ULTRA FAIL from NVIDIA, a HUGE mistake. The drivers sucked total ass, it didn't work in SLI worth a damn, I personally have SEEN this, and it is quite disappointing to be very honest; the cards run HOT and perform like crap.
Okay, I didn't say the 7950 GX2 wasn't ultra fail, did I? I was just telling you that you can't pre-judge the 9800 GX2 on the grounds of the 7950 GX2 being an ultra fail. By that logic, ATI's first dual-GPU single card (the Rage Fury MAXX) was ULTRA FAIL too, but nobody thought the R680 would be a bad offering just because their previous attempt was a bad one, did they? In the same way, yes, the 7950 GX2 was bad, but you CANNOT pre-judge the 9800 GX2 based on that. Don't make comments on the 9800 GX2 saying "oh, they already are saying it runs hot... blah blah" and flood me with links to PRE-RELEASE reviews, because I don't buy, or decide to buy, things based on pre-release reviews. I'll wait till I see post-release reviews from the likes of X-bit, Tom's, TPU, Anand, etc., draw a consensus, and then draw a conclusion on whether the 9800 GX2 is bad or not; as of today you can't prove that based on pre-release reviews.
Posted on Reply
#42
ChillyMyst
btarunr: Okay, I didn't say the 7950 GX2 wasn't ultra fail, did I? I was just telling you that you can't pre-judge the 9800 GX2 on the grounds of the 7950 GX2 being an ultra fail. By that logic, ATI's first dual-GPU single card (the Rage Fury MAXX) was ULTRA FAIL too, but nobody thought the R680 would be a bad offering just because their previous attempt was a bad one, did they? In the same way, yes, the 7950 GX2 was bad, but you CANNOT pre-judge the 9800 GX2 based on that. Don't make comments on the 9800 GX2 saying "oh, they already are saying it runs hot... blah blah" and flood me with links to PRE-RELEASE reviews, because I don't buy, or decide to buy, things based on pre-release reviews. I'll wait till I see post-release reviews from the likes of X-bit, Tom's, TPU, Anand, etc., draw a consensus, and then draw a conclusion on whether the 9800 GX2 is bad or not; as of today you can't prove that based on pre-release reviews.
Yes, but the Rage Fury MAXX was NEVER pushed hard; it was available, but ATI didn't market it as the be-all and end-all of video cards. NVIDIA did market the 7950 GX2 as the be-all and end-all of gfx cards at the time...

And yes, it was ultra fail. Though it worked "ok" under 9x, it never got proper drivers. Well, I did hear Apple had working drivers for it, if you owned an Apple and could get the Apple version of the card (or flash the PC version with the modded Apple BIOS... pretty easy actually, the old BIOS backup/flash tool could actually turn the BIOS from PC to Mac when saving them :P ).
Posted on Reply