Friday, February 22nd 2008

GeForce 9 Series Roadmap Updated

Enjoy. The little dashes indicate unavailable data, and the asterisks indicate unconfirmed data (which should still be reasonably accurate).

Please follow the source link for a chart of the current speculated clocks and/or specifications.
Source: Nordic Hardware

66 Comments on GeForce 9 Series Roadmap Updated

#26
kylew
Seems kinda disappointing that they're doing this. Usually with a generation change you get a 2x bump in speed while maintaining the same price (not in ATi's case: half the price with about 10% more speed :D). I can't believe they're still using G92; I thought they'd hammered that with the GT/GTS? Personally I think the GX2 is a poor example of engineering. ATi's dual card seems so much more impressive on an engineering level: both cores on the same PCB plus superior scaling. The 9800GX2 really looks like a "botched" rush job to get the card out. They've had the performance crown for over a year; by rights they should be out with something more impressive. Maybe they got lazy, slacked off, and are only now jumping back because of the 3870X2.
Posted on Reply
#27
hat
Enthusiast
No, they rolled out GeForce 9 cards under the radar as GeForce 8 cards to compete with ATi. It looks like I was wrong, and the 9800GX2 really is going to be king of the hill for the 9 series.
Posted on Reply
#28
kylew
hat: No, they rolled out GeForce 9 cards under the radar as GeForce 8 cards to compete with ATi. It looks like I was wrong, and the 9800GX2 really is going to be king of the hill for the 9 series.
Same thing though, isn't it? Using their already existing cores as the next-gen high end with what seems to be barely any change (at least with the GX2 using 2 GTSs). It's still disappointing, because if nvidia did release "9 series under the radar", those cards still aren't enough of an improvement to serve as the next gen.
Posted on Reply
#29
springs113
Needless to say, this just brings me right back to the point when the 2900XT came out and AnandTech did that article that really took an advanced look at the R600 core architecture. The word from AnandTech's editors was that the R600 technology was very interesting and had more growth potential, if ATi didn't abandon it, and so far it hasn't. They also spoke of the G80 tech and said the design was a really great (mature) process but had little room to improve upon, in the sense that the technology was so mature that any upgrade to it would yield improvements, but not by a big margin.

I did and will continue to believe Anand's perspective on the R600 design, and I do believe that ATi had the better design, one that was underperforming for its specs. If you look at the 2900XT throughout its lifetime you can see noticeable improvements with every driver release, and when you look at the 3800 series, especially in performance per watt compared to the 2900XT, it's a big improvement. And while the performance gains over the 2900XT aren't as great as we would want, the 3800 series is much more efficient in all aspects of the tech.

The R770 core has been said to be 50% faster than the R670, and I believe ATi has stated that the R700 will be multiple cores. ATi has given the indication that they want to continue to push the multiple-GPU thing, and this is why the 3870 X2's performance and acceptance were so important. So far it looks good, even though it needs work.

This is why I believe Nvidia knows they can only push the G80 (G92) core so far, and I believe that is why we see so many delays in their products. ATi can only move up, so no matter what, they can't go wrong. They did say back in November that ATi was making these 3800 series cards for hardly anything, and the recent price drop to $189 simply shows how much cheaper the 3800s were to build compared to the 2900s. From what I have heard from a couple of my friends in the industry, they are still making a decent profit even at this price point. NVIDIA has got its hands full.
Posted on Reply
#30
Nicksterr
$599+ retail for the GX2? Christ... back to $600+ gfx cards at release, just like the 8800GTX. I remember paying $625 for my GTX when it came out.
Posted on Reply
#31
flashstar
Lol, I paid $200 shipped for my 2900Pro/XT.

ATI rules... The R770 is supposedly going to have 1 TFLOP of processing power and over 1 billion transistors. I expect to see quite noticeable performance increases over the 8800 and 9xxx series.
Posted on Reply
#32
ShadowFold
The 9800GX2 will flop (not the good kind), I can smell it.
Posted on Reply
#33
bombfirst885
Does anyone have specs for the 9800GTX yet? Is this thing supposed to be much faster than the previous-gen GTX?
Posted on Reply
#34
15th Warlock
What's with both ATi and nVidia using previous-gen technology and just sticking a higher model number on their cards? Is there no more innovation left in the PC market? :shadedshu :wtf::(
Posted on Reply
#35
Fitseries3
Eleet Hardware Junkie
15th Warlock: What's with both ATi and nVidia using previous-gen technology and just sticking a higher model number on their cards? Is there no more innovation left in the PC market? :shadedshu :wtf::(
Where are the dual, triple and quad core GPUs at?

It's just like CPUs... single core/single socket came first, then single core/dual socket (2x CPUs),
then dual core/single socket, dual core/dual socket, quad core/single socket, and now quad core/dual socket. I expect GPUs will go the same way. Soon we will hear that someone has a dual-core GPU that runs better than an SLI/CrossFire setup.
Posted on Reply
#36
jbunch07
It looks to me like GPUs will go the same way CPUs did: when they max out a single core they will start using dual core, and after that multi-core. In my opinion, AMD putting two 3870s on a single PCB is far better engineering than nvidia's dual-PCB idea; that just seems lazy if you ask me!
Posted on Reply
#37
ChillyMyst
nvidia, like intel, are too secure in their "power"; many times they take the lazy route.

Look at the Pentium D and Core 2 Quads: they are not native dual (P-D) or quad (C2Q) designs, they are 2 chips in one package. Now, that doesn't mean they didn't work, just as 2-PCB GX2s work, but the design's lazy, and as usual with nvidia it's about pure brute force, not about doing things smarter.

We shall see, but I look at the 9xxx cards as a fail. I have an 8800GT 512; it works fine, but the drivers are still buggy on XP 64 :P
Posted on Reply
#38
jbunch07
ChillyMyst: nvidia, like intel, are too secure in their "power"; many times they take the lazy route.

Look at the Pentium D and Core 2 Quads: they are not native dual (P-D) or quad (C2Q) designs, they are 2 chips in one package. Now, that doesn't mean they didn't work, just as 2-PCB GX2s work, but the design's lazy, and as usual with nvidia it's about pure brute force, not about doing things smarter.

We shall see, but I look at the 9xxx cards as a fail. I have an 8800GT 512; it works fine, but the drivers are still buggy on XP 64 :P
Yup, that sounds about right.
I guess that's why AMD's slogan is "A Smarter Choice".
Posted on Reply
#39
ChillyMyst
Pretty much; it's why I left nvidia, and intel for that matter.

I was a fan of neither AMD nor intel back in the P3 and older days; both were great chip makers. I loved my K6-2, my dual Celeron, and my P3 550E@733. Then the P4 came out and was total crap, all about pure clocks and brute-forcing the market.

I was a huge nvidia fan till... the FX line. The 5800 Ultra did my respect for them in, hardcore...

nvidia tried to cheat with the FX line's DX9 support, and it ended in EPIC FAIL. The cards were slow as hell unless you cheated and forced games to run in partial-precision mode (which made them look worse than normal full-precision mode). It was just... argh... a card 1/4 the price of the 5800 Ultra was about 3-4x as fast in DX9 mode... that's INSANE.

Well, they learned, but then the 6, 7, and 8 series ended up being evolutions on a common design theme, just with more brute force involved.

Blah, I run an 8800GT; it's a good card with blah driver support.

AMD/ATi are at least trying to innovate and create new designs instead of just sticking more pipes/ROPs and more shaders on a core.

Hell, if it had been nvidia going from the X1900 to the 2900, they would have just combined 2 cores and called it a new core: 32 ROPs and 96 shader units... god, that would be a beastly card though :P

But ATi went a different way, and now that ATi and AMD are one, AMD is continuing to head that way, and I think it will work out. The R6x0 cores really look like intermediate design cores to me; they do the job, they have their flaws, but they have a lot of potential.

I'm happy where I am for now, but I'm excited to see where this "war" goes next. I hope it's ATi forcing nvidia to innovate again instead of just adding more ROPs/shader units to current designs...
Posted on Reply
#40
jbunch07
ChillyMyst: Pretty much; it's why I left nvidia, and intel for that matter.

I was a fan of neither AMD nor intel back in the P3 and older days; both were great chip makers. I loved my K6-2, my dual Celeron, and my P3 550E@733. Then the P4 came out and was total crap, all about pure clocks and brute-forcing the market.

I was a huge nvidia fan till... the FX line. The 5800 Ultra did my respect for them in, hardcore...

nvidia tried to cheat with the FX line's DX9 support, and it ended in EPIC FAIL. The cards were slow as hell unless you cheated and forced games to run in partial-precision mode (which made them look worse than normal full-precision mode). It was just... argh... a card 1/4 the price of the 5800 Ultra was about 3-4x as fast in DX9 mode... that's INSANE.

Well, they learned, but then the 6, 7, and 8 series ended up being evolutions on a common design theme, just with more brute force involved.

Blah, I run an 8800GT; it's a good card with blah driver support.

AMD/ATi are at least trying to innovate and create new designs instead of just sticking more pipes/ROPs and more shaders on a core.

Hell, if it had been nvidia going from the X1900 to the 2900, they would have just combined 2 cores and called it a new core: 32 ROPs and 96 shader units... god, that would be a beastly card though :P

But ATi went a different way, and now that ATi and AMD are one, AMD is continuing to head that way, and I think it will work out. The R6x0 cores really look like intermediate design cores to me; they do the job, they have their flaws, but they have a lot of potential.

I'm happy where I am for now, but I'm excited to see where this "war" goes next. I hope it's ATi forcing nvidia to innovate again instead of just adding more ROPs/shader units to current designs...
Ha, I couldn't agree more!
The drivers for my 8600 GTSs are crap! They have big stability problems, especially when OCed.
But yeah, it's definitely going to be very interesting watching the war over the upcoming months.

Currently building a Spider platform; can't wait till it's finished!
Posted on Reply
#41
OrbitzXT
I don't have an allegiance to one company; whichever company has the product that fits my needs, I go with. Right now I'd have to say it's ATI, but probably 4-5 months from now I plan to build a monster, and if nVidia still has the most performance I'll be back with them. I always think it's so stupid to side with one company; unless you have some sort of stake in it, why do you cheer for it on the internet like it matters? Those people who wanted to stick with AMD while Intel was crushing them: fine, have fun, it's stupid, but go ahead. Likewise years ago when AMD was destroying Intel's P4s. Right now I have to say AMD has the better price/performance ratio compared with nVidia. However, I don't think nVidia will be in a hurting situation anytime soon.
Posted on Reply
#42
farlex85
I'm sort of new to this whole tech game, but it seems to me nvidia is following a relatively sound business plan. As of right now, nvidia still has most of the mid-high to high-end market on lock. The G92 has only appeared in 3 of their cards thus far, and I'm sure they are going to tweak it out for the 9xxx series, and why not? It's a solid chip that still reigns supreme in its price range.

Did anybody get mad when Intel released Wolfdale? After all, it was just a die shrink. If they had more competition we would have the 45nm quads; that's just how it goes. Nvidia is getting a lot of competition from ATI in the mid to low-end market, so they are focusing on that while throwing out a few cards that will continue to dominate on the high end.

And personally, I would rather the technology be perfected before being put on the market, as long as some tweaks allow for increased performance. Plus, after the 9xxx series you would expect the numbering system to change, and probably the tech will as well. It's all just speculation at this point, though.
Posted on Reply
#43
candle_86
I wouldn't say that. The 8800GS undercuts the HD3850 on price and the 9600GT ties the 3870 on price, and those are the targets. The 8800GS will vanish, I'd imagine, with the 9600GS. What I do find funny is that the new low end is pretty much die-shrunk G84 cores.
Posted on Reply
#44
mandelore
candle_86: ati doesnt have a chance in hell
:roll:

If ATI can trump NV by simply redesigning and making a dual-GPU card, their next-gen cards coming out, especially if dual-core, are gonna crush some NV.

They already have the innovation and experience in designing all-in-one-card PCBs, so they are already a step ahead of NV in that respect.

So when it comes to multiple-GPU cards, is NV still gonna be using multiple PCBs? Because that's just not smart, and it seriously lacks innovation.

So whenever they tout a 9xxx X2 or whatever, it's really 2 cards, which therefore should be compared to 2x HDxxxx X2 cards ;)
Posted on Reply
#45
mandis
SiXx`: Lots of performance tweaks to the architecture; just look at the 9600GT, it's on par with the 3870, or about 3/4 of the 8800GT's performance, and if that card can do that well, it's going to be pushing 30% to 50% faster than the 8800 series.
The 9600GT is SLOWER than the 3870. The 8800GT is about 5% faster than the 3870, on code optimised for nvidia.
Posted on Reply
#46
jbunch07
mandelore: :roll:

If ATI can trump NV by simply redesigning and making a dual-GPU card, their next-gen cards coming out, especially if dual-core, are gonna crush some NV.

They already have the innovation and experience in designing all-in-one-card PCBs, so they are already a step ahead of NV in that respect.

So when it comes to multiple-GPU cards, is NV still gonna be using multiple PCBs? Because that's just not smart, and it seriously lacks innovation.

So whenever they tout a 9xxx X2 or whatever, it's really 2 cards, which therefore should be compared to 2x HDxxxx X2 cards ;)
That's what I'm talking about!
Posted on Reply
#47
candle_86
It's a single card; I'm sorry you can't realize that. Dual PCB offers an advantage over single PCB, and that is length. In case you didn't know, the 8800GTS 512 board is already long, and doubling that circuitry would make it 75% longer, so a single PCB won't work. Also, the GPUs aren't getting cooled by the same hot air, and a water block should be easier for the dual PCB too, as you just stick one in between them and it's cooled. What I smell is a bunch of ATI fanboys of yesteryear grasping at straws.
Posted on Reply
#48
jbunch07
candle_86: It's a single card; I'm sorry you can't realize that. Dual PCB offers an advantage over single PCB, and that is length. In case you didn't know, the 8800GTS 512 board is already long, and doubling that circuitry would make it 75% longer, so a single PCB won't work. Also, the GPUs aren't getting cooled by the same hot air, and a water block should be easier for the dual PCB too, as you just stick one in between them and it's cooled. What I smell is a bunch of ATI fanboys of yesteryear grasping at straws.
Length is an issue, I agree, but the 3870 X2 is only as long as a GTX;
the X2 was not 75% longer than the 3870.
Posted on Reply
#49
brian.ca
candle_86: It's a single card; I'm sorry you can't realize that. Dual PCB offers an advantage over single PCB, and that is length. In case you didn't know, the 8800GTS 512 board is already long, and doubling that circuitry would make it 75% longer, so a single PCB won't work. Also, the GPUs aren't getting cooled by the same hot air, and a water block should be easier for the dual PCB too, as you just stick one in between them and it's cooled. What I smell is a bunch of ATI fanboys of yesteryear grasping at straws.
I've never played with water cooling, but that doesn't sound quite right... doesn't the block have to be relatively tight against the chip? That would suggest you can't just slide it in between, but would probably have to slide it in and then expand it to a tight fit. While possible, I can't imagine that being easier or more practical than the current convention. Along the same lines of reasoning, I'd think removing the fan in the first place would be a pain in the ass, because the usual/cleanest access point is completely blocked off. The only thing that could change that would be if the two PCBs were detachable, but I haven't heard anything along those lines.

The length argument also sounds off, for the reason someone else mentioned. For Nvidia that might be an advantage, but by itself it's not really an advantage, because it's apparently possible to do it without exceeding previous precedents.
Posted on Reply
#50
ChillyMyst
candle_86: It's a single card; I'm sorry you can't realize that. Dual PCB offers an advantage over single PCB, and that is length. In case you didn't know, the 8800GTS 512 board is already long, and doubling that circuitry would make it 75% longer, so a single PCB won't work. Also, the GPUs aren't getting cooled by the same hot air, and a water block should be easier for the dual PCB too, as you just stick one in between them and it's cooled. What I smell is a bunch of ATI fanboys of yesteryear grasping at straws.
Funny, the 3870 X2 isn't any longer than normal cards... it's surely not longer than my 8800GT.

Remember, there are ways, with smart design, to avoid the pitfalls you speak of.

More PCB layers, for example, allow higher complexity and lower noise; that's why most RAM sticks today are 6- or even 8-layer PCBs. So your argument, though "logical", is also false.
Posted on Reply