
NVIDIA GeForce 9900 Series Set for July Launch?

I guess he is taking these specs as reference...

http://forums.techpowerup.com/showthread.php?t=56339

Considering the 9900 is 65nm, such a chip will probably sell for at least $500, and that could place it head to head with the HD4870 X2 if it launches around the same time.



True. Same with the HD3000. But IMO anyone who spends $200+ on a card without knowing what he's buying isn't very clever. These days, with the internet, it's really easy to find some reviews. It's your fault if you get cheated. If you just want a card, that's OK, you don't need to bother looking for anything, but when you're spending over $200, well...

i highly doubt those specs are accurate
 
The 9800GTX should have been the 8900GTX. This way vendors can sell it to some naive bastards claiming it's close to the 9900GTX and keep its price high. Marketing scam. :shadedshu

I've been saying this since the first G92 news. I was going by what NV did between the 7800 -> 7900. The 7800GT vs. 7900GT. What do you get? Die shrink, more pixel pipelines, higher clock speeds.

What are the differences between the new/old 8800GTS's? Die shrink, more stream processors, higher clock speeds, 256bit compared to old 320bit memory bus width. It makes perfect sense...except for the reason why nvidia named the 8800gt an 8800gt as opposed to 8900GT.

NVidia must be focusing on other, better things. Or at least that's what i'd like to think :p
 

They're focusing on taking over Intel! :mad:
 
Both nvidia and ati have been pooping out cards for the past 12 months. It's sorta pathetic, but it makes for a lot of selection.
 
So Many CARDS!!!
i CANT Decide!
 
Damn! nVidia seriously screwed up the naming convention this time. They should have used the 8700 and 8900 monikers for the current 9600 and 9800 cards... This is really confusing now!
 

Which is why I think it's not worth jumping at everything brand new.
 
I guess this is why they say PC gaming is dying.
 

I think games like Crysis are responsible. Highly demanding, poorly performing games are what's killing PC gaming. Let's fall back to older, more reliable engines (*cough* Source *cough*) and focus on the ease of use of games over how cool they look on $2k PCs. Why do you think World of Warcraft is doing so well? If it couldn't run on Intel integrated graphics it would probably lose half of its player base.
Just my two cents.
 

That is so true!
It's crazy what hardware games require to run these days :ohwell:
 

It's even more crazy how useless the requirements are. I guess it's like this every time a new game comes out; I remember when Doom 3 first came out, the only computer I could run it on max settings was my cousin's super-expensive rig (aka an AGP 5/6-series card and a P4).
 
So, first they added another SLI connector to the 8800 and named it the 9800.
Now they finally change the core, but only change the name from 9800 to 9900...
nVidia has lost it! They should stop listening to their marketing department and get their act together!
 
Jesus... if they double the bus AND the SPs, that would be intense. The G92 is handicapped quite a bit; IMO the 9600GT with 64 SPs performing so well is an indication of just how badly the G92 with 128 SPs is held back by the other attributes of the G92 cards.

Ehm... double what bus? Memory? I don't get why people put so much emphasis on the memory bus width. It's bandwidth that matters at the end of the day. How much of a performance increase does the 512-bit R600 have over the 256-bit RV670? Don't forget that the RV670 uses faster GDDR4 memory. You can have a relatively narrow bus with faster memory and achieve high bandwidth while cutting manufacturing costs. The expensive route would be a broad bus with more low-latency memory banks. Latency is the key here, because the high-latency banks used in the 1GB GDDR4 variant of the HD 2900 XT didn't provide a very significant performance increase over the 512MB GDDR3 version.
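To put numbers on that point: peak bandwidth is just bus width times effective data rate, so a narrow bus with fast memory can land in the same ballpark as a wide bus with slower memory. A minimal sketch in Python, assuming the stock memory clocks of the two cards from memory (1656 MHz effective GDDR3 on the R600, 2250 MHz effective GDDR4 on the RV670):

```python
def bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

# R600 (HD 2900 XT): 512-bit bus, ~1656 MHz effective GDDR3
r600 = bandwidth_gbps(512, 1656)   # ~106 GB/s
# RV670 (HD 3870): 256-bit bus, ~2250 MHz effective GDDR4
rv670 = bandwidth_gbps(256, 2250)  # ~72 GB/s
print(f"R600: {r600:.0f} GB/s, RV670: {rv670:.0f} GB/s")
```

So halving the bus while raising the memory clock only costs about a third of the bandwidth here, which is why the real-world gap between the two cards is so much smaller than "512-bit vs 256-bit" makes it sound.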
 
I don't get why people think the 9800GTX is a good option when the 8800GTS 512MB offers the same performance for a lower price and lower power usage. God, some people don't have brains. Who wants a card with more plastic that costs like $100 more? Not me.

Too many rumors as well. The RV770 stuff is FUD AFAIK. GT200 is believable, but lately even reliable sources such as VR-Zone are just grabbing FUD from everywhere.
 
9900 GTX: 25352 points

:twitch: 3DMark 2006: 25352 points with a POV 9900 GTX
Yo:
Just go check this web site:
http://www.generation-3d.com/
3DMarK-2006-25352-points
 

Latency is the killer of DDR2 and DDR3 performance. I used to have some good 2x512MB DDR sticks for my Socket 939. Back then, at 466MHz with 2.5-3-3-6 1T timings, they used to kill most DDR2 configurations out there. I know the Athlon64 has an integrated memory controller, but that was still a huge difference compared to the 533MHz+ frequencies of DDR2.
good ol DDR->:nutkick:<-DDR2,3,4,5,... :rockout:
 

Since then things have changed a lot. DDR2 at 5-6-6-12 equals your latencies, and so does DDR3 at 10-12-12-24. Remember that latencies are expressed in how many clock cycles the memory needs to read out, and that DDR2 runs at twice the frequency of DDR. I can't remember DDR running at 1.5-2-1.5-4.5, but Corsair has DDR2-800 with 3-4-3-9. What's more, 4-4-4-12 is mainstream, while DDR at 2-2-2-X was hard to find and expensive.
Things get worse as we move up in frequency. There's plenty of DDR2 at 1000+ MHz with 5-5-5-15 and DDR3 at 1800+ MHz with 7-7-7-20 or better latencies, while high-OC DDR was almost always 2.5/3T.
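That point can be checked with a quick conversion: absolute latency in nanoseconds is the cycle count divided by the bus clock, and timings are counted in bus cycles (half the effective DDR data rate). A rough sketch in Python, using the example timings from the posts above:

```python
def cas_ns(cas_cycles, effective_data_rate_mhz):
    """CAS latency in nanoseconds; timings count I/O bus cycles,
    and the bus runs at half the effective (doubled) data rate."""
    bus_clock_mhz = effective_data_rate_mhz / 2
    return cas_cycles / bus_clock_mhz * 1000

ddr_466 = cas_ns(2.5, 466)   # old DDR at 466 MHz effective, 2.5-3-3-6 -> ~10.7 ns
ddr2_800 = cas_ns(4, 800)    # DDR2-800 at 4-4-4-12               -> ~10.0 ns
ddr3_1800 = cas_ns(7, 1800)  # DDR3-1800 at 7-7-7-20              -> ~7.8 ns
print(f"DDR-466 CL2.5: {ddr_466:.1f} ns")
print(f"DDR2-800 CL4:  {ddr2_800:.1f} ns")
print(f"DDR3-1800 CL7: {ddr3_1800:.1f} ns")
```

The bigger cycle counts on DDR2/DDR3 are mostly an artifact of the faster clock; in absolute nanoseconds the newer generations are equal or better.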
 
I must say, Nvidia should have taken advantage of the fact that DX10 required a complete redesign of the core, to support unified shaders instead of fixed pipelines, and renamed their graphics cards. It would have created less confusion, as buyers would know that the GeForce 7900 was older and the Wazledoozle Extreme 3 was the newer version. It would also have been a chance for them to adopt a newer driver architecture and naming scheme, not to mention getting rid of the need to figure out what the hell they will do once the GeForce numbers get into the tens of thousands, which will happen soon enough. ATi might tell them to quit that naming scheme if they start using X800 as the name for their next-next-generation card.
I really do think the G92 should have been the 8900, as it really isn't a completely new architecture, but rather a die shrink and improvement over the G80. The 9600GT could have been the 8700GTS, but then again, they completely messed up their mobile graphics naming scheme (the 8600m GS is an 8500GT in disguise, for example, and the 8700m GT is just an overclocked 8600m GT, which is a severely underclocked 8600GT), and that would have created further confusion. Oh, and not to mention their G92 cores for laptops are the 8800m GT and GTX... what the hell? The two versions are 64 SP and 96 SP models. They're a mobile 9600GT and an 8800GTS (older model), respectively.

If this information is all true, then Nvidia is drowning itself in names and numbers.



When the AM2 processors came out, that was certainly true: the bandwidth of DDR2 was not showing a real increase in performance, as the latencies were still quite high for the low speeds it achieved. The same thing happened even before that, when the LGA775 Intels started getting DDR2-supporting chipsets. This is usually the case with memory though, with the trade-off being higher latencies for higher clock speeds.
As DarkMatter mentioned, that was then, and now the latencies are reasonable for the high speeds we see. But you mentioned that most low-latency/high-OC DDR ran 2.5/3T; not true, they all ran 2T at the most, since I hadn't seen reports of any motherboard supporting anything other than 1T/2T, which continues to be true for DDR2. Either way, the difference between 1T and 2T is minimal. I personally never had a problem with 2T. Heck, it allowed me to push another 10MHz through my RAM at times.
I believe either GeIL or G.Skill had some really low-latency DDR RAM, based on TCCD chips, near the end of DDR's reign, that ran at 1-2-2-5, or maybe it was 1.5-2-2-5. Certainly still not a match for the DDR2-800 RAM of today running 4-4-4-8 or even lower (my Ballistix are now running 4-4-3-5, so there's another example), but they were lower than the DDR2 of that time.
It's a shame fast DDR is still hard to find, and fairly expensive. You'd think eBay would be flooded with it, but in fact it's all generic sticks from sellers in Hong Kong, Singapore, etc.
 

I would love to see cards with the Wazledoozle brand!! Wazledoozle Extreme FTW!! :rockout:

About memory, sorry for the confusion: I meant 2.5-2.5-2.5-X or 3-3-3-X and the like when I said 2.5T/3T. I thought there could be some confusion about it, but I was lazy enough not to care. :D

On the other hand, there was 1.5-2-2-5 and similar DDR memory? I never heard about it. If you look at my specs, I have G.Skill memory, and it was the best you could buy from them at the time (AM2 launched 2-3 months later). It can easily do 2-2-2-4 1T up to 480 MHz (effective) with 2 sticks, but it can't do better than 3-3-3-7 2T at 600 MHz, which is what I pointed out. <-- Not 100% sure really, because I've only OCed with all 4 sticks plugged in. It can't do 1.5T on any of the timings either. Basically it's good because it can OC to ~530 MHz at 2.5-3-3-5, but it can't do lower timings than 2-2-2-4 even at lower MHz.
 
Does releasing more cards in fact mean you win? The 3870 still beats a 9600gt...

Actually, the 9600 GT is slightly faster than the 3870, and in new games such as Crysis the 9600 GT is substantially faster. But I do agree Nvidia are releasing their cards too often. :)
 
Wow, a 9900GTX to replace the 9800GX2, that's something to note. These cards are gonna cost an arm and a leg though.

That very much depends on how good ATI's 4000 series cards are and how much they will cost.
 
I don't understand it. You people bash nVidia for releasing the G92 GPUs because they aren't a big improvement. Then when they release GPUs that are huge improvements, you bash them too. I guess these companies are damned if they do and damned if they don't. I personally am very happy technology is moving along. By the time this is released it will be almost 2 years since G80's launch. It's about time we see a major jump.

The cycle begins again. These GPUs will be power hungry and put out massive amounts of heat, and they will improve that over time. And I will buy them once they are near the end of their life cycle and have matured.
 

Agreed! Take the 8800GTS 512MB... little improvement? Well, if you consider the same performance as the 8800GTX (in most things, and better in some) for half the price as not being an improvement, then I suppose not! Not dissimilar to the HD3870 replacing the 2900XT, really... (so where is ATi's improvement over the last year?... excluding the 3850 of course, but TBH anything is an improvement over the 2600XT/Pro :eek:). :D
 
Meh, sounds like a VERY well-performing card but it'll probably cost a fortune.

And besides, I'm sure a 9800 GX2 will last for a while if you've bought one right now. :shadedshu
 
Don't worry, there's plenty of stupid people where I come from. Luckily (for me), I'm not one of them. :D Check this out: some 20 miles to the south of where I live, in Bosnia & Herzegovina, they still sell PIIIs(!) as new; an entire PC with 64MB of RAM, a TNT2 and a 10GB hard drive costs about $100(!). NEW. Go figure. :twitch:

Edit: correction, I just checked the prices, make that $140-180(!). I dunno whether I should cry or laugh my ass off.

http://maps.google.com/maps?f=q&hl=...17.394791&spn=0.308573,0.6427&z=11&iwloc=addr

Wow, I tried to look 20 miles south from you -- is it really that desolate, or has google maps simply not mapped Bosnia yet? :confused:

And are you far away from Nova Prospect? :p
 