
NVIDIA GeForce 4XX Series Discussion

Status
Not open for further replies.
Watched some videos... the image quality of GF100 is superb,
and the card is a big one.
Still no idea about price??

Yeah, I watched those too. You cannot in any way determine IQ from a video. That being said, is that all we get for CES? A couple of videos that give no indication of performance? I need more Fermi brain food, dammit!
 

:laugh:

What we got was good enough for me. The card is huge, with 6+8-pin power (so I know I'm gonna need a new PSU for that system; I'll probably use 3 cards), but it's up and running... looks good for a March release.
 
I'm just surprised we haven't seen any slides, graphs or numbers from Nvidia as a "hold on, don't buy ATi just yet" type of deal. Just something that shows how powerful this card can actually be. I'm still having flashbacks to the FX series and the 2900XTX.
 

Yep. That's literally the only reason those slides aren't out. I've been thinking that same thing all along.
 
You know, I never thought of that, but this isn't a press event, nor are they launching within a month, so that kind of info was really given out at the GTC event, even if it was more centered around the Tesla-type cards.

Seeing it up and running is good, although I am concerned that they are only showing one card (so far), and not SLI...
 
Yeah, I gotta agree... don't want to, but I surmise (oh yeah, Eng 12 vocab) that the GTX 3XX series will be a letdown in games compared to what we're expecting, especially since they've had delay after delay. I'm sure most of you feel like I do: the GTX 3XX series needs to bring some serious beef to the table, like a Triple Whopper instead of a Whopper Jr. lol
 
How can it be a letdown when it's using the same shaders as the GTX 200 series and has 2.14 times more of them? GTX 285 > HD 4890, and 2.14 > 2, ergo GTX 300 >> HD 5870... It won't be a performance letdown, that's for sure. Price, on the other hand... only time will tell :)
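The shader-count argument above is simple arithmetic; here's a back-of-envelope sketch of it (assuming the rumored 512-shader Fermi count against the GTX 285's 240 shaders; real game performance rarely scales linearly with shader count, so this is illustrative only):

```python
# Naive shader-scaling estimate from the post above.
# 240 = GTX 285 (GT200b) shader count; 512 = rumored Fermi count.
gtx285_shaders = 240
fermi_shaders = 512

scale = fermi_shaders / gtx285_shaders
print(f"naive scaling factor: {scale:.2f}")  # 2.13
```

That 2.13x is where the "2.14 times more" figure in the post comes from; it ignores clocks, memory bandwidth, and architectural changes.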
 
It's not using the same shaders; the shaders are MIMD now, not SIMD. And personally I expect the GTX 380 to beat the HD 5870 by at least 20%, possibly 35%, simply because they've spent a lot of time on Fermi, and all their performance claims are boasting that as well, so they'd better deliver.

Just a side thought on why I think they haven't delivered complete final specs (clock speeds and whatnot): maybe they haven't gotten enough consistency in their chips to actually decide on a final clock speed. We all know TSMC is still having problems, so maybe of the chips NVIDIA does get back, the majority aren't hitting target or near-target clocks; there's too much variation between chips.
 
All I know is everyone should be happy with what they got. I went ATI and I'm glad I did; otherwise I'd have been without decent graphics cards for over 8 months, and I honestly couldn't handle a lower-end GPU for that long waiting on NVIDIA to deliver. So if the GT300 series is kickass, that's great, you can have them; for me, acquiring my GPUs at MSRP, I have no complaints.

The way I see it, the GTX 380 can be 75% faster for all I care; I won't be able to afford NVIDIA's price of admission even if I wanted to, so it's moot.

So to those that want NVIDIA: wait for it, buy it, and love it. Those that can't, well, do whatever tickles your fancy lol. But before we all sit here and speculate on performance, let's wait for some actual numbers, because for now we really don't know. It could very well be that NVIDIA's parts are equal to ATI's, and maybe the CUDA tessellation whoops ATI's tessellation unit; we don't know how it will play out yet, so I'll wait till I see some real numbers from the ladies and gents in green :roll:
 
The way I see it, the GTX 380 can be 75% faster for all I care; I won't be able to afford NVIDIA's price of admission even if I wanted to, so it's moot.

If that's the case, these 5850's are as good as sold! :D
 
lol well you have a larger income than me, roflol. I can't even buy a router, so how can I afford Fermi?? hahaha
 
I just upgraded from an E8500 to an i5 750, and good god, these 260s are back to life... I didn't realize how much I was being bottlenecked by the dualie.

In any case, a 5870 is not much of an upgrade from 260s in SLI. I might stick it out until the 5890s or die-shrunk GT300s... by then prices should drop.
 
I just upgraded from an E8500 to an i5 750,

Man, my i5 750 should be here tomorrow; I can't wait. Your comments are making me excited, and I'm even taking less of an upgrade, from a Q9550 to an i5. That should also give my 5870 a little increase as well.
 
I hope the card being shown is the high-end version! Otherwise the high end will need two 8-pin connectors! Still, surely someone can give us a vague idea of frame rates? Maybe there'll be some articles tomorrow.
 

Yeah this whole CES Fermi thing is just a big cock tease so far.
 

I think it has to do with their confidence in their brand-name recognition, and the fact that even with all the 5xxx-series GPUs AMD has been pushing since September, nVidia still has the majority of the GPU market share. (As far as the consumer desktop market share is concerned, at least.)

I remember watching an interview with Jen-Hsun Huang (nVidia's CEO) a while back on Charlie Rose, and Jen-Hsun mentioned how nVidia likes to think of its brand as the "Nike" of hard-core gaming. While I don't fully agree with that analogy, some of it does ring true. Many times when I mention keywords like "GeForce" or "nVidia", even when talking with non-enthusiasts, they usually know I'm talking about GPUs or video cards, while keywords like "Radeon", "ATI" and even "AMD" often result in blank stares. You can say a lot about nVidia and their products, but they sure have their brand-name recognition taken care of.

Bo_Fox said:
I really adore NV for actively supporting 3D. It's just a matter of time when we realize that 3D is the next thing

They (nVidia) have been actively pushing this 3D stuff since ...forever. I still have two sets of 3D glasses somewhere in storage. One of them I received with the purchase of an ASUS GeForce3 Ti 500 back in 2001-2002 (it came in the box), and the other set I received with an nVidia-sponsored game, "Bridge Commander", also back in 2002. They have been at this for a long while. Interestingly though, nearly 10 years later, I never bothered to plug in either of them. It just seems sort of corny, a throwback to 1950s B-movie cinema.

Wake me up when they come up with true 3D visualization along the lines of consumer/gaming-grade holographic displays and whatnot. 3D glasses are for the old folk, at least that's how I see it.
 

LOL, that's what my younger brother knew. He knew what Nvidia was, but he didn't know what ATI or AMD were.



I've been using 3D since the GeForce4 Ti 4200 in 2002, with shutter glasses that were bundled with the card; 3D was actually available a few years before that. ATI has never really bothered to support 3D, and NV kinda stopped supporting 3D for a while after making it exclusive to the Zalman monitor once the 7900 GTX was released. I had to hack the drivers with nHancer for 3D to still work with my 8800 GTX (by the way, the hack would not work for G92 or GT200 cards), so I can still use those cheap wired shutter glasses that can be bought off eBay for only $10. I'm actively boycotting NV's GeForce 3D Vision wireless shutter glasses that retail for a ridiculous $200.

Ever since CRT monitors went into decline, with the lack of 120Hz LCD monitors, cheap 3D solutions were not viable for most people for several years (other than the god-awful anaglyph red-blue glasses that I used to play games on the LCD monitor, like Condemned: Criminal Origins). That's a great game that uses the same engine and textures as FEAR... actually better than FEAR in many respects. The game is dark, gritty, and almost colorless, so using red-blue glasses just for that game was not bad, really.

Samsung is working hard on making 3D TVs, but I think they are still going to require glasses. It'll be a bigger problem for people who already wear glasses, but the more 3D tech gets adopted, the sooner we'll see a truly 3D display that doesn't need glasses.
 

Yeah, I encounter this a lot. If you mention brand names like Intel, nVidia, Pentium, GeForce, and similar, most people, even those who are not technically inclined, seem to have some clue what is behind those brand names.


The set of glasses I received with the ASUS GeForce3 card seems to be based on some sort of proprietary interface. They are the wired sort, and the wires from the glasses are supposed to be plugged into a special plug on the video card called "VR Out". In fact, I don't think these glasses are compatible with any card other than this specific model of GeForce3 from ASUS. The other pair I got with the Bridge Commander game are the wireless sort (I think; I haven't taken them out of the box in a long time), and the interface seems to be a standard VGA one on the receiver side. I'm not really sure of the exact specs or technology behind either set, since I never tried them out. Heck, I don't think they are compatible with modern LCD tech to begin with.
 
Man, my i5 750 should be here tomorrow; I can't wait. Your comments are making me excited, and I'm even taking less of an upgrade, from a Q9550 to an i5. That should also give my 5870 a little increase as well.

You literally will be shocked at how fast it is. Play GTA IV after you get it stable... it's bizarrely fast.
 

Right, some of the bundled glasses require a special dongle that is proprietary to a certain make, or even model.

Some can be used with pretty much any card, since they use a universal dongle that connects to the VGA output and passes the video through to the CRT's VGA cable. As long as you have a DVI-VGA adapter, you could still use those glasses with older drivers (and G80 or older) plus WinXP. You just might get it to work with a 60Hz LCD panel, but the shutter glasses will be shutting out half of that, so you only see 30Hz per eye... such an awful flicker! I've tried it a few times by mistake when a game defaulted to 60Hz at a strange resolution. Not a pretty sight!
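The per-eye refresh math above is simple halving: shutter glasses alternate frames between the left and right eye, so each eye sees half the display's refresh rate (a minimal illustrative sketch):

```python
# Shutter glasses show alternate frames to each eye,
# so each eye effectively sees half the display's refresh rate.
def per_eye_hz(display_hz: float) -> float:
    return display_hz / 2.0

print(per_eye_hz(60))   # 30.0 per eye: the awful flicker described above
print(per_eye_hz(120))  # 60.0 per eye: why 120Hz panels matter for 3D
```

This is also why the later 3D Vision glasses were paired with 120Hz panels: 60Hz per eye is roughly what a single-view 60Hz display gives you.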
 

Not much to see... besides, if it's really Fermi in Tri-SLI, then they may have a bigger heat issue than we first thought. Water cooling, hmmm! ...wonder why?

And about the video... who is to say that it's really nVidia's Fermi running it, and not a pre-recorded video running in the background?

Anyhow, I want REAL benchmarks on AIR cooling, and I hope they'll do better than the 5970. That way the price of ATI cards will come down, cheap enough that I can buy another 5750 for CrossFireX :D
 
wonder why?

Because it looks kick-ass, especially to the illiterates. Plain and simple, man, it's just marketing. Anyway, three-card setups have always had cooling problems in most cases; it's not something new.
 
I believe this horribly photoshopped picture I made sums things up nicely for now.

hotairfermi.jpg

I figured this thread needs a little laugh.
 