
NVIDIA GeForce 4XX Series Discussion


Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.49/day)
Location
Reaching your left retina.
I find it odd that they chose those apps. It would be more interesting to see nVidia beat AMD in AMD-supported apps, no? None of these engines does anything really interesting. Where's DX11? Who cares about DX10? Looks like a pretty even 80% over the GTX285?

And really, the R5xxx isn't relevant at all. For a DX11 chip, 120 FPS is a bare minimum, considering 3D. So what those graphs say is that for 3D gaming @ 1920x1200, I gotta buy "2xGTX380"? To add in PhysX effects?

How about those apps? Can I run 3D @ 120Hz with PhysX? I am hoping a single chip can do this, not two.

That's what Fermi needs to be, at least to me as a consumer. If I have to buy two $600 cards, and maybe a third, and then a 1200W PSU as well, so I can have the pleasure of buying another kit for $600 with monitor and glasses...taxes, shipping...Fermi costs $2000 for the complete end-user package, @ 1920x1080? Add in CPU, memory, motherboard, case, HDDs, ODDs....I'm not a happy camper.

How many games will work perfectly? What about those DX11 titles? Seems they are just now ready for DX10, performance-wise, or do we need three GPUs still?

LOL. What the hell are you talking about? :laugh: It looks like seeing the graphs has made your head spin or something, and has made a soup out of your ideas. :roll:

First of all, we don't know if they are fake or not... but your reaction is quite odd and funny. If true, they'd be showing the greatest performance improvement we've ever seen and that makes you rant? :roll: What do you want man? Maybe they should bundle a car too?

EDIT: Don't get me wrong, the "I want MOAR" argument is always welcomed, but I think you went a little bit overboard.

Yeah, actually I posted this with you in mind and put someone else's name down by accident. You know, it probably didn't get stolen from you, but...... when you do searches on this topic, TechPowerUp comes up a lot.

So you know, I wouldn't be surprised if it was stolen from you!!!

Haha, no man, what my charts do is nothing so special. I just made some comparisons and reached the same conclusion as anyone looking at the specs could reach: it will be about twice as fast as the GTX285, and that's more or less what the graphs are showing.

For the same reason they could be legit too.
 
Joined
Oct 6, 2009
Messages
2,824 (0.52/day)
Location
Midwest USA
System Name My Gaming System
Processor Intel i7 4770k @ 4.4 Ghz
Motherboard Asus Maximus VI Impact (ITX)
Cooling Custom Full System Water cooling Loop
Memory G.Skill 1866 Mhz Sniper 8 Gb
Video Card(s) EVGA GTX 780 ti SC
Storage Samsung SSD EVO 120GB - Samsung SSD EVO 500GB
Display(s) ASUS W246H - 24" - widescreen TFT active matrix LCD
Case Bitfenix Prodigy
Power Supply Corsair AX760 Modular PSU
Software Windows 8.1 Home Premium
LOL. What the hell are you talking about? It looks like seeing the graphs has made your head spin or something, and has made a soup out of your ideas.

First of all, we don't know if they are fake or not... but your reaction is quite odd and funny. If true, they'd be showing the greatest performance improvement we've ever seen and that makes you rant? What do you want man? Maybe they should bundle a car too?

You are right, he shouldn't get too excited yet!!! These specs aren't verified. I do agree with his statement that the price would be way too damn high though!!!

Haha, no man, what my charts do is nothing so special. I just made some comparisons and reached the same conclusion as anyone looking at the specs could reach: it will be about twice as fast as the GTX285, and that's more or less what the graphs are showing.

For the same reason they could be legit too.

They can't be for real ... I refuse to give you credit for being that smart LOL J/K :) ...... But seriously though...... I still don't think these are real, for all the same reasons I gave before.

When the specs on every previous-gen card were released before launch, everyone always looked at the specs and got all hyped. When the cards got released, they were all lucky to produce 80% to 85% of what the specs said they could do.

Plus, with these cards in particular, Nvidia is really not paying much attention to the gaming aspect. They said so themselves. They are instead paying more attention to the computation aspects. So they should make for a wonderful folding card, but more of an equal gaming card.

If you also look at the previous-gen Nvidia cards, they always seem a lot more powerful because of the use of PhysX with them. Take that off and they are a lot closer, if not less powerful in some cases, than their red counterparts. So unless all you play is PhysX games or do nothing but folding, I am still waiting to see real-world performance :)
 

Benetanegia
When the specs on every previous-gen card were released before launch, everyone always looked at the specs and got all hyped. When the cards got released, they were all lucky to produce 80% to 85% of what the specs said they could do.

The thing is, that's the case here. If there's one thing my charts demonstrate, it's that previous Nvidia cards scaled with the GFLOPS most of the time. Well, the GTX380 has 2.5x the floating-point power of the GTX285. 80% of 2.5x gives you 2x, which is what the graphs are showing.
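That 80%-of-paper-specs arithmetic is easy to sanity-check. A tiny sketch, using only the numbers assumed in this thread (the 2.5x FLOPS ratio and 80% efficiency are the post's assumptions, not official specs):

```python
# Back-of-the-envelope check of the scaling argument above.
# Both inputs are this thread's assumptions, not official figures:
flops_ratio = 2.5   # claimed GTX380 vs GTX285 floating-point advantage
efficiency = 0.80   # fraction of paper specs that typically shows up in games

speedup = flops_ratio * efficiency
print(f"expected speedup over GTX285: {speedup:.1f}x")  # -> 2.0x
```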

Plus, with these cards in particular, Nvidia is really not paying much attention to the gaming aspect. They said so themselves. They are instead paying more attention to the computation aspects. So they should make for a wonderful folding card, but more of an equal gaming card.

No, not really. They have constantly said that what they are doing to improve GPGPU is not hurting graphics performance. On the contrary, it makes it better. In fact, HPC and graphics are not very different, except that for HPC you need heavy caches, and those caches offer little benefit in the graphics department. BUT they don't make it worse either.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.58/day)
Heh. That's no rant...that's simply the expectations that nV has set for themselves. If they flaunt these features, they best be able to deliver, or they are a failure in my eyes, and I won't buy. The graphs don't matter to me.

I have the money to spend...$5k on a computer is no big deal. The only important thing about the "GTX3XX" generation is nVidia's ability to deliver on their promises, with their closed platform of products. Given what is in people's living rooms, and what other supporting technology is out there, the goals are clearly set. Performance doesn't matter, the competition doesn't matter, it's simply them vs. themselves.

This is the beginning of a new generation. For both sides. DX11 presents a new table to feed from, and they must dress it appropriately. ATI/AMD are serving up Eyefinity with DX11, and they have roughly 2-3 years to perfect it before we move on. They are far from perfect now. I couldn't sell anyone an Eyefinity system today with any confidence. Maybe in a few months...the power is there in 4 GPUs, and they need to fix drivers. HD5K, and ATi, are done...old news now.

nVidia is serving PhysX, CUDA, and Stereoscopic 3D. It's a very different meal to consume, and they created the menu.

It HAS to be 1920x1080, at bare minimum. It has to have these other features I just listed too. It has to do that within 1200 watts. Speculating on what that performance will be isn't needed...it's simple math. Every game, 120 FPS.

Ranting has nothing to do with it. They either do it, or they don't. I hope they do, so I can spend my money on their platform for DX11.
 
Joined
Oct 6, 2009
The thing is, that's the case here. If there's one thing my charts demonstrate, it's that previous Nvidia cards scaled with the GFLOPS most of the time. Well, the GTX380 has 2.5x the floating-point power of the GTX285. 80% of 2.5x gives you 2x, which is what the graphs are showing.

No, not really. They have constantly said that what they are doing to improve GPGPU is not hurting graphics performance. On the contrary, it makes it better. In fact, HPC and graphics are not very different, except that for HPC you need heavy caches, and those caches offer little benefit in the graphics department. BUT they don't make it worse either.

Okay, I guess in the end we will just have to see...... But I do have a question for you.

If your theory is correct and everything you said comes true, that would mean ATI would probably go back to lowering their prices. That might mean Nvidia could price a little better this time too.

Now here's where my question comes in...... If that's the case, what do you think would happen? If these cards are so good that they drove their only competition almost to defeat, wouldn't that also be like stabbing themselves in the foot?

Or let's say that they don't lower their prices to keep up, and an SLI GTX380 setup would then be in the 1200 to 1500 dollar range. Then they would still be shooting themselves in the foot.

I am really interested to hear your opinion of what you think Nvidia is going to do with all this power. And what effect do you think it would have on us, the consumers?

Heh. That's no rant...that's simply the expectations that nV has set for themselves. If they flaunt these features, they best be able to deliver, or they are a failure in my eyes, and I won't buy. The graphs don't matter to me.

I have the money to spend...$5k on a computer is no big deal. The only important thing about the "GTX3XX" generation is nVidia's ability to deliver on their promises, with their closed platform of products. Given what is in people's living rooms, and what other supporting technology is out there, the goals are clearly set. Performance doesn't matter, the competition doesn't matter, it's simply them vs. themselves.

This is the beginning of a new generation. For both sides. DX11 presents a new table to feed from, and they must dress it appropriately. ATI/AMD are serving up Eyefinity with DX11, and they have roughly 2-3 years to perfect it before we move on. They are far from perfect now. I couldn't sell anyone an Eyefinity system today with any confidence. Maybe in a few months...the power is there in 4 GPUs, and they need to fix drivers. HD5K, and ATi, are done...old news now.

nVidia is serving PhysX, CUDA, and Stereoscopic 3D. It's a very different meal to consume, and they created the menu.

It HAS to be 1920x1080, at bare minimum. It has to have these other features I just listed too. It has to do that within 1200 watts. Speculating on what that performance will be isn't needed...it's simple math. Every game, 120 FPS.

Ranting has nothing to do with it. They either do it, or they don't. I hope they do, so I can spend my money on their platform for DX11.

If you look at it this way, you are right........ ATI needs to fix their drivers in order to fully go to Eyefinity. It will happen, and it will take some time.

Now, the Nvidia group I think is going for a lot higher of a goal though, with their 3D tech..... I don't think 3D tech will ever be a viable solution for cutting-edge gaming. The frame rates for cutting-edge gaming are always too low, in the 60s at best, unless they change something very drastically.

Now, with this generation on both sides (if ATI had a 3D option), these cards can play DX9 games at over 120 FPS. Maybe some of the older DX10 games too. But realistically..... you are correct, it will still be a while before they can even start pushing it for DX11. Even in the best-case scenario, with Nvidia's offerings 15% ahead of ATI's, that would mean at best they would probably be getting an average of 70 to 80 FPS in DX11 games, which is way below the need for 3D.

Now, I just pulled those numbers out of my ass. But I can see where someone like you would be very frustrated. Why show those specs when they should be concentrating on the new games and tech?

The only thing I can tell you on that is..... supposedly DX11 should be way better optimized than DX10. So hopefully that will translate into FPS for you, buddy!
 

Benetanegia
Heh. That's no rant...that's simply the expectations that nV has set for themselves. If they flaunt these features, they best be able to deliver, or they are a failure in my eyes, and I won't buy. The graphs don't matter to me.

I have the money to spend...$5k on a computer is no big deal. The only important thing about the "GTX3XX" generation is nVidia's ability to deliver on their promises, with their closed platform of products. Given what is in people's living rooms, and what other supporting technology is out there, the goals are clearly set. Performance doesn't matter, the competition doesn't matter, it's simply them vs. themselves.

This is the beginning of a new generation. For both sides. DX11 presents a new table to feed from, and they must dress it appropriately. ATI/AMD are serving up Eyefinity with DX11, and they have roughly 2-3 years to perfect it before we move on. They are far from perfect now. I couldn't sell anyone an Eyefinity system today with any confidence. Maybe in a few months...the power is there in 4 GPUs, and they need to fix drivers. HD5K, and ATi, are done...old news now.

nVidia is serving PhysX, CUDA, and Stereoscopic 3D. It's a very different meal to consume, and they created the menu.

It HAS to be 1920x1080, at bare minimum. It has to have these other features I just listed too. It has to do that within 1200 watts. Speculating on what that performance will be isn't needed...it's simple math. Every game, 120 FPS.

Ranting has nothing to do with it. They either do it, or they don't. I hope they do, so I can spend my money on their platform for DX11.

I don't agree. Just because they are offering some extras doesn't mean a single GPU from them has to be able to do everything. When you buy a GPU, you buy a GPU and shouldn't expect anything extra. They DO offer extras, and that's nothing but good. From there, it's up to you to decide if the performance drop from S3D or PhysX is worth it for you, or if it's worth buying a second card. It's the same with Ati: they offer Eyefinity and that's cool, but you still have to buy 3 monitors, they won't give them to you, and you still need more than one GPU if you truly want to play the latest DX11 games on 3 monitors at decent resolution and fps. (I only mention Ati because that's the only other GPU company to compare with.)

What I mean is that it's all your choice. Do you want the same experience that you always had, or do you want more? If you want more, it's clear that you will have to pay more or be happy with what a single GPU can do. i.e. I could live with 40-50 fps with S3D and PhysX enabled, and I'm sure that a single Fermi card will be able to deliver that in 99% of games. I've been playing Stereo3D since Nvidia introduced their drivers* and I can tell you that you don't need more fps than you would need without it; you do need more Hz in the monitor, but that's all.

*Way back in 2002? 2004? I actually don't remember when it was, but a long time ago for sure. I stopped playing Stereo3D when games started using massive post-processing effects that broke the 3D and introduced a lot of ghosting. Also, post effects were always rendered in front instead of at the depth they should be, etc. Stereo3D has always been the reason for my preference for Nvidia cards in the past, and one of the reasons I still use a CRT. I don't like 3D Vision too much, on the other hand, because it has to be used with their glasses and LCD monitors AFAIK, although I'm not sure.
 

Benetanegia
Now here's where my question comes in...... If that's the case, what do you think would happen? If these cards are so good that they drove their only competition almost to defeat, wouldn't that also be like stabbing themselves in the foot?

Or let's say that they don't lower their prices to keep up, and an SLI GTX380 setup would then be in the 1200 to 1500 dollar range. Then they would still be shooting themselves in the foot.

I am really interested to hear your opinion of what you think Nvidia is going to do with all this power. And what effect do you think it would have on us, the consumers?

For Ati: it means they would have to make their products better. By improving them, by making them cheaper... whatever it takes. They would take a hit maybe, but AMD is not going anywhere. Their tech is too good to disappear. In the very worst case they would be bought by another company, hopefully one with better direction and more money.

For Nvidia: they will probably price them relatively low. They are not after just the gaming crowd, they are after everyone. Only 20% of GPUs sold are gaming cards, if that. With GPGPU, Nvidia's objective has always been to make the other 80% of people buy one of their cards. At $600-800 they are not going to do that. They will probably sell the cards for $400 and $500 respectively, even if they have to give up some consumer GPU profit. Something that I don't think will happen anyway: bad 40nm yields aside, Fermi cards are cheaper to produce than GT200 cards.

For us: Fermi being that fast is only good news; it means faster GPUs at lower prices. Unless you care about the companies, what happens to them, and what they have to do to survive, shouldn't matter to you. And remember that when they had the opportunity, they agreed to price fixing, so I couldn't care less what happens to them. I can sleep easy knowing that neither of them is going anywhere; they own too much IP to disappear.
 
Joined
Apr 30, 2008
Messages
4,879 (0.82/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 7800X3D 4.2Ghz - 5Ghz CPU
Motherboard MSI B650I Edge Wifi ITX Motherboard
Cooling CM 280mm AIO + 2x 120mm Slim fans
Memory Kingston Fury 32GB 6000Mhz
Video Card(s) ASUS RTX 4070 Super 12GB OC
Storage Samsung 980 Pro 2TB + WD 2TB 2.5in HDD
Display(s) Sony 4K Bravia X85J 43Inch TV 120Hz
Case CM NR200P Max TG ITX Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply CoolerMaster V850 SFX Gold 850W PSU
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 10 Home 64bit
Benchmark Scores Don't do them anymore.
TRUTH!

But... Wizz's reviews are top notch too...


I agree tho, the 5970 is dominant at higher res - the same should apply for the Fermi monster.


I am hoping for an improvement in IQ as well from Fermi - but that might be hard to do, as G200 IQ is very advanced already.


Yeah, I trust Wizz's reviews as well, probably the best-written ones, but I still like to watch vid reviews, I mean who doesn't, more interesting :toast:
 

Benetanegia
Yeah, I trust Wizz's reviews as well, probably the best-written ones, but I still like to watch vid reviews, I mean who doesn't, more interesting :toast:

Me, for the most part. I can read 10 times faster than the pace those vid reviews usually go at, and they are booooring to me. :D
 
Joined
Apr 30, 2008
Me, for the most part. I can read 10 times faster than the pace those vid reviews usually go at, and they are booooring to me. :D

Everyone's different I guess :toast:
 
Joined
Oct 6, 2009
For Ati: it means they would have to make their products better. By improving them, by making them cheaper... whatever it takes. They would take a hit maybe, but AMD is not going anywhere. Their tech is too good to disappear. In the very worst case they would be bought by another company, hopefully one with better direction and more money.

For Nvidia: they will probably price them relatively low. They are not after just the gaming crowd, they are after everyone. Only 20% of GPUs sold are gaming cards, if that. With GPGPU, Nvidia's objective has always been to make the other 80% of people buy one of their cards. At $600-800 they are not going to do that. They will probably sell the cards for $400 and $500 respectively, even if they have to give up some consumer GPU profit. Something that I don't think will happen anyway: bad 40nm yields aside, Fermi cards are cheaper to produce than GT200 cards.

For us: Fermi being that fast is only good news; it means faster GPUs at lower prices. Unless you care about the companies, what happens to them, and what they have to do to survive, shouldn't matter to you. And remember that when they had the opportunity, they agreed to price fixing, so I couldn't care less what happens to them. I can sleep easy knowing that neither of them is going anywhere; they own too much IP to disappear.

No, I agree with you by all means .... I don't think it would kill ATI. But the reason I asked that question is because...... I think if Nvidia did try to do so by making Fermi so ridiculously strong, it would only hurt themselves. That is why I asked the question. I did see, though, that you answered that part too. You are right .... I would hope Nvidia understands that 80% of people don't buy $700 to $900 GPUs, and it would be suicide for them to produce only really expensive cards.

With that said, that is why I doubt the cards they are producing will put them in a hole so deep it would cause financial ruin. Like, for instance, if a GTX 360 could beat a 5970 in some games and ATI could drop pricing more than Nvidia, why would they make their mainstream card have enthusiast-grade performance and possibly enthusiast pricing? In other words, if ATI cannot afford to keep their prices that high and compete in that area, that would leave Nvidia with the highest-performing card. So everyone would want one. They know this, and that usually equals overcharging.

Well, anyway, that's just a thought. It might be a stretch..... but I just figured I'd throw it out there.

While these charts do confirm your idea of their performance: if you look at your specs, a GTX 360 might be able to pull off a win over the 5870, but not a 5970 for sure. Those charts show that happening. Again, they have to be fake :D
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.49/day)
Location
Reaching your left retina.
So everyone would want one. They know this, and that usually equals overcharging.

That's the usual thing, but I believe this time they are going to try something different. My opinion is based on something they said in an Anandtech article, tbh. They asked Nvidia about pricing, and Nvidia said "You will be pleasantly surprised by the price when it launches". Maybe I'm giving too much credit to that quote, but I think that releasing a card that is tempting for everyone, not just gamers, is the only way to go for Nvidia, considering what they want to do and what their future strategy seems to be. GPGPU is not going to be anything if they don't manage to get some "mainstream"* people aboard. And it's in their interest to make GPGPU successful, because they are much, much faster there.

* By mainstream there I mean the kind of people who have mainstream graphics but powerful ($600-1000) CPUs, because they have high computing needs, although not visual ones. If buying a $400 card along with a $400 CPU is going to be better for them than buying an $800 CPU, they will buy one, but if it costs $600 they will not. Looking at the future, it's better for Nvidia to sell the cards at lower margins and capitalize on HPC sales.

While these charts do confirm your idea of their performance: if you look at your specs, a GTX 360 might be able to pull off a win over the 5870, but not a 5970 for sure. Those charts show that happening. Again, they have to be fake :D

The only game where it's faster is Far Cry 2, and FC2 with 8x AA has always been better on Nvidia hardware. That result isn't particularly interesting unless you take that into consideration. Look at the others; it will probably be something like that.
 

Bo_Fox

New Member
Joined
May 29, 2009
Messages
480 (0.09/day)
Location
Barack Hussein Obama-Biden's Nation
System Name Flame Vortec Fatal1ty (rig1), UV Tourmaline Confexia (rig2)
Processor 2 x Core i7's 4+Gigahertzzies
Motherboard BL00DR4G3 and DFI UT-X58 T3eH8
Cooling Thermalright IFX-14 (better than TRUE) 2x push-push, Customized TT Big Typhoon
Memory 6GB OCZ DDR3-1600 CAS7-7-7-1T, 6GB for 2nd rig
Video Card(s) 8800GTX for "free" S3D (mtbs3d.com), 4870 1GB, HDTV Wonder (DRM-free)
Storage WD RE3 1TB, Caviar Black 1TB 7.2k, 500GB 7.2k, Raptor X 10k
Display(s) Sony GDM-FW900 24" CRT oc'ed to 2560x1600@68Hz, Dell 2405FPW 24" PVA (HDCP-free)
Case custom gutted-out painted black case, silver UV case, lots of aesthetics-souped stuff
Audio Device(s) Sonar X-Fi MB, Bernstein audio riser.. what??
Power Supply OCZ Fatal1ty 700W, Iceberg 680W, Fortron Booster X3 300W for GPU
Software 2 partitions WinXP-32 on 2 drives per rig, 2 of Vista64 on 2 drives per rig
Benchmark Scores 5.9 Vista Experience Index... yay!!! What??? :)
@ Cadaveca, although Fermi will be a powerhouse that could do all of the features you desire (S3D @ 120Hz, 1920x1200, etc.), the problem is that we do not have an LCD monitor that supports both 1920x1200 and 120Hz yet!

Yes, I know this is ridiculous. We just have to keep waiting years and years for this technology to be released. I would still like a true 3D monitor that does not require the use of shutter glasses. There will always be ghosting with shutter glasses, due to the fact that response times and the "lag" in sync times are more than a few milliseconds. LCD is still a terrible technology when it comes to input lag.

For now, a GTX 380 will be a little better than a GTX 295 plus DX11 features. That's all I expect out of it really.

I'm excited about a powerful single-GPU card for the following reasons:

The ability to do true triple buffering (by forcing it on with D3DOverrider). Even if some games provide support for triple buffering in the menu settings, SLI/Crossfire will still not allow true triple buffering. So if the frame rate is not always above the refresh rate (say, 120Hz or 60Hz), then you'd have to choose between the extremely annoying frame-rate "halving" (when it suddenly drops down to half the refresh rate, sometimes frequently) or disabling Vsync altogether, with the side effect of screen tearing (which is horrible on 60Hz LCD monitors).

Another reason is to avoid microstuttering whenever AFR is being used (proven to still exist in Dirt 2, where 50fps can actually look like 30fps). AFR also introduces some input lag (similar to the lag that triple buffering causes).
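For what it's worth, that "50fps looks like 30fps" complaint is just frame-pacing arithmetic. A small illustrative sketch with made-up alternating frame times (not measured Dirt 2 data):

```python
# Hypothetical AFR frame-pacing pattern: two GPUs alternate frames, but the
# gaps between finished frames come out uneven. Numbers are illustrative only.
frame_times_ms = [6, 34] * 30   # alternating short/long gaps, 60 frames total

# An fps counter averages the gaps; perceived smoothness tracks the long gap.
avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000 / max(frame_times_ms)
print(f"fps counter: {avg_fps:.0f}, perceived: ~{worst_fps:.0f}")  # 50 vs ~29
```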

That's why we gamers so desperately need 120Hz monitors, even if most of us do not know what we are missing. We can disable Vsync for fast-paced games and enjoy zero lag with much less noticeable screen tearing than at 60Hz, or enable Vsync with or without triple buffering. If it's a demanding game like Stalker: Clear Sky, and we cannot force triple buffering due to SLI/Crossfire, then having the frame rate lock to fractions of 120Hz is much better than fractions of 60Hz (dropping down to intervals of 1/2, 1/3, 1/4, and so forth).
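To make those fractions concrete, here is the integer-division math behind the vsync steps at the two refresh rates discussed above:

```python
# Double-buffered vsync locks the frame rate to integer fractions of the
# refresh rate, so a missed refresh drops you a whole step down.
for hz in (60, 120):
    steps = [hz // n for n in range(1, 5)]  # refresh/1, /2, /3, /4
    print(f"{hz} Hz vsync steps: {steps} fps")
# 60 Hz falls to 30/20/15 fps; 120 Hz falls to the far gentler 60/40/30 fps.
```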

Another caveat about SLI/Crossfire is that not all of the games support it. You might have to wait 2 months for there to be proper driver support for SLI/Crossfire.

All in all, I'm glad that NV is making a powerhouse single GPU, even if it takes 3 re-spins. It's better to have it late than early and buggy or prone to early death. There are still some people like me who appreciate having a single powerhouse GPU rather than multi-GPU solutions.
 

cadaveca
I don't agree. Just because they are offering some extras doesn't mean a single GPU from them has to be able to do everything. When you buy a GPU, you buy a GPU and shouldn't expect anything extra. They DO offer extras, and that's nothing but good. From there, it's up to you to decide if the performance drop from S3D or PhysX is worth it for you, or if it's worth buying a second card. It's the same with Ati: they offer Eyefinity and that's cool, but you still have to buy 3 monitors, they won't give them to you, and you still need more than one GPU if you truly want to play the latest DX11 games on 3 monitors at decent resolution and fps. (I only mention Ati because that's the only other GPU company to compare with.)

What I mean is that it's all your choice. Do you want the same experience that you always had, or do you want more? If you want more, it's clear that you will have to pay more or be happy with what a single GPU can do. i.e. I could live with 40-50 fps with S3D and PhysX enabled, and I'm sure that a single Fermi card will be able to deliver that in 99% of games. I've been playing Stereo3D since Nvidia introduced their drivers* and I can tell you that you don't need more fps than you would need without it; you do need more Hz in the monitor, but that's all.

*Way back in 2002? 2004? I actually don't remember when it was, but a long time ago for sure. I stopped playing Stereo3D when games started using massive post-processing effects that broke the 3D and introduced a lot of ghosting. Also, post effects were always rendered in front instead of at the depth they should be, etc. Stereo3D has always been the reason for my preference for Nvidia cards in the past, and one of the reasons I still use a CRT. I don't like 3D Vision too much, on the other hand, because it has to be used with their glasses and LCD monitors AFAIK, although I'm not sure.

I agree with ya 110%. I'm extra sensitive to FPS, and for me, good gaming is 120FPS. Less, and I tend to get headaches. And truly, I don't think expecting that in a single card is realistic either...that's kinda my point...Fermi isn't just a single card, it's a complete platform, amd with SLi added in as a feature, it's GOTTA take 2 cards or more. That's perfectly fine to me. We know what Fermi is already...the details are out. Performance isn't a question...merely drivers are left, as single-card performance isn't the full story, to me. And there's no doubting they can pull it off.

And while Eyefinity takes 3 monitors, the end cost is the same as nV's S3D (thanks for that abbreviation!), so that means little to me. Heck, I've already got the monitors. But like I said...it's not ready, so to me, even as an ATI fanboy, ATI has failed to successfully launch Eyefinity and the 5800 series. Old news, and what ATI does really has no effect on Fermi, or vice-versa.

Take a look at my sig...the first scores are the best of ATI's DX9 series, the second is DX10. Either company already meets, or exceeds, the expectations set by those platforms. But here we are entering a new platform, DX11...they've got time to kill before release, time to renew, and, knowing nV and its re-labelling, even a bit of recycling.

I guess part of the pressure on nV is due to them stopping production of GT200 cards, but I think they've managed this properly, so that stores aren't left holding old stock.
That's the usual thing, but I believe this time they are going to try something different. My opinion is based on something they said in an Anandtech article, tbh. They asked Nvidia about pricing and Nvidia said "You will be pleasantly surprised by the price when it launches". Maybe I'm giving too much credit to that quote, but I think that releasing a card that is tempting for everyone, not just gamers, is the only way to go for Nvidia, considering what they want to do and what their future strategy seems to be. GPGPU is not going to be anything if they don't manage to get some "mainstream"* people aboard. And it's in their interest to make GPGPU successful, because they are much, much faster there.

* By "mainstream" there I mean the kind of people that have mainstream graphics but powerful ($600-1000) CPUs, because they have high computing necessities, although not visual ones. If buying a $400 card along with a $400 CPU is going to be better for them than buying an $800 CPU, they will buy one, but if it costs $600 they will not. Looking at the future, it's better for Nvidia to sell the cards at lower margins and capitalize on HPC sales.
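The buying decision being described boils down to simple arithmetic. Here's a rough sketch using the hypothetical prices from the post above (none of these are real SKUs or real prices):

```python
# Illustrative arithmetic for the "CPU + compute GPU vs. big CPU"
# decision discussed above. All prices are hypothetical examples.

def build_cost(cpu_price: float, gpu_price: float = 0.0) -> float:
    """Total cost of a CPU build, with an optional compute GPU added."""
    return cpu_price + gpu_price

gpgpu_build = build_cost(cpu_price=400, gpu_price=400)  # mainstream CPU + GPU
cpu_only = build_cost(cpu_price=800)                    # high-end CPU alone

# The GPGPU route only tempts this buyer if it doesn't cost more:
print(gpgpu_build <= cpu_only)  # True at a $400 GPU, False at $600
```

At a $400 GPU the two builds come out even, which is the whole argument for lower margins: above that, the buyer just gets the bigger CPU instead.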

See, I get something different from that...I hear nV saying, when mentioning cost..."Hey, you don't need a super-powerful CPU...we've got you covered. Buy that cheap CPU, and spend that money on graphics. We can crunch the same numbers they can, and we can do it faster." I see them saying you only need a $200 CPU (hello, Phenom II C3), and here, take our $600 GPU for your power-computing needs.

That's why AMD is a consideration....their cost has always fit well with AMD CPUs. But if you've got an AMD CPU, why not buy an AMD video card too? What do they have that makes them better?
 

Benetanegia

See, I get something different from that...I hear nV saying, when mentioning cost..."Hey, you don't need a super-powerful CPU...we've got you covered. Buy that cheap CPU, and spend that money on graphics. We can crunch the same numbers they can, and we can do it faster." I see them saying you only need a $200 CPU (hello, Phenom II C3), and here, take our $600 GPU for your power-computing needs.

But that wouldn't work, IMO. There's not enough software that uses GPGPU, because there's not enough hardware out yet, and there never will be unless someone takes the risk and makes the first move by releasing it in high numbers. Someone has to make the first move, and that has to be Nvidia, because of the strategy they chose for Fermi and GPGPU; they have commented on their commitment to enter the "mainstream" (same as above) market with their GPU. No need to say that's never going to happen at a high price. And now they can compete on price too, better than in the past, so they should continue. Not only to win the gaming market, but to enter the new one. They have to do the equivalent of giving out promotional DVDs or they will never succeed. But it's just my opinion.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.58/day)
I'd only be concerned if nVidia were laying people off. I don't think they have at all, even through this "recession". As such, they literally have the staff to release the software to support the HPC crowd...in fact, I think they had that conference a couple months ago, just to say exactly that.

We've been told what Fermi is. We've seen final products (although they don't appear to be similar to the demo card). We've got Jen-Hsun up on stage saying he's committed to HPC, and that's the direction nV is headed.

In the end, they are confined in just the same way that ATi has been with the 5-series...die size and power consumption of the given process. We know that these constraints are causing issues, due to problems @ TSMC.

We know they are "late". But really, they always have been late. So that's nothing new, nor is it important, although ATI marketing would have most believe differently.

We know their high-end single-GPU card will debut @ $600. The dual-GPU card will then come and fill that price point, and the single-GPU card should drop to $400 or so.

We know they'll bring DX11, and all those other things.

All they have left to do is start releasing software...and given the production delays, they have even more time to work on it.

It's only really development houses that are affected by nV's slow release...only companies working on DX11 software, and needing hardware to test on, should really care about what nV is doing. These guys are buying up ATI hardware to test on, and it's too bad that nV missed the boat there...but at the same time, those sales will still happen...

Fanboys will be fanboys, and they'll still buy an nV gpu when it comes out. Nothing has changed, nor will it change, any time soon.

After watching product releases for the past 8 years, everything seems normal.
 

Bo_Fox

New Member
Joined
May 29, 2009
Messages
480 (0.09/day)
Location
Barack Hussein Obama-Biden's Nation
System Name Flame Vortec Fatal1ty (rig1), UV Tourmaline Confexia (rig2)
Processor 2 x Core i7's 4+Gigahertzzies
Motherboard BL00DR4G3 and DFI UT-X58 T3eH8
Cooling Thermalright IFX-14 (better than TRUE) 2x push-push, Customized TT Big Typhoon
Memory 6GB OCZ DDR3-1600 CAS7-7-7-1T, 6GB for 2nd rig
Video Card(s) 8800GTX for "free" S3D (mtbs3d.com), 4870 1GB, HDTV Wonder (DRM-free)
Storage WD RE3 1TB, Caviar Black 1TB 7.2k, 500GB 7.2k, Raptor X 10k
Display(s) Sony GDM-FW900 24" CRT oc'ed to 2560x1600@68Hz, Dell 2405FPW 24" PVA (HDCP-free)
Case custom gutted-out painted black case, silver UV case, lots of aesthetics-souped stuff
Audio Device(s) Sonar X-Fi MB, Bernstein audio riser.. what??
Power Supply OCZ Fatal1ty 700W, Iceberg 680W, Fortron Booster X3 300W for GPU
Software 2 partitions WinXP-32 on 2 drives per rig, 2 of Vista64 on 2 drives per rig
Benchmark Scores 5.9 Vista Experience Index... yay!!! What??? :)
I'm extra sensitive to FPS, and for me, good gaming is 120FPS. Less, and I tend to get headaches.

No, you do not get headaches from 85 fps instead of 120 fps. If it's a sustained 85fps instead of 120fps (minimum 80fps or so), you're probably getting headaches from that cocaine drink you're having rather than the frame rates. If you're using an LCD monitor at 60Hz, you're only seeing 60fps anyway, so brighten up! :p

I can pretty much tell the difference between 60 and 80 fps, hardly between 70 and 90 fps, and not really between 90 and 120 fps (well, maybe barely, but nope).

At least Fermi will be able to hit your 120fps goal in a lot more games.
 

cadaveca

Of course, you realize I'm talking about S3D here, right? 120FPS of S3D = 60FPS per eye...I definitely notice.

Normal gaming, etc, yeah, I'm good at about 90FPS or so.
 
Joined
Sep 7, 2008
Messages
355 (0.06/day)
Location
Canberra Australia
Processor i5 750 4.2ghz 1.4V
Motherboard Gigabyte P55-UD4P
Cooling Corsair H70
Memory 2X2G Gskill @2140 8-11-8-28-1T
Video Card(s) 2xGainward Phantom3 570@860/4200
Storage 2xWD 500G AAKS Raid0 + 2XWD 1T Green in Raid0
Display(s) 24" HP w2408h+23" HP w2338h
Case SilverStone Raven2E-B
Audio Device(s) onboard for now
Power Supply Antec 1000W Truepower quattro
Software Win7 Ultimate 64bit
Of course, you realize I'm talking about S3D here, right? 120FPS of S3D = 60FPS per eye...I definitely notice.

Normal gaming, etc, yeah, I'm good at about 90FPS or so.

Err.. so you haven't ever experienced it and already know you would get a headache from it? I can tell you it's different.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
7,997 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I think surely enough on a 60Hz monitor we won't SEE much (if any) difference, but for me, the difference has always been in how responsive the game FEELS, especially in first-person shooters. 120fps is a fantastic threshold for a super-responsive mouse.
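The responsiveness point is really about frame time: doubling the fps halves the gap between rendered frames, which you can feel in the mouse even when a 60Hz panel can't show every frame. A quick illustrative calculation (not tied to any particular game or display):

```python
# Frame time shrinks as fps rises, which is what makes high fps
# FEEL more responsive even on a monitor that can't display it all.

def frame_time_ms(fps: float) -> float:
    """Milliseconds between rendered frames at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 90, 120):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 120 fps -> 8.3 ms, half the 16.7 ms gap you get at 60 fps
```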
 

cadaveca

Err.. so you haven't ever experienced it and already know you would get a headache from it? I can tell you it's different.

Actually, I have experienced it, and it made me sick. Similar to motion sickness, which I still get from a few games, no matter the FPS. I feel confident, though, that 120FPS will fix that for S3D. Maybe I'm wrong, though, and S3D isn't for me. I find it slightly comical that you guys think you know how it is for me...however, I'm glad you have/are enjoying it, as that bodes well for me in the future.

You know, I sit at home all day long. Videogames are my entertainment...how I escape from reality. And not just me, either; my wife and my oldest son both choose gaming over anything else, so we've got multiple console systems, and when it comes to computers, we have many systems and like to experience everything there is to offer. I own pretty much every game out there...almost every extra dollar I get goes towards gaming.

I'm merely relating my personal experience; you can take it as you will. I just got home from the hospital with my daughter, who has had holes drilled in her skull and some titanium placed. I'm in no mood to argue about what's true for me.

:toast:
 

Benetanegia

Actually, I have experienced it, and it made me sick. Similar to motion sickness, which I still get from a few games, no matter the FPS. I feel confident, though, that 120FPS will fix that for S3D. Maybe I'm wrong, though, and S3D isn't for me. I find it slightly comical that you guys think you know how it is for me...however, I'm glad you have/are enjoying it, as that bodes well for me in the future.

You know, I sit at home all day long. Videogames are my entertainment...how I escape from reality. And not just me, either; my wife and my oldest son both choose gaming over anything else, so we've got multiple console systems, and when it comes to computers, we have many systems and like to experience everything there is to offer. I own pretty much every game out there...almost every extra dollar I get goes towards gaming.

I'm merely relating my personal experience; you can take it as you will. I just got home from the hospital with my daughter, who has had holes drilled in her skull and some titanium placed. I'm in no mood to argue about what's true for me.

:toast:

The weird perspective alone is enough to make some people sick; that might be your case. Also, you have to train yourself a lot to be comfortable with S3D, starting with low divergence when you're new (barely feeling the 3D), etc. Everyone has to do that, and some people never get to feel right with shutter glasses. I hope the problem is the FPS, though; that way you'll get to enjoy it any time.

And I hope the best to your daughter mate. :toast:
 

Bo_Fox

Actually, I have experienced it, and it made me sick. Similar to motion sickness, which I still get from a few games, no matter the FPS. I feel confident, though, that 120FPS will fix that for S3D. Maybe I'm wrong, though, and S3D isn't for me. I find it slightly comical that you guys think you know how it is for me...however, I'm glad you have/are enjoying it, as that bodes well for me in the future.

You know, I sit at home all day long. Videogames are my entertainment...how I escape from reality. And not just me, either; my wife and my oldest son both choose gaming over anything else, so we've got multiple console systems, and when it comes to computers, we have many systems and like to experience everything there is to offer. I own pretty much every game out there...almost every extra dollar I get goes towards gaming.

I'm merely relating my personal experience; you can take it as you will. I just got home from the hospital with my daughter, who has had holes drilled in her skull and some titanium placed. I'm in no mood to argue about what's true for me.

:toast:

Oh man, I'm sorry. I thought you were talking about just having 120fps. Well, what we need for S3D is 120Hz, not 120fps. We can still enjoy a fluid game at 60fps with S3D, as long as the monitor is doing it at over 100Hz. Usually, I prefer to run my CRT at 140Hz to reduce flicker as much as possible (70Hz for each eye) when playing games in 3D.
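The Hz-per-eye arithmetic here can be sketched out. With alternating shutter glasses, each eye sees half the monitor's refresh rate (a simplified model, ignoring driver and glasses specifics):

```python
# Shutter glasses alternate frames between eyes, so each eye sees
# half the monitor's refresh rate. Per-eye flicker depends on the
# refresh rate, not on how many fps the game itself renders.

def per_eye_hz(monitor_hz: float) -> float:
    """Refresh rate each eye sees with alternating shutter glasses."""
    return monitor_hz / 2.0

for hz in (100, 120, 140):
    print(f"{hz} Hz monitor -> {per_eye_hz(hz):.0f} Hz per eye")
# A 140 Hz CRT gives 70 Hz per eye; a 120 Hz display gives 60 per eye.
```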

I bet that once 120Hz LCD monitors start becoming popular, Nvidia (or at least EVGA, etc..) might start bundling their cards with shutter glasses real soon.

I wish the best for your daughter. You're lucky to have a wife who enjoys playing games. Mine does not. Hope your daughter recovers ASAP and that your family prospers with happiness.
 

cadaveca

Oh man, I'm sorry. I thought you were talking about just having 120fps. Well, what we need for S3D is 120Hz, not 120fps. We can still enjoy a fluid game at 60fps with S3D, as long as the monitor is doing it at over 100Hz. Usually, I prefer to run my CRT at 140Hz to reduce flicker as much as possible (70Hz for each eye) when playing games in 3D.

I bet that once 120Hz LCD monitors start becoming popular, Nvidia (or at least EVGA, etc..) might start bundling their cards with shutter glasses real soon.

I wish the best for your daughter. You're lucky to have a wife who enjoys playing games. Mine does not. Hope your daughter recovers ASAP and that your family prospers with happiness.

120Hz = 120FPS, I thought, because the frames are "shifted", which gives the 3D perspective? The glasses literally alternate, one eye blocked while the other is open, so you only really see with one eye at a time?

And thanks for the well wishes...she'll be fine, but the experience is emotionally draining. This is her second of 4 surgeries, and none are that big of a deal, but I still worry, of course, as any parent would.

And yeah, I'm pretty lucky, having a wife that enjoys gaming as much as I do. The competitiveness is a good stress reliever for both of us, and it's easy to get involved in a game and forget life for a bit. But because she likes a lot of things I don't, I get the chance to see a lot of different games, and once the kids are thrown in, there's little we haven't played...I'm the one that gets stuck playing the games before the kids do, so we can be sure of the content they consume.

It also allows me to have multiple systems, and no problem with spending the money on new stuff when it comes out...but at the same time, I've definitely become highly opinionated when it comes to buying stuff, and it's always good fun to see new stuff come out, as it's more toys to play with for my entire family.


The fact that ATI and nVidia currently differ so greatly in their offerings means a lot to me. The more difference there is, the more entertainment I have. When I don't drive, all that extra cash that would have been spent on insurance or gas goes to gaming, so dumping $600 on some monitors, or a monitor and glasses, is no big deal to me...as long as it works. If it doesn't, then in the end, I say so. I mean...Eyefinity works, I guess, but it's not enough for me.

I don't want Fermi to bring the same disappointment. And that's why it matters to me so much...I may be logging hundreds of hours using it.
 
Joined
Oct 6, 2009
Messages
2,824 (0.52/day)
Location
Midwest USA
System Name My Gaming System
Processor Intel i7 4770k @ 4.4 Ghz
Motherboard Asus Maximus VI Impact (ITX)
Cooling Custom Full System Water cooling Loop
Memory G.Skill 1866 Mhz Sniper 8 Gb
Video Card(s) EVGA GTX 780 ti SC
Storage Samsung SSD EVO 120GB - Samsung SSD EVO 500GB
Display(s) ASUS W246H - 24" - widescreen TFT active matrix LCD
Case Bitfenix Prodigy
Power Supply Corsair AX760 Modular PSU
Software Windows 8.1 Home Primeum
A 120Hz monitor is what I am investing in next!!! Even the 5870 needs one! Let alone, probably, a Fermi. I think 120Hz will be the new standard soon! IMO :D
 