Monday, May 26th 2008
Next-gen NVIDIA GeForce Specs Unveiled, Part 2
Although it's a bit risky to post information from sources I'm not familiar with, not to mention that the site is German, it's worth a try. The guys over at Gamezoom (Google translated) reported yesterday that during NVIDIA's Editors Day, the same event where the F@H project for NVIDIA cards and the buyout of RayScale were announced, NVIDIA also unveiled the final specs for its soon-to-be-released GT200 cards. This information complements our previous Next-gen NVIDIA GeForce Specs Unveiled story:
Source:
Gamezoom
- GeForce GTX 280 will feature 602MHz/1296MHz/1107MHz core/shader/memory clocks, 240 stream processors, a 512-bit memory interface, and GDDR3 memory, as already mentioned.
- GeForce GTX 260 will come with 576MHz/999MHz/896MHz reference clock speeds, 192 stream processors, a 448-bit memory interface, and GDDR3 memory.
108 Comments on Next-gen NVIDIA GeForce Specs Unveiled, Part 2
Sure enough, the 4870 X2 might beat a GTX 280, but given these specs I'm fairly sure that both new NVIDIA cards will tear the ass out of a single 4870/4850.
sure.
Refresh rate has nothing to do with FPS: hometheater.about.com/od/televisionbasics/qt/framevsrefresh.htm
When you watch a movie at 30 FPS, do you see the individual frames?
It's the same principle whether you're playing a game or watching a movie: you don't need 200 FPS for it to run smoothly.
If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.
And don't even get me started on refresh rate, haha. I get a headache within literally two minutes at a low refresh rate (low being below 100, haha).
G200 = GeForce GTX 260 & 280
Too many cards are coming out...
36-60 FPS is perfectly fine for slower-paced games (like Splinter Cell, Thief, most RTS games, etc.), but for fast-paced games (like Doom 3 on Nightmare, etc.) it's unacceptable.
As for what people can visually see, it depends on the person and on your setup. Even here, I can usually tell when FPS starts dropping below 60, but the only game where I have an issue with tearing even over 60 FPS is Crysis, and vsync causes too hard of a performance hit to enable it with triple buffering. I've never experienced tearing in any other game, and multi-GPU setups are a ton more susceptible to it than single GPUs.
The higher the screen resolution you play at, the more noticeable tearing will be. Even if you have high FPS and a beefcake GPU to handle it, that's a ton more pixels the GPU has to render and keep pace with.
Your set refresh rate has a major impact on this if you're using an LCD: a 60Hz refresh on a native 75Hz LCD will lead to tearing or other artifacts.
But this all boils down to an individual's ability to perceive what's on screen. Everyone is different, and some are more sensitive to it than others.
Like others have mentioned, some games can be fine at a lower FPS. Crysis, for example, is perfectly playable above 30 FPS, and Test Drive Unlimited can run at 45-50 and be smooth as anything. BUT first-person shooters are a different story.
Sure enough, our eyes can't see the difference above 60Hz, but your hand can feel the difference in responsiveness at 100 FPS as opposed to 60 FPS.
UT3 is a great example of this. If you run the game stock, it's capped at 62 FPS, which is more than smooth, don't get me wrong, but when I set the max FPS to 150, I usually get 90-110, and there is a HUGE difference in how fast you can react, especially in such an intense and fast-paced game as UT3.
So the short answer is: maybe we can't SEE the difference, but we can FEEL the difference of faster gameplay.
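The "feel" argument above can be put in rough numbers: the frame time shrinks as the framerate rises, so your input gets picked up and shown sooner. A minimal sketch of that arithmetic (plain Python, no game APIs; the numbers only model frame delivery, not input polling or display lag):

```python
# Frame time at a given framerate: a new frame, and thus a fresh chance
# for your input to appear on screen, arrives once per interval.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# At 60 FPS each frame takes ~16.7 ms; at 100 FPS only 10 ms.
# Worst case your input waits a full frame before being rendered, so a
# higher cap alone trims several milliseconds off reaction-to-screen delay.
for fps in (60, 100, 150):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")
```

That ~6.7 ms gap between 60 and 100 FPS is small, but it recurs every frame, which is consistent with the commenter noticing faster reactions at an uncapped framerate.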
Thus, 75+ FPS is always what I aim for in those kinds of games, and to get 75+ FPS @ 1920x1200 with 4xAA and 16xAF in every game, your graphics card needs to be high end. I will absolutely be going for the new NVIDIA high end when it arrives, especially since it will sock it to the ever more cruddy-looking 48xx series.
All they've done is add 50% more shader units, 50% more texture units, and faster memory.
The NVIDIA cards have more ROPs, more memory, more bus width, more shaders, more texture units, and a refined version of a design we KNOW works. NVIDIA has literally made this card beefier, as opposed to trying to fill in its shortcomings.
NV FTW.
But I agree, a higher framerate does play better (that's how I got good at CS: 120Hz with vsync on). But I also don't know how anyone plays without vsync. Sure, you can take a massive performance penalty (up to 50% if your card can't keep up with the refresh rate), but to me that's better than the entire screen tearing and disrupting my view (and pissing me off, haha). Of course, my refresh is 100, so my minimum would be 50 FPS and not 30. :D
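The 50% penalty mentioned here comes from how vsync with classic double buffering behaves: a finished frame can only be shown on a refresh tick, so if the GPU misses one tick, the framerate snaps down to the next integer divisor of the refresh rate (100 → 50 → 33.3 → ...). A rough sketch of that quantization, assuming plain double-buffered vsync (triple buffering avoids much of this):

```python
import math

# With double-buffered vsync, if rendering one frame spans k refresh
# intervals, the effective framerate is refresh_hz / k, i.e. the
# framerate snaps to the nearest divisor at or below the raw rate.
def vsynced_fps(raw_fps: float, refresh_hz: float) -> float:
    if raw_fps >= refresh_hz:
        return refresh_hz                  # capped at the refresh rate
    k = math.ceil(refresh_hz / raw_fps)    # refresh ticks per frame
    return refresh_hz / k

# On a 100Hz screen, just missing the cap (99 FPS raw) snaps to 50 FPS:
print(vsynced_fps(99, 100))   # 50.0
print(vsynced_fps(120, 100))  # 100.0
```

This matches the commenter's point: on a 100Hz display the first step down is 50 FPS, whereas on a 60Hz display it would be 30 FPS.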
In any case, it's each to their own, and the price is always right for both products, so there's no dumb choice.
Either there's a marketing stunt going on or these are very bad details. *shrugs*