Monday, December 14th 2009
NVIDIA Pitches GeForce GTX 300 Series to Clinch Performance Crown
NVIDIA's latest DirectX 11 compliant GPU architecture, codenamed "Fermi," is getting its first consumer desktop graphics implementation in the form of the GeForce GTX 300 series. The nomenclature went from speculation to near-certainty with a set of company slides leaked to the media, carrying GeForce GTX 300 series names for the two products expected to come out first: GeForce GTX 380 and GeForce GTX 360. The three slides in the public domain as of now cover three specific game benchmarks from the company's internal tests, in which the two graphics cards are pitted against AMD's Radeon HD 5870 and Radeon HD 5970.
Tests include Resident Evil 5 (HQ settings, 1920x1200, 8x AA, DX10), STALKER Clear Sky (Extreme quality, no AA, 1920x1200, DX10), and Far Cry 2 (Ultra High Quality, 1920x1200, 8x AA, DX10). The GeForce GTX 295 and GTX 285 are also included for reference, to show how NVIDIA is positioning the two cards against the Radeon HD 5000 series GPUs, whose figures are already out. In all three tests the GTX 380 emerged on top, with the GTX 360 performing close to the HD 5970. A point to note, however, is that the tests were run at 1920x1200, and reviews have shown that the higher-end HD 5000 series GPUs, particularly the HD 5970, are made for resolutions higher than that. AA was also disabled in STALKER Clear Sky. NVIDIA's GeForce GTX 300 series is due out in Q1 2010.
Update (12/15): NVIDIA's Director of Public Relations EMEAI told us that these slides are fake, but also said that "when it's ready it's going to be awesome".
Source:
Guru3D
189 Comments on NVIDIA Pitches GeForce GTX 300 Series to Clinch Performance Crown
:D
I would have to agree that the chances of those posted "benchmark" screenshots being genuine are very slim.
Looks like nvidia spent too much money on adverts and bribing ....
Typically happens, though: one leads for a few series, then the other overtakes them for a few series . . .
Now, if the GTX 380 is a dual-GPU solution (like nVidia have been claiming it will be), then I could see it topping the 5970 . . . but then again, if it scales that poorly from a GTX 360, that doesn't bode well for the 300 series as a whole . . . at least compared to the overall gains of dual-GPU setups from the 5000 series.
I guess we'll just have to see.
but they've been throwing out rumors of releasing a dual-GPU board at the same time the first single-GPU board launches.
If that is the case, then I'd have to fathom the 380 as being a dual-GPU board.
But . . . this is simply (un)founded speculation . . . until it's on the shelves, we can't know for sure what nVidia is up to. Either way, they're lagging seriously behind with their new series.
I mean, I don't even rely on synthetic benchmarks like Futuremark's products . . . I run my games at my settings on my machine, and I probably wouldn't see bad things on either side. The reason NVIDIA has that top FPS is that FPS is all they compete on. Look at ATI cards, for example: they have their own onboard audio, and they're arguably the better option for TV and multimedia because of AVIVO (I never used it myself, actually). Most importantly, over the years the red side didn't crash or fail as much and had better drivers . . . at least that's how my friends talked, who have owned many GPUs from both sides since as far back as 1998. (But I agree the Catalyst drivers from 9.1 until 9.11, all those in between including 9.1, were really crappy.)
Not to mention image quality has been praised in the red camp. I agree on this one because I can see it myself: ATI's shadows really stand out and you can clearly see the difference. This is something I'd gladly trade a few FPS for.
Now, realizing the source is Guru3D, I can safely say the chance those pics are genuine went from little to zero. Still, if there is any little truth to them, the GTX 380 has to be dual-GPU. But that doesn't tie up with the 360, because then the 360 would have to be dual-GPU too for the numbers to fit.
On the other hand, what if these new GPUs really do have a truly new design, some new super-optimizing code (CUDA, for example) that's the key to such fast games? . . . Again, I don't see raw speed as the main decider anymore, because both camps have GPUs at that price that can run almost every game fast enough. Crysis is an exception (Crysis is more a showcase of the engine than a game, but it actually has some of that talented spirit I like).
Even still . . . when your card ousts the competition by an average of 5 FPS, you can claim the performance title . . . and with that come all the raging hard-ons for owning the card that's "performance king," no matter the cost. That's a big reason why nVidia has been able to push such insane pricing for their hardware the last 3-5 years. Personally, I can afford it, but I won't buy nVidia products (for numerous reasons) . . . the average user can't, but if they want "the best of the best of the best," they're willing to fork out the dough.
My biggest wish, for the gaming/hardware market as a whole, would be for ATI to finally get back to a financially sound position and start pushing their ATI Game! program a lot more (which they've rather neglected the last few years). There needs to be competition in the gaming market against TWIMTBP, and ATI just can't afford to do so ATM. I can't really fathom nVidia doing anything "truly new" — they've been milking the same designs for the last few years . . . enough so that both ATI and Intel have made numerous comments on their architecture.
I've been starting to wonder if we're at a point where nVidia are simply "tapped out," and can't take that architecture any further . . . forcing them to go back to R&D . . . if that's the case, then who knows what the results would be? They could be faster, or slower . . . and it would take much longer to get the product to market than was originally thought (although, Fermi is starting to fit this bill quite nicely).
ATI can tell you firsthand, though: redesigning a GPU from scratch, or making major changes to existing architecture, will lead you into a lot of pitfalls.
I myself go from A to B and back daily, which leaves the machine idle for about 12-14 hours a day.
As to the comment someone made about "60 FPS is the best an LCD can do, who cares":
did it occur to you that people buy these cards and play games on them *Drumroll* in the future, when games run slower? 10 more FPS now = 5 more FPS then, and that could be a deal breaker.
"The way it's meant to be played" ....indeed! :D
Idle power consumption is important. Load, I couldn't care less about.