# NVIDIA Pitches GeForce GTX 300 Series to Clinch Performance Crown



## btarunr (Dec 14, 2009)

NVIDIA's latest DirectX 11 compliant GPU architecture, codenamed "Fermi," is getting its first consumer (desktop) graphics implementation in the form of the GeForce GTX 300 series. The nomenclature has gone from presumed to all but confirmed, with a set of company slides leaked to the media carrying GeForce GTX 300 series names for the two products expected to come out first: the GeForce GTX 380 and GeForce GTX 360. The three slides in the public domain as of now cover three specific game benchmarks in which the two graphics cards are pitted against AMD's Radeon HD 5870 and Radeon HD 5970 as part of the company's internal tests. 

Tests include Resident Evil 5 (HQ settings, 1920x1200, 8x AA, DX10), STALKER Clear Sky (Extreme quality, no AA, 1920x1200, DX10), and Far Cry 2 (Ultra High Quality, 1920x1200, 8x AA, DX10). Other GPUs include the GeForce GTX 295 and GTX 285 for reference, just so you know how NVIDIA is pitting the two against the Radeon HD 5000 GPUs, given that those figures are already out. In all three tests the GTX 380 emerged on top, with the GTX 360 performing close to the HD 5970. A point to note, however, is that the tests were run at 1920x1200, and tests have shown that the higher-end HD 5000 series GPUs, particularly the HD 5970, are geared toward resolutions higher than 1920x1200. AA was also disabled in STALKER Clear Sky. NVIDIA's GeForce GTX 300 series will be out in Q1 2010.

*Update (12/15):* NVIDIA's Director of Public Relations EMEAI told us that these slides are fake, but added that "when it's ready it's going to be awesome".






*View at TechPowerUp Main Site*


----------



## DrunkenMafia (Dec 14, 2009)

damn, looks like they have a good product there.  Will this be the beginning of yet another GFX king cycle?? 

Will wait for some 3rd party tests to see what they find.


----------



## slyfox2151 (Dec 14, 2009)

Wait, so are these benchmarks confirmed now?

I would assume they haven't been and are still possibly fake.


----------



## btarunr (Dec 14, 2009)

slyfox2151 said:


> wait so are these benchmarks confirmed now?
> 
> i would assume they havnt been and are still possibly fake.



They're as 'confirmed' as similar slides from AMD were, ahead of product launches.


----------



## qwerty_lesh (Dec 14, 2009)

Internal tests...
They won't be anything like real-world results, IMO.

Good on them if they do achieve this performance level, although I don't believe NVIDIA will.


----------



## sapetto (Dec 14, 2009)

Ha again?! Lol ATI also showed similar slides and they were not very correct (in the slides 5870 outperforms GTX295).


----------



## btarunr (Dec 14, 2009)

sapetto said:


> Ha again?! Lol ATI also showed similar slides and they were not very correct (in the slides 5870 outperforms GTX295).



IIRC, there was no GTX 295 in HD 5870 slides. That was GTX 285.


----------



## laszlo (Dec 14, 2009)

I didn't expect almost 50% more from Fermi, but the price of the GTX 380 will be almost double that of the HD 5870; only the GTX 360 will be placed under it, I think at around $300.


----------



## sapetto (Dec 14, 2009)

btarunr said:


> IIRC, there was no GTX 295 in HD 5870 slides. That was GTX 285.


Here is an example


----------



## laszlo (Dec 14, 2009)

sapetto said:


> Here is an example
> http://i25.tinypic.com/2emg9zs.jpg



If you check your slide, the Far Cry 2 results are similar to the posted ones, so I think they're real.


----------



## Tatty_One (Dec 14, 2009)

It seems pretty believable to me, although NVIDIA will have cherry-picked the games of course, so we have not seen the complete picture yet. If the 380 is faster (across the board) than the 5970 then I would expect it to be 20% more expensive, and if the 360 is quicker than the 5870 then I would expect its pricing to be around midway between the 5870 and the 5970, and that's gonna be pricey! Of course AMD will bring their prices down a bit and then we all have to decide if it's "bang for buck" or sheer power we want... it just all goes full circle really... again... and again... and again.


----------



## toyo (Dec 14, 2009)

If this is really true, I must ask:

GTX 360, what's your price???

And... for AMD... it was sweet while it lasted...

But... this is nothing more than the fake empty shell Jen-Hsun Huang pranced around with, pretending it was a whole Fermi. Most probably just a desperate attempt to ruin holiday sales for the red team...


----------



## shevanel (Dec 14, 2009)

can't wait to see how much money it's going to take to own one.


----------



## toyo (Dec 14, 2009)

And I wonder what the game devs will do with so much power now available...


----------



## laszlo (Dec 14, 2009)

toyo said:


> And I wonder what the game devs will do with so much power now available...




Nothing, because the majority of gamers don't have high-end cards, and developers have to sell their games to that section of the market, which is the bigger one.


----------



## shevanel (Dec 14, 2009)

i also wanna know how hot their cards are running.

heat output and price are my two biggest issues.


----------



## Amok (Dec 14, 2009)

How can you not see these are fakes? They were posted on forums by some dude named "Succesfull troll". The irony. :)

First of all, PR slides released by NVIDIA or ATI only show performance percentages; they never show FPS numbers.

Check the slide below to convince yourselves






Second, this is a real review, from xbitlabs.






This is the slide






Both the review slide and the so-called "PR slide" use no AA, DX10, highest settings, etc. 

It's pretty obvious the GTX 285/295/Radeon 5870/5970 scores are made up; they do not perform that well on high-end rigs at those settings.

These slides are good fakes, but fakes nonetheless.

It would be great to see Fermi being such a good performer, but we have to be realistic: these are fakes.

It's been already discussed here quite a bit.

http://www.xtremesystems.org/forums/showthread.php?t=240958


----------



## iamverysmart (Dec 14, 2009)

You could just ignore the ATI benchmarks and only look at the NVIDIA cards.
The GTX 380 should perform at a bit more than double a GTX 285?
512 vs 240 shaders.
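As a back-of-the-envelope check of that ratio (the 512 figure is the rumored Fermi shader count, not a confirmed spec):

```python
# Naive scaling estimate from shader counts alone; an upper bound at best,
# since clocks, memory bandwidth, and architecture changes all matter too.
gtx285_shaders = 240   # known GT200b shader count
gtx380_shaders = 512   # rumored Fermi shader count (unconfirmed)

ratio = gtx380_shaders / gtx285_shaders
print(round(ratio, 2))  # 2.13
```

So even under ideal scaling, "a bit more than double" is the ceiling, not a guarantee.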


----------



## Amok (Dec 14, 2009)

but the GTX 285 does not score a 44 fps average on those settings in that game. Not even a GTX 295 does. The slides have fake results. 

And  once again, Nvidia always shows percentages of performance in their slides, not actual numbers.


----------



## iamverysmart (Dec 14, 2009)

Maybe they had a custom benchmark that had them facing a wall.


----------



## laszlo (Dec 14, 2009)

Amok, the STALKER slide is correct, see here: http://www.tomshardware.com/reviews/radeon-hd-5870,2422-12.html

Same settings, same results...


----------



## Tatty_One (Dec 14, 2009)

Amok said:


> but the GTX 285 does not score a 44 fps average on those settings in that game. Not even a GTX 295 does. The slides have fake results.
> 
> And  once again, Nvidia always shows percentages of performance in their slides, not actual numbers.



Not so sure; the results here resemble reviews/findings from this site, taking into account the increased AA here. For example, look at Far Cry 2 in this thread and its settings at 19XX..... then look at the TPU review of the 5870, which also shows the GTX 285....

http://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/14.html


----------



## Amok (Dec 14, 2009)

Look at these slides, from the THG forums.












THG never did a GTX 300 series review. I can make such slides as well, would you believe me?


Still think they are real?


----------



## leonard_222003 (Dec 14, 2009)

You people know what will happen, don't you?
AMD/ATI will reduce the price ridiculously low if they lose on performance too much, NVIDIA will start screaming at the little people (integrators) to reduce costs, and the little people will start making both or defect to the enemy.


----------



## crow1001 (Dec 14, 2009)

FAKE!!

Take em down, you all look like suckers.


----------



## Amok (Dec 14, 2009)

And here is the original thread,

http://www.tomshardware.co.uk/forum/page-276100_15_0.html

Posted by a user named "Succesfull troll"

Come on, obvious fakes, and so many people took the bait...


----------



## csendesmark (Dec 14, 2009)

Amok said:


> And here is the original thread,
> 
> http://www.tomshardware.co.uk/forum/page-276100_15_0.html
> 
> ...



LOL
NAME FAIL


----------



## buggalugs (Dec 14, 2009)

Nvidia can kiss my ass.


----------



## laszlo (Dec 14, 2009)

The slides taken from Guru3D may be fakes, but the numbers used in them for the existing cards are real and confirmed by existing reviews; all we don't know is whether the Fermi numbers are invented or not.

I expect Fermi to be better than Cypress, but I don't know the percentage, so we must wait....


----------



## Amok (Dec 14, 2009)

It's not hard to collect numbers from existing reviews, make up some other numbers, and make a PowerPoint slide that resembles an official NVIDIA slide.

The important point is that I have never seen an official NVIDIA or ATI slide contain actual numbers; they only contain percentages.

Again, examples of official PR slides.













They never include actual performance numbers, just percentages.


----------



## Imsochobo (Dec 14, 2009)

sapetto said:


> Ha again?! Lol ATI also showed similar slides and they were not very correct (in the slides 5870 outperforms GTX295).



And it does.

In fact, by a large margin in different games; I would say the 5870 is about on par with the 295 with the latest driver.











I don't recall any of those games being more ATI-friendly than NVIDIA-friendly.


----------



## the54thvoid (Dec 14, 2009)

I'm new here but have been an avid TechPowerUp watcher. These slides are not the real deal. I recall a great many bench reviews where STALKER has ATI cards outperforming NV quite competently. A number of the sites I reviewed show the 5870 beating the 295; Tom's Hardware is one of the exceptions.
You still get biased hardware sites that favour one camp. I had a 295 and ditched it for two 5850s, so I can speak for both camps. Based on the reviews I read before purchasing my 5850s (I read practically every English review on the web, I'm that careful), these benchies are fabricated. STALKER is the key here: NV performs poorly in Clear Sky, it's an ATI game. Here are some reviews that contain a 5870 beating a 295 in games other than STALKER.
http://www.guru3d.com/article/radeon-hd-5870-review-test/18
http://www.guru3d.com/article/radeon-hd-5870-review-test/16
http://www.hardwarecanucks.com/foru...phire-radeon-hd-5870-1gb-gddr5-review-16.html
http://www.legitreviews.com/article/1080/6/
And the most amusing, from this very website, you amnesic puppies:
http://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/20.html (for STALKER SoC)
http://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/21.html (for STALKER Clear Sky)

So, before the camps go to war, do your research... and beware false prophets.


----------



## Csokis (Dec 14, 2009)

Hmm... DX11 bench?


----------



## InnocentCriminal (Dec 14, 2009)

It's marketing guff - end of. It's when the card gets reviewed that I'll be interested.

Oh and welcome to the forums the54thvoid, don't forget to fill in your System Specification.


----------



## Animalpak (Dec 14, 2009)

It seems like a joke to me that a single GTX 380 is faster than the long and heavy 5970? If it's true I have to laugh, ahaha... Owned.


----------



## Lionheart (Dec 14, 2009)

ZOMG, I just realized the Z in ZOMG doesn't mean anything, lol. Oh yeah, about these benchmark reviews: complete BS.


----------



## z1tu (Dec 14, 2009)

What a pathetic attempt. They mean to tell me that the 5970 is just barely better than the 360 will be, and in some cases even worse? Who the hell falls for crap like this? :shadedshu


----------



## crow1001 (Dec 14, 2009)

I like to see sites like TPU get a fair share of traffic, but this is not the way to do it. Guru3D know the slides are fake, but they also know they will attract loads of hits; TPU does not have to go down this route when it's so obvious they are fake.


----------



## HammerON (Dec 14, 2009)

Nay sayers ~ let us wait and see


----------



## blibba (Dec 14, 2009)

225W TDP... gief GTX395 nao.


----------



## qubit (Dec 14, 2009)

crow1001 said:


> FAKE!!
> 
> Take em down, you all look like suckers.



They might be, but I wouldn't be surprised if the real figures are in this ballpark.

The GTX 380 seems to have about double the performance of a GTX 285, which is not unreasonable.


----------



## crow1001 (Dec 14, 2009)

WTF are you talking about, "seems"? Keep guessing away, most NV fanboys do anyway. I'll wait for legit reviews.


----------



## gigabit942007 (Dec 14, 2009)

From what I see these are only DX10 tests, no real DX11 tests with tessellation and other DX11 stuff. So they might be better in DX10 but they might suck in DX11, and these tests are not real!


----------



## t77snapshot (Dec 14, 2009)

crow1001 said:


> FAKE!!
> 
> Take em down, you all look like suckers.



Hey, show some respect to our news editors, OK. :shadedshu



gigabit942007 said:


> from what i see this are only DX10 tests no real DX11 tests with tesselation and other dx11 stuff so from what i see they might be better in DX10 but they might suck in DX11 and these tests are not real !



Well, there are very few games right now that even utilize DX11.


----------



## qubit (Dec 14, 2009)

crow1001 said:


> WTF you talking about " seems " keep guessing away, most NV fanboys do anyway. I'll wait for legit reviews.



Just because I have an NVIDIA card does not make me a "fanboy", thanks. The tests may well not be real, and I said that too.

And you've got to admit my point is reasonable. After all, the 5870 roughly doubled the performance of the 4890, didn't it?


----------



## Flyordie (Dec 14, 2009)

You guys do realize ATI isn't just gonna stand around and watch this unfold...

There will be an HD 5980 or equivalent.  ;-p 

Probably a 512-bit GDDR5 HD 5890... As we all know the HD 58xx series is very bandwidth-starved... giving it that extra bandwidth should pull it ahead of almost everything NVIDIA can throw out...


----------



## Amok (Dec 14, 2009)

Fakes.
Look at the closeup; someone pasted in the numbers in Photoshop, it's really obvious. 

You can notice the heavy pixelation around the numbers. Cheap stuff.






Let me show you what I crafted in 5 minutes... Jen-Hsun gave me this bench himself. Do you believe me?






And I can make as many versions as you like, with whatever scores you want.


----------



## stasdm (Dec 14, 2009)

Whichever way the numbers go, they show that the GTX 3XX is a bad scaler.
The 380 loses more than 50% of its core-count advantage when compared to the 360.

That means there would be no sense in a dual GTX 3XX configuration (the performance gain would hardly be 5%). So, in high-end gaming, it's a loser.


----------



## laszlo (Dec 14, 2009)

Amok said:


> Fakes.
> Look at the closeup, someone pasted in the numbers in photoshop, it's really obvious.
> 
> You can notice the heavy pixelation around the numbers. Cheap stuff
> ...



i don't see any pixel distortions in the posted ones by bta


----------



## Amok (Dec 14, 2009)

You can't see it because I zoomed in on the picture in Photoshop and screenshotted it. At the normal low resolution it's not noticeable, but once you zoom in it's really obvious.

I made another zoom, even further in...
See what I mean? Do it yourself, you will see it as well. Really shabby work from the guy who faked it.


----------



## laszlo (Dec 14, 2009)

Amok said:


> you can't see it because i zoomed in on the picture in photoshop and screenshoted it. At the normal low resolution it's not noticeable, but once you zoom in it's really obvious.
> 
> I made another zoom in, even further in...
> See what i mean. Do it yourself, you will see it as well. Really shabby work from the guy who faked it.
> http://i45.tinypic.com/n71o9e.jpg




I zoomed it at max... nothing... take the posted picture, not the uploaded one.


----------



## Mussels (Dec 14, 2009)

Aren't those three games NVIDIA "The Way It's Meant to Be Played" titles?


----------



## crow1001 (Dec 14, 2009)

t77snapshot said:


> hey show some respect to our news Editors ok.:shadedshu



Get a clue, they are fake. The editor needs to acknowledge this in the first post or lock it here; the original thread where these fake results came from has been locked until there is proof to support them. I suggest this goes the same way, but seeing as they ARE fake, that will never happen.


----------



## Amok (Dec 14, 2009)

laszlo said:


> i zoom it at max ....nothing... take the posted picture not the uploaded



Yeah, the JPG version had bad compression, but as you can see, I did another version. Would you believe what I did if some guy posted it on the web or in the news section of some big site?


----------



## shevanel (Dec 14, 2009)

just another ploy to keep people talking and that is all good.

meanwhile, I'm about to play some Dirt 2 w/ DX11.. anyone down?


----------



## WarEagleAU (Dec 14, 2009)

I really do not expect Fermi to be better than Cypress to the extent shown in the graphs. I think Fermi will be a great NVIDIA product, but AMD is winning with their ATI cards now and will continue to be when they drop prices the day Fermi launches.


----------



## laszlo (Dec 14, 2009)

Amok said:


> Yeah, the jpg version had bad compression, but as you can see, i did another version, would you believe what i did if some guy posted it on the web or in the news section of some big site?



see what i mean: http://i49.tinypic.com/23mk11d.jpg


----------



## Imsochobo (Dec 14, 2009)

Flyordie said:


> You guys do realize ATI isn't just gonna stand around and watch this unfold...
> 
> There will be an HD5980 or equivalent.  ;-p
> 
> Probably a 512bit GDDR5 HD5890...  As we all know the HD58xx series is very bandwidth starved... giving it that extra bandwidth should pull it ahead of almost everything Nvidia can throw out...



They aren't. I gained less from a memory clock increase from 1000 MHz to 1400 MHz; a core clock increase from 850 to 950 gave a much bigger FPS gain, meaning they aren't memory-bottlenecked as many believe.

G300 will have a 384-bit memory bus running GDDR5 memory modules.
That will give an estimated memory bandwidth of 185-200 GB/s.

It's funny how a 128-bit memory bus difference on the 4870 could mean a mere 5-8%, eh? They aren't very memory-bottlenecked, so stop arguing about the damn bus. ATI know better than you; I believe they've done EXTENSIVE testing to find out what gives the most bang for the buck.
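That 185-200 GB/s figure is consistent with a quick peak-bandwidth calculation (the per-pin effective data rates below are assumptions typical of 2009-era GDDR5, not confirmed GTX 380 specs):

```python
# Peak memory bandwidth in GB/s = (bus width in bytes) * effective data rate per pin.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth for a GDDR memory bus."""
    return bus_width_bits / 8 * data_rate_gbps

# Rumored GTX 380: 384-bit bus at an assumed ~4.0 Gbps effective GDDR5 rate
print(peak_bandwidth_gb_s(384, 4.0))  # 192.0, inside the 185-200 GB/s estimate

# HD 5870 for comparison: 256-bit bus at its 4.8 Gbps effective rate
print(peak_bandwidth_gb_s(256, 4.8))  # ~153.6
```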


----------



## shevanel (Dec 14, 2009)

Yeah, 512 MB of 384-bit memory, probably.

I'll probably buy one when they release. I'd like to be able to kick paper in Batman AA while simultaneously having DX11 features for my other games.


----------



## Amok (Dec 14, 2009)

Look at what i found... neat huh?


----------



## KainXS (Dec 14, 2009)

Never trust any internal or pre-release reviews; most of the time they will be fake, and this one could have been done by a little kid from the looks of it. I could make a review saying the HD 8870 will be faster than the GTX 600; would anyone like to see that? Of course not, because it would be fake, just like this.

I can't even believe the mods posted this crap, lol. You don't even know what the source is.


----------



## Mussels (Dec 14, 2009)

KainXS said:


> never trust any internal or pre release reviews, most of the time they will be fake, and this one could have been done by a little kid from the looks I could make a review saying the HD8870 will be faster than the GTX600, would anyone like to see that, of course not, because it would be fake, just like this.
> 
> I can't even believe the mods posted this crap,



News posters and mods aren't the same people.

Besides, if you'd actually clicked the link for the source...


----------



## KainXS (Dec 14, 2009)

coulda fooled me ^^


----------



## crow1001 (Dec 14, 2009)

Mussels said:


> newsposters and mods arent the same people
> 
> 
> besides, if you'd actually clicked the link for source...
> ...




Well, do you not think it right that TechPowerUp should display the same disclaimer in its post?


----------



## Mussels (Dec 14, 2009)

crow1001 said:


> Well you not think it right that techpowerup should display the same disclaimer in its post?



Up to the discretion of the news poster. They DO provide links to the source, under the assumption that anyone interested in the source of the post would click it for further information.


----------



## TooFast (Dec 14, 2009)

by the time this thing comes out, the 5890 and 5990 will be ready.


----------



## EastCoasthandle (Dec 14, 2009)

Was this ever confirmed by NVIDIA? From what I was told, this is fake. It started in a Tom's Hardware forum thread. At first glance it looks like an official THG benchmark result, but look at the user name (Successful_Troll). Then it appeared the next day at the OCUK forum, made to look like an official NVIDIA slide. In this thread, the poster links back to the THG forum, grinning about the thread created there.

This is why I ask if this has been officially confirmed by NVIDIA via a press release on their homepage (for example). Because as it stands now, it doesn't look real based on how this came about.


----------



## Mussels (Dec 14, 2009)

EastCoasthandle said:


> Was this ever confirmed by nvidia? From what was told this is fake.  It started in Tom's Hardware Thread.  As you can see it looks like an official TM benchmark result but look at the user name (Successful_Troll).  Then it was photoshoped in the nd was photoshopped again at the OCUK forum to look like official nvidia slide.  In this thread, the poster here links back to TM forum grinning about the thread created there.
> 
> This is why I ask if this has been officially confirmed by nvidia via press release on their homepage (for example).  Because as it stands now it doesn't look real based on how this came about.



it looks to be completely fake.


----------



## crow1001 (Dec 14, 2009)

I think you will agree when I say TechPowerUp is a very popular site, so to have something really dodgy pasted on the front page for all to view, with no disclaimer unless you go to the source: is this correct? I think not.


----------



## Amok (Dec 14, 2009)

Don't you understand, both posters on THG and overclockers.co.uk are actually Jen-Hsun in disguise... trying to ruin AMD's Christmas sales...


----------



## KainXS (Dec 14, 2009)

A monkey came back from the future and posted this on those forums, that's the source, lol.


----------



## Mussels (Dec 14, 2009)

crow1001 said:


> I think you will agree when I say techpowerup is a very popular site, so to have something real dodgy pasted on the front page for all to view and with no disclaimer unless you go to the source, this is correct? I think not.



indeed.

The case has been made, so hopefully BTA will read these points and edit his post.


----------



## scope54 (Dec 14, 2009)

Fake:
http://techreport.com/forums/viewtopic.php?f=3&t=69653

BAD NEWS- slide is a fake. (straight from NVIDIA, this is not leaked)


----------



## Mistral (Dec 14, 2009)

Amok, the graph you posted is out of date. Here's the latest, straight from nV's official internal testing. You can see why it hasn't been made public before.






As you can notice, it includes data from the upcoming mainstream offering of the third major superpower in the graphics market.


----------



## shevanel (Dec 14, 2009)




----------



## Amok (Dec 14, 2009)

Ahhh, my sources kept me in the dark on this one... But still, Larrabee is the king-to-be...

One GPU to rule them all, One GPU to find them,
One GPU to bring them all and in the darkness bind them all


----------



## KainXS (Dec 14, 2009)

look how fast the chrome is zomg


----------



## Mussels (Dec 14, 2009)

breaking news! etch-a-sketch faster than GTX300!


----------



## mdm-adph (Dec 14, 2009)

This is really sad -- everyone knows the G300 is likely going to be a bit faster than the R800, so there was no reason to even make these fake graphs. It seems like someone just wanted to try to cut into ATI's Christmas-season sales or something.


----------



## halfwaythere (Dec 14, 2009)

What's really sad is that respectable sites published this crap. I guess nowadays you would do just about anything for a few clicks.


----------



## Selene (Dec 14, 2009)

good stuff, Ho Ho Ho Merry Christmas ATI.


----------



## wolf (Dec 14, 2009)

sick to the max.

I want the etch-a-sketch, larrabee, GTX380 and 5970 so I can hug them all and treat them as my children.


----------



## ArmoredCavalry (Dec 14, 2009)

Mussels said:


> http://img.techpowerup.org/091214/Capture294.jpg
> 
> 
> breaking news! etch-a-sketch faster than GTX300!



I lol'ed



mdm-adph said:


> This is really sad -- everyone knows the G300 is going to be likely a bit faster than the R800; there was no reason to even make these fake graphs.    It seems like someone was just wanting to try and cut into ATI's Christmas season sales or something.



Yeah, you would hope that with NVIDIA taking all this time, their next-generation GPUs would be quite a bit faster than the GTX 200s, which means you would have higher performance than the HD 5000s... (at least the 5870).

Question is, where the heck are they...


----------



## TheMailMan78 (Dec 14, 2009)

Ok one thing no one is talking about here is DX11. All the games that they have shown are DX10. 100% of games currently being developed using DX11 are being done with ATI cards. I think we are going to see a lot of Nvidia people crying foul when the dust clears. Much like ATI people cry foul when they see "TWIMTBP" logo.


----------



## shevanel (Dec 14, 2009)

Yeah, the only thing that sucks about owning this ATI card is that when I fired up Batman: AA everything was the same as when I had a GTX 275, but I couldn't kick the paper on the floor. SAD


----------



## mdm-adph (Dec 14, 2009)

TheMailMan78 said:


> Ok one thing no one is talking about here is DX11. All the games that they have shown are DX10. 100% of games currently being developed using DX11 are being done with ATI cards. I think we are going to see a lot of Nvidia people crying foul when the dust clears. Much like ATI people cry foul when they see "TWIMTBP" logo.



I don't see why -- DX11 is an open implementation.  All Nvidia needs to do to take advantage of any advancements in that area is release a DX11 card.

TWIMTBP-based tweaking benefits Nvidia cards and Nvidia cards only.


----------



## TheMailMan78 (Dec 14, 2009)

mdm-adph said:


> I don't see why -- DX11 is an open implementation.  All Nvidia needs to do to take advantage of any advancements in that area is release a DX11 card.
> 
> TWIMTBP-based tweaking benefits Nvidia cards and Nvidia cards only.



DX11 games will be optimized for ATI cards and drivers for a long time to come. Much like TWIMTBP. I'm telling you people are going to start crying foul.


----------



## animal007uk (Dec 14, 2009)

After reading this I think I'm going to buy a C64, lol. Tape in, load up, game on.

But seriously, as long as my games run at more than 30 fps and I'm happy with how the game looks, it doesn't matter who has the fastest card out there.

I would like to see the price of this NVIDIA card though, not that I'm going to buy one.


----------



## karlotta (Dec 14, 2009)

TheMailMan78 said:


> Ok one thing no one is talking about here is DX11. All the games that they have shown are DX10. 100% of games currently being developed using DX11 are being done with ATI cards. I think we are going to see a lot of Nvidia people crying foul when the dust clears. Much like ATI people cry foul when they see "TWIMTBP" logo.


 +1, and if the "PR" graphs are right, that is FAIL for NVDA... no AA and only a smidgen above the 5970... PR from NVDA... that is always fail.


----------



## Benetanegia (Dec 14, 2009)

laszlo said:


> i zoom it at max ....nothing... take the posted picture not the uploaded



There are absolutely no artifacts in Bta's pictures. 

Besides, the artifacts in Amok's magnified pictures are JPG artifacts. Open up any small JPG file with letters and you will find the same artifacts. On a PNG (like Bta's), on the other hand...

And I'm not saying they are not fake, btw; we don't know. But they are not fake in the sense Amok is trying to demonstrate.

Sorry Amok, you fail.


----------



## TheMailMan78 (Dec 14, 2009)

Benetanegia said:


> There's absolutely no artifacts in Bta's pictures.
> 
> Besides those artifacts in Amok's augmented pictures are jpg artifacts. Open up any small jpg file with letters and you will find those artifacts. On png (like bta's) on the other hand...
> 
> ...



Even if there were artifacts, that would prove nothing. I can add and remove them all day long.

Edit: Bta gets his news from all over but usually quotes only one source. Not to sound like a brown-noser, but the man does his homework. If these are in fact fake, I would be VERY surprised.


----------



## ArmoredCavalry (Dec 14, 2009)

TheMailMan78 said:


> DX11 games will be optimized for ATI cards and drivers for a long time to come. Much like TWIMTBP. I'm telling you people are going to start crying foul.



All I know is that DiRT 2 runs in DX11 with all max settings and is super smooth except in one area (China).

Adjusting graphics doesn't change the slight stutter, as it is the same with 8x AA and no AA...

Maybe someone didn't get around to testing that area? (I think it's the last to get unlocked...)


----------



## TheMailMan78 (Dec 14, 2009)

ArmoredCavalry said:


> All I know is that Dirt2 runs in DX11 w/ all max settings, and is super smooth except in one area (China).
> 
> Adjusting graphics doesn't change the slight stutter, as it is the same with 8xAA and no AA...
> 
> Maybe somone didn't get to testing that area? (I think its the last to get unlocked...)



I learned one trick that makes the game fly without affecting the graphics: set everything to Ultra except the car reflections, and set those to High. The difference is 30+ fps! Honestly, I can't tell the difference either between High and Ultra on that setting.


----------



## ArmoredCavalry (Dec 14, 2009)

TheMailMan78 said:


> I learned one trick that makes the game fly without effecting the graphics. Set everything to Ultra except the car refections. Set those to high. The difference is 30+ fps! Honestly I cant tell the difference ether between high and ultra on that setting.



Ah nice, yeah, the same thing happened with Very High vs Ultra shadows in Far Cry 2 with an HD 4870...

I'll be sure to try that out, thanks.


----------



## shevanel (Dec 14, 2009)

I run it all on High, and everything that has the option on Ultra, and no issues here. Looks great, runs better.


----------



## Binge (Dec 14, 2009)

sapetto said:


> Here is an example
> http://i25.tinypic.com/2emg9zs.jpg



Sir, the 5870 does perform better than the 295 in those examples. Still, the 295 beats the 5870 in a lot of different titles.

I read that BSoN thread on the slides, and I still don't see a source that makes any of this clear. Guru3D said this was to be taken with a big grain of salt.


----------



## FordGT90Concept (Dec 14, 2009)

Bit-tech Clear Sky benchmark...











"Extreme Quality Setting" vs "High Detail"

Unknown AF compared to 16x AF.


It is pretty clear that NVIDIA used average FPS. At the same time, we can't know for certain whether they even ran the same benchmark routine.

All this shows is that the 5870 and GTX 295 should roughly match in performance, yet NVIDIA's benchmark shows the GTX 295 being faster.


My conclusion: inconclusive.


----------



## Binge (Dec 14, 2009)

It'll be like this until some ridiculously short time before they go on sale or as they are on sale.  Boy, waiting is the name of the game


----------



## ..'Ant'.. (Dec 14, 2009)

I bet the prices for these new nvidia cards will be sky-high, and then ATI will just lower theirs.


----------



## extrasalty (Dec 14, 2009)

Slide Wars!


----------



## @RaXxaa@ (Dec 14, 2009)

If it's even 10 fps higher, then with the GTX 380 we are looking at thousands of overpriced dollars for just 10 fps


----------



## erocker (Dec 14, 2009)

maq_paki said:


> If it's even 10 fps higher, then with the GTX 380 we are looking at thousands *hundreds* of overpriced dollars for just 10 fps



Fixed.


----------



## dalekdukesboy (Dec 14, 2009)

Mussels said:


> http://img.techpowerup.org/091214/Capture294.jpg
> 
> 
> breaking news! etch-a-sketch faster than GTX300!



that...is the most fucking hilarious thing I've seen in a long time, I literally had to wait a couple minutes to stop laughing to type this response, that etch a sketch dude put me over the hilarity cliff, I chuckled at the chrome/larabee...but that set me up perfectly for scrolling to this...that is just...priceless! lol...shit, I'm going to buy me an etch a sketch for Christmas and replace my 8800gts 512...that'll be like upgrading an atom 1.3 ghz processor to a core i7 on nitrogen at 6 ghz!!!!


----------



## mechtech (Dec 14, 2009)

meh

LCDs can only do 60 FPS, with the exception of a few 120 Hz ones on the market, so who cares if a 5850 gets 72 fps and the (to-be-released) GTX 380 gets 113 fps, when an LCD can only show 60 fps?

Idle power consumption and noise matter more to me than 120 vs. 220 fps any day


----------



## DrPepper (Dec 14, 2009)

Performance matters to us more than idle power consumption because our gpu's aren't usually idle.


----------



## dalekdukesboy (Dec 14, 2009)

mechtech said:


> meh
> 
> LCDs can only do 60 FPS, with the exception of a few 120 Hz ones on the market, so who cares if a 5850 gets 72 fps and the (to-be-released) GTX 380 gets 113 fps, when an LCD can only show 60 fps?
> 
> Idle power consumption and noise matter more to me than 120 vs. 220 fps any day



Psst, that's why I got a 22-inch ViewSonic G225f CRT last year while I still could


----------



## TheMailMan78 (Dec 14, 2009)

mechtech said:


> meh
> 
> LCDs can only do 60 FPS, with the exception of a few 120 Hz ones on the market, so who cares if a 5850 gets 72 fps and the (to-be-released) GTX 380 gets 113 fps, when an LCD can only show 60 fps?
> 
> Idle power consumption and noise matter more to me than 120 vs. 220 fps any day



Wrong. Some LCDs can do 240. It also depends on the resolution.


----------



## imperialreign (Dec 14, 2009)

Well, not to get too wrapped up in any supposed controversy . . . I'll wait until TPU's reviews are up to draw solid conclusions.


The one thing that has piqued my interest, though, is the resolutions they've used for the slides . . .

Taking with a grain of salt that nVidia more than likely hand-picked results that show them in a good light (like all companies are good about doing), and the fact that at 1920x1200 they seem to perform near on par with the 5000 series . . . and the scaling from the 360 to the 380 doesn't look like much of a gain . . . I get the impression the GTX 3xx series will be running neck and neck with the 5000 series . . . it doesn't appear there will be too much of a real performance gain over ATI's hardware.

If this does turn out to be true, hopefully the pricing game will be much closer, which will really benefit us consumers.


----------



## WhiteLotus (Dec 14, 2009)

InnocentCriminal said:


> It's marketing guff - end of. It's when the card gets reviewed that I'll be interested.
> 
> Oh and welcome to the forums the54thvoid, don't forget to fill in your System Specification.



What he said.

I never take any notice of PR releases. EVER. Each camp uses its own methods to make itself look better.

Truth be told though, Nvidia will most likely take the crown again, but WILL be the more expensive card.


----------



## TheMailMan78 (Dec 14, 2009)

imperialreign said:


> Well, not to get too wrapped up in any supposed controversy . . . I'll wait until TPU's reviews are up to draw solid conclusions.
> 
> 
> The one thing that has piqued my interest, though, is the resolutions they've used for the slides . . .
> ...


 I agree. But half the fun of TPU is wild speculation and baseless facts.


----------



## imperialreign (Dec 14, 2009)

TheMailMan78 said:


> I agree. But half the fun of TPU is wild speculation and baseless facts.



Yeah . . . we got that down to an art form at this point.

Just to let y'all know - I personally predict the GTX 300 series to be capable of solving world hunger, ending the fighting in the Middle East, and can launch the space shuttle all at the same time.


----------



## TooFast (Dec 14, 2009)

I think its time to buy some amd stock ;]


----------



## PP Mguire (Dec 14, 2009)

With 11 posts, how do we know you're not a troll here who is an ATI fanboy?

You seem pretty set on these being fakes when practically everyone here knows PR stunts are fakes, but you fail to see that people here are telling you the Stalker numbers are true.

And with a GTX 280 at those settings I did get around 44 fps avg. As to the DX10-only question: you can't add DX11 percentages when you want to compare them to other DX10 cards, so you use DX10 games. That's pretty simple to figure out, but since you're such a PR genius, I'm guessing you already figured that out?

Edit: Post fail, I forgot this had 3 pages


----------



## stasdm (Dec 14, 2009)

TheMailMan78 said:


> Wrong. Some LCDs can do 240. It also depends on the resolution.



And your brain can't tell the difference between 50 and 60 FPS (only the flicker of fluorescent lamps makes any difference).


----------



## TheMailMan78 (Dec 14, 2009)

stasdm said:


> And your brain can't tell the difference between 50 and 60 FPS (only the flicker of fluorescent lamps makes any difference).



Maybe yours can't, but mine can.


----------



## Binge (Dec 14, 2009)

TheMailMan78 said:


> Maybe yours can't, but mine can.



Nerd response:  I guess that guy doesn't understand that all people are genetically unique, and conditioning can even accelerate visual comprehension.


----------



## TheMailMan78 (Dec 14, 2009)

Binge said:


> Nerd response:  I guess that guy doesn't understand that all people are genetically unique, and conditioning can even accelerate visual comprehension.



It's just the old dead horse of "how many FPS can the human eye see".


----------



## stasdm (Dec 14, 2009)

TheMailMan78 said:


> It's just the old dead horse of "how many FPS can the human eye see".


----------



## 1c3d0g (Dec 14, 2009)

I wonder how much this beast can help Folding@Home...


----------



## PP Mguire (Dec 14, 2009)

1c3d0g said:


> I wonder how much this beast can help Folding@Home...



A reasonable post in this thread


----------



## Nailezs (Dec 14, 2009)

Someone dig up that thread we had going a couple of months ago about how many fps the eye and brain can see.
Somewhere in there was proof that the human eye/brain combo (or whatever you want to call it) could register 200+ fps, and it used some USAF pilot testing as evidence.


----------



## Binge (Dec 14, 2009)

Nailezs said:


> Someone dig up that thread we had going a couple of months ago about how many fps the eye and brain can see.
> Somewhere in there was proof that the human eye/brain combo (or whatever you want to call it) could register 200+ fps, and it used some USAF pilot testing as evidence.



Or we could get back on topic.  Both are great options. 

+1 F@H benchmarks.  The fake ones released a month back were totally disappointing.  I hate 73h liarz!


----------



## Aceman.au (Dec 14, 2009)

ATI will just lower their prices and once again be the better deal... Nvidia shoot themselves in the foot every time with their high prices... If the prices were low like ATI's, I'd consider getting a Fermi-based card.


----------



## Nailezs (Dec 14, 2009)

OK, I agree that nvidia cards are more expensive but, current generation to current generation, nvidia has consistently outperformed ATI, justifying the higher prices.


----------



## Bjorn_Of_Iceland (Dec 14, 2009)

Really hard to believe this nvidia statement now..

It takes a couple of hours to mount a cooler on a PCB and cut it; it takes a couple of minutes to think up numbers and make a graph.



l33tGaMeR said:


> ATI will just lower their prices and once again be the better deal... Nvidia shoot themselves in the foot every time with their high prices...


myeah.. welcome to competition 101 -_-


----------



## MK4512 (Dec 14, 2009)

If it costs anything like a GTX 295, you might need to take out a mortgage 



laszlo said:


> I didn't expect almost 50% more from Fermi, but the price of the GTX 380 will be almost double that of the HD 5870; only the GTX 360 will be placed under it, I think around $300



Wait, what? If it performs better than a 295, it should cost more than a 295, or close to it.


----------



## Binge (Dec 14, 2009)

MK4512 said:


> If it costs anything like a GTX 295, you might need to take out a mortgage



I don't get it   Do you live in a cardboard box on a section of a golf course or something?


----------



## MK4512 (Dec 14, 2009)

Binge said:


> I don't get it   Do you live in a cardboard box on a section of a golf course or something?



It's called "humble living"! And I'll thank you not to call it a box! It is the highest quality paper pulp shipping container this side of the dump!


----------



## johnnyfiive (Dec 14, 2009)

yay sauce.


----------



## OneCool (Dec 14, 2009)

funny stuff!!


----------



## punani (Dec 14, 2009)

But... can it run Crysis?


----------



## RoutedScripter (Dec 14, 2009)

I won't start a flame war here, but by the looks of the state nvidia is in now, this is a faked Fermi presentation; there is very little time in which they could develop a GPU that's 10% faster than the 5970. Not to mention that the screenshot supposedly shows a single-GPU GTX 380 winning over a dual-GPU card, and by a whole 10% at that; that's pretty much too good to be true.

I would have to agree that the chance those posted "benchmark" screenshots are genuine is very slim.

Looks like nvidia spent too much money on adverts and bribing ....


----------



## Tatty_One (Dec 15, 2009)

RuskiSnajper said:


> I won't start a flame war here, but by the looks of the state nvidia is in now, this is a faked Fermi presentation; there is very little time in which they could develop a GPU that's 10% faster than the 5970. Not to mention that the screenshot supposedly shows a single-GPU GTX 380 winning over a dual-GPU card, and by a whole 10% at that; that's pretty much too good to be true.
> 
> I would have to agree that the chance those posted "benchmark" screenshots are genuine is very slim.



I agree that the truth in this thread is very slim, but truly, expect the GT300 to be faster than its competition. I have been observing and contributing to these damn wars since..... well, longer than I care to remember, and more often than not the green side tends to come out on top, however at a price..... a price that some are willing to pay. Whichever way you care to look at it, though, it's gotta be good for the consumer; if the green side weren't competitive, at least, we would not see price decreases from anyone.


----------



## Lionheart (Dec 15, 2009)

SNIFF SNIFF!!! I smell bullshit


----------



## imperialreign (Dec 15, 2009)

Tatty_One said:


> I agree that the truth in this thread is very slim, but truly, expect the GT300 to be faster than its competition. I have been observing and contributing to these damn wars since..... well, longer than I care to remember, and more often than not the green side tends to come out on top, however at a price..... a price that some are willing to pay. Whichever way you care to look at it, though, it's gotta be good for the consumer; if the green side weren't competitive, at least, we would not see price decreases from anyone.





I agree . . . except that back around the time of the 1900/7900 series, and for a few series prior, ATI were leading the performance market.  ATi really fell off with the 2000 series, though . . .

Typically happens, though, one leads for a few series, then the other overtakes them for a few series . . .


Now, if GTX380 is a dual-GPU solution (like nVidia have been claiming it will be), then I could see it topping out over the 5970 . . . but, then again, if it scales that poorly from a GTX360, that doesn't bode well for the 300 series as a whole . . . at least compared to the overall gains of 2 GPU setups from the 5000 series.

I guess we'll just have to see.


----------



## phanbuey (Dec 15, 2009)

I never thought nv claimed the gtx 380 to be dual gpu


----------



## imperialreign (Dec 15, 2009)

phanbuey said:


> I never thought nv claimed the gtx 380 to be dual gpu



Well, they haven't yet (as far as I know) . . .

but, they've been throwing rumors out of releasing a dual-GPU board at the same time the single-GPU board is first released.

If that _is_ the case, then I'd have to fathom the 380 as being a dual-GPU board.

But . . . this is simply (un)founded speculation . . . until it's on the shelves, we can't know for sure what nVidia is up to.  Either way, they're lagging seriously behind with their new series.


----------



## RoutedScripter (Dec 15, 2009)

That's true, and it's usually the green camp that's a little faster, but that doesn't mean anything; people need to realize those extra few fps aren't anything, they come from better support in games for nvidia, optimizations, benchmarks... I mean, I hardly care about fps.


I mean, I don't even rely on artificial benchmarks like Futuremark's products... I run my games at my settings on my machine, and I probably won't see bad things on either side. The fact that nvidia has the top fps is because they just compete on fps. Look, for example: ATI cards have their own sound card, and they are maybe the better option when it comes to TV and multimedia because of AVIVO (I never used it myself, actually). Most importantly, over the years the red side didn't crash or fail too much and had better drivers; well, that's what my friends said who have had many GPUs from both sides since as early as 1998 (though I agree the Catalyst drivers from 9.1 up to 9.11, including 9.1, were really crappy).

Not to mention screen quality has been praised in the red camp. I do agree on this one because I can see it myself; ATI's shadows really stand out and you can clearly see the difference. This is something I'd gladly trade a few fps for.

Now, realizing the source is 3Dguru, I can safely say that the credibility of those pics went from little to zero.





imperialreign said:


> Well, they haven't yet (as far as I know) . . .
> 
> but, they've been throwing rumors out of releasing a dual-GPU board at the same time the single-GPU board is first released.
> 
> ...



Indeed, but if those pics have any truth at all, the GTX 380 has to be dual-GPU. That won't tie up with the 360, though, because then the 360 would have to be dual-GPU too to fit.

On the other hand, what if these new GPUs really have a truly new design, some new super-optimizing code, for example CUDA, that's the key to such fast games... Again, I don't see speed as the main decider anymore, because both camps have GPUs at that price that can run almost every game fast enough. Crysis is an exception (Crysis is a presentation of the engine more than a game, but it actually has some of the talented spirit that I like).


----------



## imperialreign (Dec 15, 2009)

RuskiSnajper said:


> That's true, and it's usually the green camp that's a little faster, but that doesn't mean anything; people need to realize those extra few fps aren't anything, they come from better support in games for nvidia, optimizations, benchmarks... I mean, I hardly care about fps.



Well, IMHO, the last few series have been running so close together that I really don't think the differences anymore are boiling down to hardware . . . but rather developer optimization . . . and we all know that the majority of games are better optimized for nVidia's hardware, much thanks to their TWIMTBP program.  I'd love for ATI to step it up a notch and start pushing their ATI Game! program a bit more, but alas . . . fundage is rather tight . . .

Even still . . . when your card ousts the competition by an average of 5 FPS, you can claim the performance title . . . and with that comes all the raging hard-ons for owning a card that's "performance king," no matter the cost.  That's a big reason why nVidia have been able to push such insane pricing for their hardware the last 3-5 years.  Personally, I can afford it, but I won't buy nVidia products (for numerous reasons) . . . the average user can't, but if they want "the best of the best of the best," they're willing to fork out the dough.

My biggest wish, for the gaming/hardware market as a whole, would be for ATI to finally get back to a financially sound position and start pushing their ATI Game! program a lot more (they've rather neglected it the last few years).  There needs to be competition in the gaming market against TWIMTBP, and ATI just can't afford it, ATM.




RuskiSnajper said:


> Indeed, but if those pics have any truth at all, the GTX 380 has to be dual-GPU. That won't tie up with the 360, though, because then the 360 would have to be dual-GPU too to fit.
> 
> On the other hand, what if these new GPUs really have a truly new design, some new super-optimizing code, for example CUDA, that's the key to such fast games... Again, I don't see speed as the main decider anymore, because both camps have GPUs at that price that can run almost every game fast enough. Crysis is an exception (Crysis is a presentation of the engine more than a game, but it actually has some of the talented spirit that I like).



I can't really fathom nVidia doing anything "truly new," they've been milking the same designs for the last few years . . . enough so that both ATI and Intel have made numerous comments on their architecture.

I've been starting to wonder if we're at a point where nVidia are simply "tapped out," and can't take that architecture any further . . . forcing them to go back to R&D . . . if that's the case, then who knows what the results would be?  They could be faster, or slower . . . and it would take much longer to get the product to market than was originally thought (although, Fermi is starting to fit this bill quite nicely).

ATI can tell you first hand, though, re-designing a new GPU from scratch, or making major changes to existing architecture, will lead you into a lot of pitfalls.


----------



## Makaveli (Dec 15, 2009)

lol, the best part of this thread is those slides. Can you guys keep it up? I wanna see a PowerVR chip in there; also I would love to see the speed of a Trident x4 STI turbo, "aka AMDNV Killer"


----------



## eidairaman1 (Dec 15, 2009)

DrPepper said:


> Performance matters to us more than idle power consumption because our gpu's aren't usually idle.



That's if you don't have a job, or you work from home.

Myself, I commute there and back daily, which leaves the machine idle about 12-14 hours a day.


----------



## TheMailMan78 (Dec 15, 2009)

eidairaman1 said:


> That's if you don't have a job, or you work from home.
> 
> Myself, I commute there and back daily, which leaves the machine idle about 12-14 hours a day.



S3 sleep or just turn the damn thing off. Fixed.


----------



## eidairaman1 (Dec 15, 2009)

Even still, suppose you're not doing any graphically demanding tasks while away, such as defrag, torrents, etc.


----------



## TheMailMan78 (Dec 15, 2009)

eidairaman1 said:


> Even still, suppose you're not doing any graphically demanding tasks while away, such as defrag, torrents, etc.



Not enough consumption to make any real difference in your power bill. It's all gimmicks, man. If you have 50 computers running on one bill, THEN you worry about such things. A dirty AC filter will cost you more money.
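
A back-of-the-envelope sketch of that point; the 30 W idle-draw gap, 14 idle hours per day, and $0.12/kWh electricity price below are assumed figures for illustration, not measurements of any specific card:

```python
# Rough yearly cost of extra idle power draw (all inputs are assumptions).
def yearly_idle_cost(idle_watts, hours_per_day, price_per_kwh):
    """Cost in dollars of a given idle draw sustained over one year."""
    kwh_per_year = idle_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Assumed: 30 W idle-draw difference, 14 idle hours/day, $0.12 per kWh.
print(f"${yearly_idle_cost(30, 14, 0.12):.2f} per year")  # about $18
```

Even with generous assumptions, the difference works out to the order of a replacement AC filter per year, which is essentially the point being made.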


----------



## Mussels (Dec 15, 2009)

stasdm said:


> And your brain can't tell the difference between 50 and 60 FPS (only the flicker of fluorescent lamps makes any difference).



On my old CRT I could easily see the difference from 60 Hz to 120 Hz, same with FPS. I was disappointed when I went LCD because of that (but other things made up for it).



TheMailMan78 said:


> Maybe yours can't, but mine can.


same




Nailezs said:


> someone dig up that thread we had going a couple months ago about how many fps the eye and brain can see
> somewhere in there was proof that the human eye/brain combo(or whatever u want to call it) could register 200fps+, and used some USAF pilot testing as evidence



I've posted that so many times I hate posting it. The human eye can pick out one odd frame out of 200+; it all depends on how good your brain is (aka how well trained you are; look at how fast CSS kiddies are vs. a 40-year-old noob).


As to the comment someone made about "60 FPS is the best an LCD can do, who cares":

Did it occur to you that people buy these and play games on them *drumroll* in the future, when games run SLOWER? 10 more FPS now = 5 more FPS then, and it could be a deal breaker.


----------



## ShadowFold (Dec 15, 2009)

KainXS said:


> coulda fooled me ^^



CUDA fooled me ...


----------



## SummerDays (Dec 15, 2009)

I guess Nvidia was trying to show that their GTX 295 was faster than a 5870 while being slightly more expensive.


----------



## AddSub (Dec 15, 2009)

Impressive stuff. I can't wait to tri-SLI those monster GPU's. 

"The way it's meant to be played" ....indeed!


----------



## PP Mguire (Dec 15, 2009)

RuskiSnajper said:


> That's true, and it's usually the green camp that's a little faster, but that doesn't mean anything; people need to realize those extra few fps aren't anything, they come from better support in games for nvidia, optimizations, benchmarks... *I mean, I hardly care about fps*.
> 
> 
> I mean, I don't even rely on artificial benchmarks like Futuremark's products... I run my games at my settings on my machine, and I probably won't see bad things on either side. The fact that nvidia has the top fps is because they just compete on fps. Look, for example: ATI cards have their own sound card, and they are maybe the better option when it comes to TV and multimedia because of AVIVO (I never used it myself, actually). Most importantly, over the years the red side didn't crash or fail too much and had better drivers; well, that's what my friends said who have had many GPUs from both sides since as early as 1998 (though I agree the Catalyst drivers from 9.1 up to 9.11, including 9.1, were really crappy).
> ...


Then why not just get a 9500GT or 5670 or something really cheap and try running games on high and see if you care about fps?



imperialreign said:


> Well, IMHO, the last few series have been running so close together that I really don't think the differences anymore are boiling down to hardware . . . but rather developer optimization . . . and we all know that the majority of games are better optimized for nVidia's hardware, much thanks to their TWIMTBP program.  I'd love for ATI to step it up a notch and start pushing their ATI Game! program a bit more, but alas . . . fundage is rather tight . . .
> 
> Even still . . . when your card ousts the competition by an average of 5 FPS, you can claim the performance title . . . and with that comes all the raging hard-ons for owning a card that's "performance king," no matter the cost.  That's a big reason why nVidia have been able to push such insane pricing for their hardware the last 3-5 years.  Personally, I can afford it, but I won't buy nVidia products (for numerous reasons) . . . the average user can't, but if they want "the best of the best of the best," they're willing to fork out the dough.
> 
> ...


Fermi is new, and it's not a milked 285. The specs alone say that.


----------



## Imsochobo (Dec 15, 2009)

TheMailMan78 said:


> S3 sleep or just turn the damn thing off. Fixed.



I download 24/7.

Idle power consumption is important. Load, I couldn't care less about.


----------



## Mussels (Dec 15, 2009)

Imsochobo said:


> I download 24/7.
> 
> Idle power consumption is important. Load, I couldn't care less about.



then perhaps you should build a dedicated low power download system like i have


----------



## PP Mguire (Dec 15, 2009)

Get an Atom 330/ION in ITX format. It does everything you need, and you only need to turn your power-hungry rig on for games.


----------



## Hayder_Master (Dec 15, 2009)

OK guys, enough trolling; let's talk about something useful. Leave the fake tests and images aside; we are TPU members, and we can guess the performance just from a GPU-Z readout:

1- Just like in the GT200 series, we found the GTX 260 a bit better than the 4870, and the same goes for the GTX 295 and the 4870 X2.
2- Maybe things are different now, because ATI stayed with 256-bit but increased the ROPs and texture units, while NVIDIA added GDDR5.
3- So ATI wins on core speed and NVIDIA wins with the 384-bit bus.
4- For me, I think the old story is back: nvidia has a bit better performance, and with overclocking on both ATI and NVIDIA cards, I think nvidia clearly looks better.
5- Still, the most important things are the "PRICE" and PPD, "performance per dollar".
6- The other important thing is whether there are games that make a high-end DX11 card worth it.

Thanks for reading this, guys. This is my opinion, and I'd like to hear yours; let's have a useful discussion.
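
The bus-width point in (3) is really just bandwidth arithmetic, so here is a quick sketch. The HD 5870's 256-bit bus at 4.8 Gbps effective matches its published spec, but the 4.0 Gbps data rate used for the rumored 384-bit GTX 380 is purely an assumed value, since nothing official had been released:

```python
# Peak memory bandwidth: (bus width in bits / 8 bits-per-byte) * per-pin data rate.
def mem_bandwidth_gbps(bus_bits, data_rate_gbps):
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return bus_bits / 8 * data_rate_gbps

# HD 5870: 256-bit GDDR5 at 4.8 Gbps effective (published spec).
print(f"{mem_bandwidth_gbps(256, 4.8):.1f} GB/s")  # 153.6 GB/s

# Rumored GTX 380: 384-bit GDDR5; the 4.0 Gbps data rate is an assumption.
print(f"{mem_bandwidth_gbps(384, 4.0):.1f} GB/s")  # 192.0 GB/s
```

So the wider bus only translates into a real advantage if the memory clocks are comparable, which is why the actual Fermi memory clocks matter more than the 384-bit headline figure.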


----------



## Nick89 (Dec 15, 2009)

laszlo said:


> I didn't expect almost 50% more from Fermi, but the price of the GTX 380 will be almost double that of the HD 5870; only the GTX 360 will be placed under it, I think around $300



The GTX 380 will be $700-800 and the GTX 360 will be $450-550, if Nvidia keeps their pricing trend.


----------



## shevanel (Dec 15, 2009)

Nick89 said:


> The GTX 380 will be $700-800 and the GTX 360 will be $450-550, if Nvidia keeps their pricing trend.



I agree.


----------



## Imsochobo (Dec 15, 2009)

PP Mguire said:


> Get an Atom 330/ION in ITX format. It does everything you need, and you only need to turn your power-hungry rig on for games.



Way too slow; I would kill it with just the browser.

I like my PhII 1V 3 GHz M-ATX: low power, high performance.

High performance can still be found with rather low power consumption. The Atom may be cheap, but it's not good at all when you talk about performance per watt.


----------



## Hayder_Master (Dec 15, 2009)

Nick89 said:


> The GTX 380 will be $700-800 and the GTX 360 will be $450-550, if Nvidia keeps their pricing trend.





I don't think so; maybe that's the pricing six months after release. I'd guess the GTX 380 will be $850-900 and the GTX 360 $600-700.


----------



## imperialreign (Dec 15, 2009)

PP Mguire said:


> Fermi is new, and it's not a milked 285. The specs alone say that.



I've seen that, and as I said, Fermi is actually fitting the bill of what one would expect from a "new" design . . . long R&D, pushed-back release dates, etc., etc.

But, although the specs show what should be something new, let's not forget that both camps are notorious for releasing specs that would show new architecture, but once they near release, it turns out to just be a re-hash of existing hardware.

Personally, I can't draw solid conclusions either way . . . not until this series is actually on the doorstep, so-to-speak.


----------



## Amok (Dec 15, 2009)

Here comes Santa.....


----------



## wolf (Dec 15, 2009)

See, that might be funny if NVIDIA had never really made a powerful GPU before...

just like INTEL hasn't been able to yet.

Something more relevant might have been "A DX11 GPU..." or maybe an amusing quip about re-branding cards.

HA HA.


----------



## TheMailMan78 (Dec 15, 2009)

Imsochobo said:


> I download 24/7.
> 
> Idle power consumption is important. Load, I couldn't care less about.



You don't need a dedicated GPU for that; an IGP will do just fine. Hell, my IGP plays L4D2 at decent settings. Again, idle power consumption isn't important unless you are talking about volume. Example: 50 computers in one building on the same bill. That's when idle consumption adds up.


----------



## PP Mguire (Dec 15, 2009)

Imsochobo said:


> Way too slow; I would kill it with just the browser.
> 
> I like my PhII 1V 3 GHz M-ATX: low power, high performance.
> 
> High performance can still be found with rather low power consumption. The Atom may be cheap, but it's not good at all when you talk about performance per watt.


The Atom is meant to be low power under load, like 35 W max with the Atom/ION combo. Have you ever tried one? They actually aren't that slow, and the 330 is a dual core. If you seriously need dedicated graphics, grab the Zotac board with a PCI-E slot and shove a single-slot 260 in it.



hayder.master said:


> I don't think so; maybe that's the pricing six months after release. I'd guess the GTX 380 will be $850-900 and the GTX 360 $600-700.


That would ONLY be the case if the 380 were a dual-GPU card. I don't expect the 380 to be more than $600, considering it's supposed to be single-GPU, and I don't think Nvidia would be THAT stupid to price above ATI's 5970.



imperialreign said:


> I've seen that, and as I said, Fermi is actually fitting the bill of what one would expect from a "new" design . . . long R&D, pushed-back release dates, etc., etc.
> 
> But, although the specs show what should be something new, let's not forget that both camps are notorious for releasing specs that would show new architecture, but once they near release, it turns out to just be a re-hash of existing hardware.
> 
> Personally, I can't draw solid conclusions either way . . . not until this series is actually on the doorstep, so-to-speak.


You can rehash current hardware and give it more. This isn't simply a die shrink like G92b or all the 9000 cards. Whether it's a completely new design or not, it's still something a lot different.


----------



## Amok (Dec 15, 2009)

wolf said:


> See that might be funny if NVIDIA had never really made a powerful GPU before...
> 
> just like INTEL haven't been able to yet.
> 
> ...



Fixed


----------



## Semi-Lobster (Dec 15, 2009)

http://www.fudzilla.com/content/view/16843/1/

Fudo says: Fermi GTX 380 / GTX 360 benches are fake


----------



## laszlo (Dec 15, 2009)

It's irrelevant whether they're fakes or not; I'm not a fan of any GPU maker, but I expect Fermi to come close to the "photoshopped" numbers, considering the released official nvidia info about the architecture and other details.. my 2c


----------



## DrPepper (Dec 15, 2009)

Semi-Lobster said:


> http://www.fudzilla.com/content/view/16843/1/
> 
> Fudo says: Fermi GTX 380 / GTX 360 benches are fake



I always ignore fudzilla mostly because fud is in the name (fear uncertainty and doubt)


----------



## Marineborn (Dec 15, 2009)

Wow, nice res to benchmark games in; that's pretty weak.


----------



## DrPepper (Dec 15, 2009)

1920 x 1200?

About 2% of PC gamers have a monitor larger than that.


----------



## kylzer (Dec 15, 2009)

DrPepper said:


> I always ignore fudzilla mostly because fud is in the name (fear uncertainty and doubt)



If you read it,

it's actually not Fud, it's NH;

they're usually OK with info.


----------



## Semi-Lobster (Dec 15, 2009)

kylzer said:


> If you read it,
> 
> it's actually not Fud, it's NH;
> 
> they're usually OK with info.



Indeed, Fud usually cites other sources; it's more a collection of their own stuff and stuff from other places.

http://www.nordichardware.com/news,10412.html


----------



## RoutedScripter (Dec 15, 2009)

PP Mguire said:


> Then why not just get a 9500GT or 5670 or something really cheap and try running games on high and see if you care about fps?
> 
> Fermi is new, and it's not a milked 285. The specs alone say that.



Yes, but I wasn't exactly talking about low-end cards, since I have one of the higher ones. Anyway, if I had something like a 4650, I obviously wouldn't have a view like that.



DrPepper said:


> I always ignore fudzilla mostly because fud is in the name (fear uncertainty and doubt)



 

I like to read criticism, hard facts, suspicions, and "question everything" pieces, digging down to hard facts, not adver-articles like you see in the mainstream "news". Not saying anything about TPU or other good PC sites in particular, but 3DGuru (IGN, GameTrailers...) always bring up definitive stuff that supposedly only they know 100% about, like these screenshots, which don't look real; on some four sites I've seen comments calling them fake.


----------



## imperialreign (Dec 15, 2009)

PP Mguire said:


> You can rehash current hardware and give it more. This isn't simply a die shrink like G92b or all the 9000-series cards. Whether it's a completely new design or not, it's still something a lot different.



I agree - kinda what I was trying to get at . . . nVidia might be claiming "new," white papers might hint at "new," but sometimes "new" is a stretch of the imagination.

Either way, I'm definitely interested to see how these cards will perform.  It's looking like the intense FPS battle between the two camps will continue for another series.


----------



## Benetanegia (Dec 16, 2009)

imperialreign said:


> I agree - kinda what I was trying to get at . . . nVidia might be claiming "new," white papers might hint at "new," but sometimes "new" is a stretch of the imagination.
> 
> Either way, I'm definitely interested to see how these cards will perform.  It's looking like the intense FPS battle between the two camps will continue for another series.



imperial, it's definitely new, definitely new for a GPU. All it takes is a look at the white papers and a little comprehension.


----------



## PP Mguire (Dec 16, 2009)

RuskiSnajper said:


> Yes, but I wasn't exactly talking about low-end cards, since I have one of the higher ones. Anyway, if I had something like a 4650, I obviously wouldn't have a view like that.
> 
> 
> 
> ...


Lol, I was kidding with you.  I know not everybody wants to upgrade to the latest and greatest for a few FPS more. I totally understand, but coming from a bencher/OCer point of view, an extra 1000 3DMarks never really hurt. (I don't really game much anymore.) Best of both worlds, imo. 



imperialreign said:


> I agree - kinda what I was trying to get at . . . nVidia might be claiming "new," white papers might hint at "new," but sometimes "new" is a stretch of the imagination.
> 
> Either way, I'm definitely interested to see how these cards will perform.  It's looking like the intense FPS battle between the two camps will continue for another series.


Well, even if it isn't "new", if it's as good as everybody claims, completely stomping its predecessor and ATI's current gen, I wouldn't care that it isn't "new". Sometimes out with the old, in with the new isn't the best-case scenario.


----------



## imperialreign (Dec 16, 2009)

Benetanegia said:


> imperial, it's definitely new, definitely new for a GPU. All it takes is a look at the white papers and a little comprehension.



Oh, I'm not trying to say that it isn't . . . like I mentioned earlier, everything _is_ adding up to be such . . . but I can't agree for certain until it's actually released.

Both companies have pulled out hardware that looked brand new per the white papers, but actually turned out to be a simple re-hash of existing hardware.  The last instance was ATI between the 3000 and 4000 series.

I definitely agree the GTX 300 series looks _new_, especially considering the delay in release, but I'd like to see the full reviews from trustworthy sites before I swear to that statement. 




PP Mguire said:


> Well, even if it isn't "new", if it's as good as everybody claims, completely stomping its predecessor and ATI's current gen, I wouldn't care that it isn't "new". Sometimes out with the old, in with the new isn't the best-case scenario.



Wish ATI would grasp this concept.


----------



## PP Mguire (Dec 16, 2009)

I think Nvidia did with the FX series :shadedshu


----------



## imperialreign (Dec 16, 2009)

PP Mguire said:


> I think Nvidia did with the FX series :shadedshu





http://www.youtube.com/watch?v=WOVjZqC1AE4


----------



## PP Mguire (Dec 16, 2009)

http://www.youtube.com/watch?v=eYWaUJakMfg&NR=1

Do want.


----------



## thraxed (Dec 16, 2009)

Kinda sad when ya think ATI will slap out their X2, and then ya have to wait for the 385.


----------



## W1zzard (Dec 16, 2009)

Update (12/15): NVIDIA’s Director of Public Relations EMEAI told us that these slides are fake, but also "when it's ready it's going to be awesome".


----------



## RoutedScripter (Dec 16, 2009)

PP Mguire said:


> Lol iwas kidding with you  I know not everybody wants to upgrade to the latest and greatest for a few FPS increase. I totally understand but coming from a bencher/OCer point of view an extra 1000 3dmarks never really hurt. (I dont really game much anymore) Best of both worlds imo.



Meh... not used to the TPU jokes yet. But anyway, my thoughts are correct: those OCers/benchers don't actually play a lot of games.

So you can say what matters is speed, but you don't see the, err, long-term differences and other experiences you get from keeping a single card for a year or so; you guys just switch to a new one every month. Which isn't anything bad and I'm okay with it; as you said, best of both worlds.


----------



## TheMailMan78 (Dec 16, 2009)

W1zzard said:


> Update (12/15): NVIDIA’s Director of Public Relations EMEAI told us that these slides are fake, but also "when it's ready it's going to be awesome".



So is Duke Nukem Forever.  Thanks for the update!


----------



## RoutedScripter (Dec 16, 2009)

TheMailMan78 said:


> So is Duke Nukem Forever.  Thanks for the update!



It's not... look, they now have an official Facebook fan page and regular updates, at least the first official word since May...


----------



## PP Mguire (Dec 16, 2009)

RuskiSnajper said:


> Meh... not used to the TPU jokes yet. But anyway, my thoughts are correct: those OCers/benchers don't actually play a lot of games.
> 
> So you can say what matters is speed, but you don't see the, err, long-term differences and other experiences you get from keeping a single card for a year or so; you guys just switch to a new one every month. Which isn't anything bad and I'm okay with it; as you said, best of both worlds.



My friend and I do change hardware a lot. I wouldn't say every month, but for me it's about every 3 months.


----------



## Robert R (Jan 15, 2010)

Even so, let's say those are real. I would like to show you something...


----------



## Mussels (Jan 15, 2010)

Robert R said:


> Even so, let's say those are real. I would like to show you something...



It's fiction.

A 384-bit bus with 1/2GB of RAM? Ain't gonna happen on GDDR5.
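The arithmetic behind that objection can be sketched (my own illustration, not from the thread): each GDDR5 chip sits on a 32-bit channel, so the bus width fixes the chip count, and with uniform chip density the total capacity comes only in multiples of that count. Assuming the 1 Gbit (128 MB) and 2 Gbit (256 MB) densities common in 2009/2010, a minimal sketch:

```python
# Hypothetical helper: possible GDDR5 capacities for a given bus width.
# Assumes one 32-bit channel per chip (standard GDDR5) and uniform density.

def gddr5_capacities_mb(bus_width_bits, chip_densities_mb=(128, 256)):
    """Return the possible total memory sizes in MB."""
    chips = bus_width_bits // 32  # each GDDR5 chip uses a 32-bit channel
    return [chips * density for density in chip_densities_mb]

# A 384-bit bus means 12 chips: 1.5 GB or 3 GB, never a round 1 or 2 GB.
print(gddr5_capacities_mb(384))  # [1536, 3072]
# A 256-bit bus (8 chips) gives the familiar 1 GB / 2 GB options.
print(gddr5_capacities_mb(256))  # [1024, 2048]
```

Which is why rumored specs pairing a 384-bit bus with a round 1 GB or 2 GB of GDDR5 don't add up.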


----------



## Aceman.au (Jan 15, 2010)

Fermi was delayed because ATI's cards shat all over it... Now NVIDIA has to redesign its cards to be better...


----------



## [I.R.A]_FBi (Jan 15, 2010)

orally?


----------



## Mussels (Jan 15, 2010)

[I.R.A]_FBi said:


> orally?



buttrape, is my guess.



Everything is just guesses and speculation, but it's no surprise that if NV can't keep up with ATI's cards at a competitive price point (read: 10% faster is fine for 30% more expensive, lol), they do a redesign. Both sides of the fence do that when they have a sucky generation.


----------



## DrPepper (Jan 15, 2010)

I hope NV doesn't take too long; I kinda want one. I also want a 5xxx so I can do a comparison. Here's hoping I win the lottery, though.


----------

