
NVIDIA Pitches GeForce GTX 300 Series to Clinch Performance Crown

download 24/7.

Idle power consumption is important. Load, I couldn't care less about.

Then perhaps you should build a dedicated low-power download system like I have.
 
Get an Atom 330/ION in ITX format. It does everything you need, and you only need to turn your power-hungry rig on for games.
 
OK guys, enough trolling; let's talk about something useful. Leave the first tests and fake images aside; we are TPU members and we can guess the performance just from a GPU-Z readout.

1- Just like with the GT200 series, we found the GTX 260 a bit better than the 4870, and the same goes for the GTX 295 versus the 4870 X2.
2- Now maybe things are a bit different, because ATI stayed with 256-bit but increased the ROPs and texture units, while NVIDIA moved to GDDR5.
3- So ATI wins on core speed and NVIDIA wins with the 384-bit bus.
4- So for me, I think the old story is back again: NVIDIA has a bit better performance, and with both ATI and NVIDIA cards overclocked, I think NVIDIA clearly looks better.
5- Still, the most important thing is the price, and PPD ("performance per dollar").
6- Another important thing is whether there are games that make a high-end DX11 card worth getting.

Thanks for reading this, guys. This is my opinion and I'd like to hear yours; let's have a useful discussion.
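Point 5 above, performance per dollar, is easy to put in numbers. A minimal sketch, assuming made-up FPS and price figures for illustration (none of these are real benchmark results or confirmed prices):

```python
# Hypothetical performance-per-dollar comparison. The avg_fps and
# price_usd figures below are placeholder assumptions, not real data.
cards = {
    "GTX 360": {"avg_fps": 55.0, "price_usd": 500.0},
    "GTX 380": {"avg_fps": 70.0, "price_usd": 750.0},
    "HD 5870": {"avg_fps": 60.0, "price_usd": 400.0},
}

# Rank cards by FPS per dollar, best value first.
ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["avg_fps"] / kv[1]["price_usd"],
                reverse=True)

for name, c in ranked:
    ppd = c["avg_fps"] / c["price_usd"]
    print(f"{name}: {ppd * 100:.1f} FPS per $100")
```

With these placeholder numbers the cheaper card wins on value even while losing on raw FPS, which is exactly the point being argued.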
 
I didn't expect almost 50% more from Fermi, but the price of the GTX 380 will be almost double the HD 5870's; only the GTX 360 will be placed under it, I think around $300.

If NVIDIA keeps their pricing trend, the GTX 380 will be $700-800 and the GTX 360 will be $450-550.
 
Get an Atom 330/ION in ITX format. It does everything you need, and you only need to turn your power-hungry rig on for games.

Way too slow, I would kill it with just the browser :(

I like my Phenom II at 1 V / 3 GHz in M-ATX: low power, high performance.

High performance can still be found with rather low power consumption; the Atom may be cheap, but it's not good at all when you talk about performance per watt.
 
If NVIDIA keeps their pricing trend, the GTX 380 will be $700-800 and the GTX 360 will be $450-550.



I don't think so; maybe that's the pricing six months after release. I'd guess the GTX 380 at $850-900 and the GTX 360 at $600-700.
 
Fermi is new, and it's not a milked 285. The specs alone say that.

I've seen that, and as I said, Fermi is actually fitting the bill of what one would expect from a "new" design . . . long R&D, pushed-back release dates, etc., etc.

But although the specs point to something new, let's not forget that both camps are notorious for releasing specs that suggest a new architecture, only for it to turn out, near release, to be just a rehash of existing hardware.

Personally, I can't draw solid conclusions either way . . . not until this series is actually on the doorstep, so-to-speak.
 
Here comes Santa.....


 
See, that might be funny if NVIDIA had never really made a powerful GPU before...

just like Intel hasn't been able to yet.

Something more relevant might have been "A DX11 GPU..." or maybe an amusing quip about re-branding cards.

HA HA.
 
download 24/7.

Idle power consumption is important. Load, I couldn't care less about.

You don't need a dedicated GPU for that. An IGP will do just fine. Hell, my IGP plays L4D2 at decent settings. Again, idle power consumption isn't important unless you are talking about volume. Example: 50 computers in one building on the same bill. That's when idle consumption adds up.
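The "adds up at volume" point is simple arithmetic. A rough sketch, assuming a hypothetical 150 W idle draw, 20 idle hours a day, and $0.12/kWh (all made-up illustrative figures):

```python
# Rough sketch of why idle power only matters at volume: yearly
# electricity cost of idle draw across a fleet of machines.
# The wattage, hours, and $/kWh rate are illustrative assumptions.

def annual_idle_cost(idle_watts, machines=1, idle_hours_per_day=20,
                     usd_per_kwh=0.12):
    """Yearly cost (USD) of idle power draw for a group of machines."""
    kwh_per_year = idle_watts / 1000 * idle_hours_per_day * 365 * machines
    return kwh_per_year * usd_per_kwh

# One gaming rig idling: a modest line item on a home bill.
print(f"1 machine:   ${annual_idle_cost(150):.2f}/year")
# Fifty boxes on the same bill: the same draw becomes serious money.
print(f"50 machines: ${annual_idle_cost(150, machines=50):.2f}/year")
```

The per-machine figure is small, but multiplying by 50 turns it into thousands of dollars a year, which is the poster's point.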
 
Way too slow, I would kill it with just the browser :(

I like my Phenom II at 1 V / 3 GHz in M-ATX: low power, high performance.

High performance can still be found with rather low power consumption; the Atom may be cheap, but it's not good at all when you talk about performance per watt.
The Atom is meant to be low power under load, like 35 W max with the Atom/ION combo. Have you ever tried one? They actually aren't that slow, and the 330 is a dual core. If you seriously need dedicated graphics, grab the Zotac board with a PCI-E slot and shove a single-slot 260 in it.

I don't think so; maybe that's the pricing six months after release. I'd guess the GTX 380 at $850-900 and the GTX 360 at $600-700.
That would ONLY be the case if the 380 were a dual-GPU card. I don't expect the 380 to be more than $600, considering it's supposed to be single-GPU, and I don't think NVIDIA would be THAT stupid as to price it over ATI's 5970.

I've seen that, and as I said, Fermi is actually fitting the bill of what one would expect from a "new" design . . . long R&D, pushed-back release dates, etc., etc.

But although the specs point to something new, let's not forget that both camps are notorious for releasing specs that suggest a new architecture, only for it to turn out, near release, to be just a rehash of existing hardware.

Personally, I can't draw solid conclusions either way . . . not until this series is actually on the doorstep, so-to-speak.
You can rehash current hardware and give it more. This isn't simply a die shrink like G92b or all the 9000-series cards. Whether it's a completely new design or not, it's still something a lot different.
 
See, that might be funny if NVIDIA had never really made a powerful GPU before...

just like Intel hasn't been able to yet.

Something more relevant might have been "A DX11 GPU..." or maybe an amusing quip about re-branding cards.

HA HA.

Fixed :laugh::nutkick:
 
It's irrelevant whether they're fakes or not; I'm not a fan of any GPU maker, but I expect Fermi to come close to the "photoshopped" numbers, considering the officially released NVIDIA info about the architecture and other details. My 2 c.
 
Wow, nice res to benchmark games in; that's pretty weak.
 
1920 x 1200?

About 2% of PC gamers have a monitor larger than that.
 
I always ignore Fudzilla, mostly because FUD is in the name (fear, uncertainty, and doubt).

If you read it, it's actually not FUD, it's NH; they're usually OK with info.
 
Then why not just get a 9500GT or 5670 or something really cheap and try running games on high and see if you care about fps?

Fermi is new, and it's not a milked 285. The specs alone say that.

Yes, but I wasn't exactly talking about low-end cards, since I have one of the higher ones. Anyway, if I had something like a 4650, I obviously wouldn't behave or have a view like that.

I always ignore Fudzilla, mostly because FUD is in the name (fear, uncertainty, and doubt).

:wtf:

I like to read criticism, hard facts, suspicions, and "question everything" analysis, not adver-articles like you see in the mainstream "news". Not saying anything about TPU or other good PC sites in particular, but 3Dguru (IGN, GameTrailers...) always bring up some definitive stuff that supposedly only they know 100% about, like these screenshots, which don't look real; on some four sites I've seen comments about them being fake.
 
You can rehash current hardware and give it more. This isnt simply a die shrink like g92b or all the 9000 cards. Whether its a completely new design or not its still something alot different.

I agree - kinda what I was trying to get at . . . nVidia might be claiming "new," white sheets might hint at "new," but sometimes "new" is a stretch of the imagination :p

Either way, I'm definitely interested to see how these cards will perform. It's looking like the intense FPS battle between the two camps will continue for another series.
 
I agree - kinda what I was trying to get at . . . nVidia might be claiming "new," white sheets might hint at "new," but sometimes "new" is a stretch of the imagination :p

Either way, I'm definitely interested to see how these cards will perform. It's looking like the intense FPS battle between the two camps will continue for another series.

imperial, it's definitely new, definitely new for a GPU. It just takes a look at the white papers and a little comprehension.
 
Yes, but I wasn't exactly talking about low-end cards, since I have one of the higher ones. Anyway, if I had something like a 4650, I obviously wouldn't behave or have a view like that.



:wtf:

I like to read criticism, hard facts, suspicions, and "question everything" analysis, not adver-articles like you see in the mainstream "news". Not saying anything about TPU or other good PC sites in particular, but 3Dguru (IGN, GameTrailers...) always bring up some definitive stuff that supposedly only they know 100% about, like these screenshots, which don't look real; on some four sites I've seen comments about them being fake.
Lol, I was kidding with you :toast: I know not everybody wants to upgrade to the latest and greatest for a few FPS increase. I totally understand, but coming from a bencher/OCer point of view, an extra 1000 3DMarks never really hurt. (I don't really game much anymore.) Best of both worlds, imo. :cool:

I agree - kinda what I was trying to get at . . . nVidia might be claiming "new," white sheets might hint at "new," but sometimes "new" is a stretch of the imagination :p

Either way, I'm definitely interested to see how these cards will perform. It's looking like the intense FPS battle between the two camps will continue for another series.
Well, even if it isn't "new", if it is as good as everybody is claiming, completely stomping its predecessor and ATI's current gen, I wouldn't care whether it's "new". Sometimes out with the old, in with the new isn't the best-case scenario.
 
imperial, it's definitely new, definitely new for a GPU. It just takes a look at the white papers and a little comprehension.

Oh, I'm not trying to say that it isn't . . . like I mentioned earlier, everything is adding up to be such . . . but I can't agree for certain until it's actually released.

Both companies have pulled out hardware that looked brand new per the white sheets but actually turned out to be a simple rehash of existing hardware. The last instance was ATI between the 3000 and 4000 series.

I definitely agree the GTX 300 series is new, especially considering the delay in release, but I'd like to see the full reviews from trustworthy sites before I swear to that statement. :p


Well, even if it isn't "new", if it is as good as everybody is claiming, completely stomping its predecessor and ATI's current gen, I wouldn't care whether it's "new". Sometimes out with the old, in with the new isn't the best-case scenario.

Wish ATI would grasp this concept :roll:
 