# NVIDIA to Counter Radeon HD 6970 ''Cayman'' with GeForce GTX 580



## btarunr (Oct 15, 2010)

AMD is moving through its product development cycle at a breakneck pace; NVIDIA trailed it by months in the DirectX 11 and performance-leadership race. This November, AMD will release "Cayman," its newest high-end GPU, which is expected to outperform NVIDIA's GF100, and that is a serious cause for concern for the green team. NVIDIA is back to its old tactic of talking up GPUs that haven't even taken shape, to try to water down AMD's launch. Enter the GF110, NVIDIA's new high-end GPU under design, on which the GeForce GTX 580 is based.

The new GPU is speculated to have 512 CUDA cores, 128 TMUs, and a 512-bit wide GDDR5 memory interface holding 2 GB of memory, with a TDP close to that of the GeForce GTX 480. In the more immediate future, there are prospects of a more realistic-sounding GF100b, which is basically GF100 with all 512 of its CUDA cores enabled, while retaining its 384-bit GDDR5 memory interface and 64 TMUs, with a slightly higher TDP than the GTX 480's.

*View at TechPowerUp Main Site*


----------



## mdsx1950 (Oct 15, 2010)

LoL xD

Still no flagship GTX4xx card. And they are already thinking about the GTX 580


----------



## KaelMaelstrom (Oct 15, 2010)

GTX 580??? So fast?? Come on Nvidia, where is the granddaddy of the 400s, the GTX 490???


----------



## kkaddu (Oct 15, 2010)

The 580 IS the "490".
This was always the plan. Hope this time they come with triple-fan coolers only. lol


----------



## Atom_Anti (Oct 15, 2010)

Nvidia, you are so funny, I like you! However, you are going in too deep very soon.


----------



## the54thvoid (Oct 15, 2010)

btarunr said:


> It's back to its old tactics of talking about GPUs that haven't even taken shape, to try and water down AMD's launch



Says it right on the tin.


----------



## afw (Oct 15, 2010)

LOL ... this battle is epic ...


----------



## qubit (Oct 15, 2010)

I'm surprised at nvidia here. I mean why create a new GPU when you can just rebrand and call it "new"? I mean, shit, they've done it for the last 3 or 4 years now and taken the sucker's money, so why stop now?

/sarcasm


----------



## inferKNOX (Oct 15, 2010)

nVidia:





How on Earth can they chop down the GTX 480 and then put out the original GTX 480 as a GTX 580? What a move of desperation!
Just when I was annoyed with AMD acting like nVidia with the rebranding/renaming/repricing/whatever, nVidia proved themselves truly unique!


----------



## amschip (Oct 15, 2010)

I wonder if this one will still have wood screws, or if they will upgrade it to metal ones.


----------



## csendesmark (Oct 15, 2010)

Capital *L.O.L.*


----------



## crow1001 (Oct 15, 2010)

Apparently the 6970 will not be that much faster than a 480, maybe 10%, so it makes sense for Nvidia to bring out something that will take the fastest-single-GPU crown back from AMD ASAP, and it won't take much to do it.


----------



## RejZoR (Oct 15, 2010)

A 512-bit bus. We can assume it will have a 4-digit price tag...


----------



## Black Panther (Oct 15, 2010)

crow1001 said:


> Apparently the 6970 will not be that much faster than a 480, maybe 10%,




If that's true, it makes even less sense for AMD to release that 6970.........


----------



## H82LUZ73 (Oct 15, 2010)

Black Panther said:


> If that's true, it makes even less sense for AMD to release that 6970.........
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_1280_1024.gif



He must be thinking that the re-branding is for the dual card, when in fact it is a single GPU and will be about 15-20% faster than the 480. Heck, if Nvidia does release this 580, that makes me happy: price drops on the AMD cards.


----------



## Red_Machine (Oct 15, 2010)

Well, at least we have confirmation of the 500 series.  Anybody know when nVidia slated them for launch?  Was it next November?


----------



## SNICK (Oct 15, 2010)

There is no way they can make a 512-bit-bus mammoth on the 40nm node; most likely it would be a fully enabled GF100 core. Power consumption on the 40nm node would be close to 500 watts!!!
The 512-core GF100 is only 5.64% faster than the GTX 480.


----------



## HXL492 (Oct 15, 2010)

A TDP close to the GTX 480's???? I can see where this is heading....:shadedshu


----------



## crow1001 (Oct 15, 2010)

Black Panther said:


> If that's true, it makes even less sense for AMD to release that 6970.........
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_1280_1024.gif



Yes, very informative: one game at one low res. As of today the 480 is a good 20% faster than the 5870 in the majority of titles, and even more depending on how much AA is used.

45% quicker at a proper res and with AA.


----------



## qubit (Oct 15, 2010)

Black Panther said:


> If that's true, it makes even less sense for AMD to release that 6970.........
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_1280_1024.gif



This one looks like a driver glitch to me, BP. As lol-inducing as it is seeing a 4890 beating a GTX 480 in this test, you'd expect the 5870 to beat the 4890.

I reckon that with the latest drivers, you'd see the cards in the expected order of GTX 480, 5870 and the 4890 coming in last.

EDIT: BP edited her post with updated results (see her post 22) so it makes more sense. The link in my quote points to the original results.


----------



## csendesmark (Oct 15, 2010)

crow1001 said:


> Yes very informative, one game at one low res, as of today the 480 is a good 20% faster in the majority of titles over the 5870, even more depending on how much AA is used
> 
> 45% quicker at a proper res and with AA.
> 
> http://img259.imageshack.us/img259/2619/eeep.jpg



...and it's ~20% cheaper


----------



## Black Panther (Oct 15, 2010)

qubit said:


> This one looks like a driver glitch to me, BP. As lol-inducing as it is seeing a 4890 beating a GTX 480 in this test, you'd expect the 5870 to beat the 4890.
> 
> I reckon that with the latest drivers, you'd see the cards in the expected order of GTX 480, 5870 and the 4890 coming in last.



Now that you mention it, it did look a bit weird to me! 

Also, about the 5970 - 6970: I imagined it would be dual-GPU like its predecessor.

I'll edit my post for a more realistic graph, sorry!


----------



## crow1001 (Oct 15, 2010)

The DX11 card will be rendering with the DX11 code path and the DX10 cards with the DX10 path, therefore it is a lot more GPU-intensive on the DX11 cards.


----------



## MikeX (Oct 15, 2010)

512 shaders: this is probably the binned 512-shader GTX 480 that Nvidia has kept under wraps for so long.


----------



## crow1001 (Oct 15, 2010)

Black Panther said:


> Now that you mention it, I was seeing it a bit weird!
> 
> Also about the 5970 - 6970, I imagined it would be a dual-gpu like its predecessor
> 
> I'll edit my post for a more realistic graph, sorry!



Hardly more realistic, lol; that bench was run in DX9, not DX11, where the 480 owns the 5870.

So the 480 is 50% quicker at 1080p






http://www.xbitlabs.com/articles/video/display/asus-matrix-5870_9.html#sect1


----------



## caleb (Oct 15, 2010)

Goodie more releases = better prices on decent "older" cards.


----------



## bear jesus (Oct 15, 2010)

btarunr said:


> The new GPU is speculated to have 512 CUDA cores, *128 TMUs*, and a *512-bit wide GDDR5 memory interface holding 2 GB of memory*, *with a TDP of close to that of the GeForce GTX 480*.



They must have done something impressive with the 40nm process to enable the full 512 CUDA cores, over double the TMUs, a third more memory, and a one-third bigger bus, and still have it be close to the 480.
By close, I assume they do mean over 300W? What's that, a minimum of two 8-pin PCI-E connectors?

But still, I hope Nvidia gets going on these fast; more cards from both sides can only mean good things for the consumers (me ).


----------



## Tatty_One (Oct 15, 2010)

bear jesus said:


> they must have done something impressive with the 40nm process to enable a full 512 set of cuda cores, over double the TMUs and add a third more memory with a one third bigger bus and it sill be close to the 480.
> By close i assume they do mean over 300w? whats that a minimum of two 8 pin pci-e connectors?
> 
> But still i hope nvidia get going on these fast, more cards from both sides can only mean good things for the consumers (me ).



Motherboard slot - 75W
8-pin PCI-E - 150W
6-pin PCI-E - 75W

Total: 300W. And that won't be enough I think, so yeah, two 8-pins!
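That arithmetic is just the PCI-SIG per-source limits added up; a quick sketch (the wattage limits are the real spec values, the card configurations are only illustrative):

```python
# PCI-SIG power limits per source, in watts
POWER_SOURCES = {
    "slot": 75,      # PCIe x16 slot
    "6-pin": 75,     # 6-pin PCIe aux connector
    "8-pin": 150,    # 8-pin PCIe aux connector
}

def board_power_limit(connectors):
    """Total board power budget: the slot plus each aux connector."""
    return POWER_SOURCES["slot"] + sum(POWER_SOURCES[c] for c in connectors)

print(board_power_limit(["6-pin", "8-pin"]))  # 300 (GTX 480-style config)
print(board_power_limit(["8-pin", "8-pin"]))  # 375 (two 8-pins)
```

So anything rumored to pull more than 300W does indeed imply a two-8-pin board, which tops out at 375W within spec.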


----------



## buggalugs (Oct 15, 2010)

crow1001 said:


> Hardly more realistic lol, that bench was ran in DX9 and not DX11 where the 480 owns the 5870.
> 
> So 480 50% quicker at 1080p
> 
> ...



If you look through all the games and 3D benchmarks it's nowhere near 50%; the average is 10-20%. You were selective in picking the only game with a decent gap, just like the other guy. For the extra 10-20% you have a blast furnace in your comp and paid an extra $100-$200.

Nvidia is still trying to appeal to the lowest common denominator by making the fastest card at any cost. Sales of the 5xxx series should have taught them that most people are a bit more discerning and will weigh up heat, temps, computer noise and cost.

This thing with a 512-bit bus is going to cost more than the 480 and will most likely be over $800 outside the US. Looks like another fail.


----------



## the54thvoid (Oct 15, 2010)

Oh ffs.

Look, it's all "the GTX 480 is so good this" and "so good that".  This new GTX 5-whatever will so own AMD's next 6 series.

Let the adults know when the schoolyard spat is over and we can talk about the realities of performance.  Of course the 480 kicks the 5870's ass; it gobbles up much more power to do so.

AMD's task is, and has been, to deliver more fps per watt.  Nvidia has gone for all-out power.  

It'll be the same again.  When this mystical card gets released, it will probably be very compromised in some areas to allow it to actually work, and it'll still be more power-hungry.

The reason Nvidia hasn't got the single *card* performance crown is because, to date, nobody has got a handle on the power issues.  Don't hold your breath for this card.  

And my final prediction: the AMD 6970 will topple the GTX 480 as the fastest single chip.  Then some time down the line NV will release their next puppy and it will become king, and on and on and on.


----------



## HossHuge (Oct 15, 2010)

btarunr said:


> In the immediate future




I find this term very subjective...


----------



## stupido (Oct 15, 2010)

the54thvoid said:


> Oh ffs.
> 
> Look it's all, the GTX 480 is so good this and so good that.  This new GTX 5blah will so own AMD's next 6series.
> 
> ...


Quite right... I'm personally curious whether the GTX 580 will still be on the 40nm process. If it is not, then could all those power issues be overcome?


----------



## CDdude55 (Oct 15, 2010)

Hmm, it seems that when it's Nvidia news, everybody complains..:shadedshu

This is good news and I hope for the best so we can get some solid competition going; let's get some of these prices down!!!!


----------



## arnoo1 (Oct 15, 2010)

Will there be a new GTX 470 successor, something like a 475 or 570?
And I want price drops.

And that new speculated GTX 580 will be damn fast


----------



## DanTheMan (Oct 15, 2010)

I guess W1zzard had better get that second air conditioner ready for these upcoming video card reviews; it looks like it's gonna be a HOT winter!! Might have to add that nuclear power plant reactor to help juice these babies!


----------



## Mr McC (Oct 15, 2010)

CDdude55 said:


> hmm, It seems when it's Nvidia news, everybody complains..:shadedshu



I don't think it's that; it's just that it isn't really news: AMD are on the verge of launching the 68xx series, and Nvidia hopes to divert some attention away from the launch by stating that they will be releasing faster cards with better features in the future. We already assumed that.


----------



## Benetanegia (Oct 15, 2010)

I don't know where they took those specs from, but Nvidia will not release those things. They are absurd. 512-bit? If GF104 has taught us anything, it is that performance on Fermi cards depends mostly/only on shaders and is not by any means limited by ROPs/bandwidth.

The only probable GF110 specs are one of these (in order of probability):

1 - 3/2 (three halves) of a GF104, that is 3 GPCs (clusters)

specs: 576 SP, 96 TMU, 384-bit, core 750 MHz, < 500 mm^2

performance: GTX 480 + 25%

Possibility of a dual-GF104 card.

2 - GF104 with 4 SIMDs per SM instead of 3 (64 SPs instead of 48), 2 GPCs

specs: 512 SP, 64 TMU, 256-bit (it would be accompanied by 6 Gbps memory for about the same bandwidth as GF100), core 750-800 MHz, << 400 mm^2

performance: ~ GTX 480 +/- 5%

There would be a dual-GPU card based on this one.

3 - Combination of both: #2 but with 3 GPCs, or #1 with 4 SIMDs.

specs: 768 SP, 96 TMU, 384-bit, core 650-700 MHz, ~550 mm^2

performance: GTX 480 + 50%

#3 becomes possible thanks to the fact that TSMC's 40nm is said to have exceeded 55nm yields, and Nvidia not fecking up the fab like they did with GF100.

Also, #2 and #3 can easily exist at the same time, as well as a dual card based on #2, and a card based on the same but with only 1 GPC and 128-bit. That way:

GF110#2 (512 SP part) >>> Barts at a higher cost
GF110#3 (768 SP part) >>> Cayman at a higher cost
2xGF110#2 (2x512 SP) == Antilles at similar cost
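The bandwidth equivalence claimed for option #2 (256-bit with 6 Gbps memory roughly matching GF100) checks out with the standard formula; note the GTX 480's ~3.7 Gbps effective rate is its real spec, while the 6 Gbps figure is only this post's assumption:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bits x per-pin Gbps / 8."""
    return bus_width_bits * data_rate_gbps / 8

# GTX 480 as shipped: 384-bit at ~3.7 Gbps effective GDDR5
print(bandwidth_gb_s(384, 3.7))  # 177.6 GB/s
# Hypothetical GF110 option #2: 256-bit at the assumed 6 Gbps
print(bandwidth_gb_s(256, 6.0))  # 192.0 GB/s
```

So a narrower bus with faster memory would indeed land in the same ballpark, slightly above the 480.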



Mr McC said:


> I don't think it's that, it's just that it isn't really news: AMD are on the verge of launching 68XX series and Nvidia hopes to divert some attention away from the launch by stating that they will be releasing faster cards with better features in the future. We already assumed that.



I don't see *Nvidia* saying anything anywhere; all I see is a pair of websites speculating, based on the fact that talking about a possible Nvidia response now that AMD is releasing the HD 6000 series is going to be inflammatory and get them a lot of clicks.


----------



## wahdangun (Oct 15, 2010)

wow, I hope Nvidia won't take another 7 months to release these cards  and make a wood-screw version before it


----------



## Mr McC (Oct 15, 2010)

Benetanegia said:


> I don't see *Nvidia* saying anything anywhere, all I see is a pair of websites speculating, based on the fact that speaking about a posible Nvidia response now that AMD is releasing HD6000 is going to be inflamatory and obtain them a lot of clicks.



I see marketing.


----------



## beautyless (Oct 15, 2010)

Benetanegia said:


> I don't know from where did they take those specs, but Nvidia will not release those things. They are absurd. 512 bit?  If GF104 has taught anything, that is that performance on Fermi cards depends mostly/only on shaders and is not by any means based on ROPs/bandwidth.
> 
> The only probable GF110 specs are one of these (in order of probability):
> 
> ...



But I don't think they can make the 768 SP part, because it would be too big. 
And a 1x512 SP part should be faster and use more power than Barts, so a 2x512 SP card is hard to make.


----------



## HalfAHertz (Oct 15, 2010)

If they just release unlocked GTX 460s (as in all cores enabled) and GTX 480s, lower the voltage a bit so that they stay within the same TDP, and name them 466 and 485, Nvidia will be in a pretty good position imo.


----------



## CDdude55 (Oct 15, 2010)

Mr McC said:


> I don't think it's that, it's just that it isn't really news: AMD are on the verge of launching 68XX series and Nvidia hopes to divert some attention away from the launch by stating that they will be releasing faster cards with better features in the future. We already assumed that.



I'm just saying:




inferKNOX said:


> nVidia:
> http://i55.tinypic.com/wjjtvr.jpg
> How on Earth can they chop the GTX480 then put out the original GTX480 as a GTX580? What a move of desperation!
> Just when I was annoyed with AMD acting like nVidia with the rebranding/renaming/repricing/whatever, nVidia proved themselves truly unique!





RejZoR said:


> 512bit bus. We can assume it will be a 4 digit price tag...





HXL492 said:


> A TDP close to the 480GTX???? I can see where this is heading....:shadedshu





the54thvoid said:


> The reason Nvidia hasn't got the single *card* performance crown is bacause to date, nobody has got a hold of the power issues.  Don't hold your breath for this card.





wahdangun said:


> wow, i hope nvdia won't take another 7 month to release these card  and make wood screw version before it


----------



## btarunr (Oct 15, 2010)

HossHuge said:


> I find this term very subjective...



The next 3 months.


----------



## derwin75 (Oct 15, 2010)

*Nvidia GeForce GTX 580*

That's terrific news about the new GF110 Fermi card. But I read somewhere online that the GeForce GTX 580 will have 580 CUDA cores, not 512. Here is the statement:

( NVIDIA  announced a next-generation GeForce GTX 500 series , the first high-end models named GeForce GTX 580  based on the Fermi architecture.

The GeForce GTX 580 features 580 CUDA cores to go along with its "580" moniker and have a whopping 2560MB of 384-bit GDDR5 memory

"Although we're very proud of the GTX 480," NVIDIA President and CEO Jen-Hsun Huang said, "the 400 series is merely a tease for what Fermi can really accomplish. When we release the 500 series later this year, I think everyone will be pleasantly surprised. ) By Softpedia News.

Anyone care to explain this? In my opinion, I would rather buy a card with 580 CUDA cores instead of 512.


----------



## Red_Machine (Oct 15, 2010)

Later THIS year?  Good god...


----------



## Benetanegia (Oct 15, 2010)

beautyless said:


> But, I don't think they can make 768 SP part because it was too big.
> And 1x512 SP part should be faster and higher power than Bart so 2x512 SP is hard to make.



No, because you are not understanding how those 512 and 768 SP parts would be achieved. You are only adding a SIMD to every SM, and that does not add much die area to the SM, as can be seen from the fact that GF104 has 3/4 of the shaders, and the same TMU and SFU counts, crammed into less than 2/3 the die area of GF100:

GF100: 512 SP, 64 TMU, 64 SFU, 384-bit ==> 530 mm^2
GF104: 384 SP, 64 TMU, 64 SFU, 256-bit ==> 332 mm^2

GF110#2: 512 SP, 64 TMU, 64 SFU, 256-bit ==> (?) 375 mm^2 
GF110#3: 768 SP, 96 TMU, 64 SFU, 384-bit ==> (?) 375/2 * 3 = ~560 mm^2*

* GT200 was 576 mm^2, and both AMD and Nvidia, as well as TSMC, are saying that TSMC's 40nm node is finally as good as 55nm or 65nm.
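The 768 SP area estimate is straight linear scaling of the hypothetical two-cluster part; as a sketch (the mm^2 inputs here are the post's own guesses, not official die sizes):

```python
def scale_area_by_gpc(area_mm2, gpc_from, gpc_to):
    """Naive linear die-area scaling with GPC (cluster) count."""
    return area_mm2 * gpc_to / gpc_from

# Hypothetical 2-GPC, 512 SP "GF110#2", assumed ~375 mm^2 above
gf110_2 = 375.0
# Widening to 3 GPCs (768 SP) is the 375/2 * 3 step
print(scale_area_by_gpc(gf110_2, 2, 3))  # 562.5, i.e. the ~560 mm^2 figure
```

That keeps the speculative big chip under GT200's 576 mm^2, which is the point of the comparison.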


----------



## mdm-adph (Oct 15, 2010)

btarunr said:


> The next 3 months.



Oh, I don't doubt that in the next 3 months Nvidia will release either A) another press release, B) nothing, because the card will be delayed, or C) a card that was supposed to be called the GTX 495, rebranded as a GTX 580.


----------



## beautyless (Oct 15, 2010)

Benetanegia said:


> No because you are not understanding how those 512 and 768 SP parts would be achieved. You are only adding a SIMD into every SM and that does not add much die area to the SM, as can be seen from the fact that GF104 has 3/4 of the shaders, same TMU and SFU number, crammed into less than 2/3 the die area of GF100:
> 
> GF100: 512 SP, 64 TMU, 64 SFU, 382 bit ==> 530 mm^2
> GF104: 384 SP, 64 TMU, 64 SFU, 256 bit ==> 332 mm^2
> ...



Ahh! I see. I agree with you now. So GF100 is the failed config design, I think. GF104 is much better.


----------



## Benetanegia (Oct 15, 2010)

mdm-adph said:


> Oh, I don't doubt that in the next 3 months Nvidia will release either A) another press release B) nothing, because the card will be delayed, or C) a card that was supposed to be called the GTX 495, but rebranded as a GTX 580.



Yeah, because Nvidia has always been late and has disappointed, unlike AMD/ATI...


----------



## iamverysmart (Oct 15, 2010)

Maybe they can use the shaders of the GTX460/GTS450 to make the power consumption acceptable?


----------



## mdm-adph (Oct 15, 2010)

Benetanegia said:


> Yeah because Nvidia has always been late and has dissapointed, unlike AMD/Ati...



Now, in AMD's defense, nobody really ever thought the Fusion chip was going to come out.  And the R600 was their version of the Nvidia FX5900 "Space Heater Edition," and it performed that task wonderfully.


----------



## HalfAHertz (Oct 15, 2010)

derwin75 said:


> That's terrific news about the new GF110 Fermi card. But I read somewhere online stated that GeForce GTX 580 will have 580 Cuda cores not 512 Cuda cores. Here is a statement below:
> 
> ( NVIDIA  announced a next-generation GeForce GTX 500 series , the first high-end models named GeForce GTX 580  based on the Fermi architecture.
> 
> ...



I think that's bull, because 580 is not divisible by either 32 or 48, so that would mean they'd need a completely new architecture to achieve it...
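The divisibility point rests on Fermi's SM granularity: GF100 SMs carry 32 SPs and GF104 SMs carry 48, so any real shader count must be a multiple of one of them. A throwaway check (the SM widths are the real Fermi values; the function itself is just for illustration):

```python
def valid_sm_widths(total_sp, sm_widths=(32, 48)):
    """Which Fermi SM widths (SPs per SM) divide evenly into a shader count?"""
    return [w for w in sm_widths if total_sp % w == 0]

print(valid_sm_widths(580))  # [] -> impossible with either SM layout
print(valid_sm_widths(512))  # [32] -> 16 SMs of 32 SPs, GF100-style
print(valid_sm_widths(576))  # [32, 48] -> e.g. 12 GF104-style SMs of 48 SPs
```

580 leaves a remainder with both widths, which is why the "580 CUDA cores" figure reads as a made-up match for the model number.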


----------



## Naito (Oct 15, 2010)

This is ridiculous! Another half-assed re-brand from nVidia. Does this mean the whole 500 series is going to be full of other pointless re-brands as well? Disappointed, nVidia. I am an nVidia fan, and the only 2 cards I would even consider getting this series would be the GTX 460, and possibly the GTX 465 if unlockable to a GTX 470. They should concentrate on at least releasing this 'GTX 580' as a GTX 485 or something (or use 2x GF104 for a GTX 490?), and then improve upon the GF104 SP/cluster ratio/design, add more features, and make sure it doesn't guzzle too much juice, for the GeForce 500 series.

ATI looks more enticing everyday. Good for them with kicking nVidia up the arse, and increasing competition. Maybe one day I might build a rig with ATI in it.....


----------



## Naito (Oct 15, 2010)

derwin75 said:


> That's terrific news about the new GF110 Fermi card. But I read somewhere online stated that GeForce GTX 580 will have 580 Cuda cores not 512 Cuda cores.



It won't be possible. It would more likely be 576 CUDA cores.


----------



## the54thvoid (Oct 15, 2010)

This is just silly.

Everyone is now speculating about what NV's next big card is going to be...... Even though they don't even have it yet.  

There is only one reason for this 'outlet' of info - to make those about to shell out for 69xx cards stop and go, "oh, maybe i should wait for this.."

Shame on all of you for falling head over heels in love with an idea that is really an attempt at FUD to stall people buying AMD.

Even Benetanegia, with his well-informed tech speak, is just 'speculating'.  And that's not an insult, Ben  I'm just saying that one simple release has sparked all this rumour.  It's NV marketing doing what it does best: spreading doubt without foundation.

We really all need to realise both companies are out for our cash.  And they'll both fight nasty to get it.  

Please, let's stop falling for all this marketing shit.


----------



## derwin75 (Oct 15, 2010)

HalfAHertz said:


> I think that's bull because 580 is not dividable by neither 32 or 48, so that would mean they'd need a completely new architecture to achieve it...





This is why Nvidia is calling it the GF110, as a new architecture. I'm sure Nvidia has already achieved it, but we have to wait and see. For now, it is only a rumor.


----------



## Benetanegia (Oct 15, 2010)

the54thvoid said:


> This is just silly.
> 
> Everyone is now speculating about what NV's next big card is going to be...... Even though they don't even have it yet.
> 
> ...



Sorry man, that is not correct; you have to re-read the thread, mostly in its entirety. Most people are just whining about this thread/news. *I am* (and pretty much only me) speculating, and that's because I like speculating; just read any HD6000/Barts/Cayman thread... 



> It's NV marketing doing what it does best - spreading doubt without foundation.



That's common; SI/NI was also announced when Fermi was about to launch. Fusion has always been there "being the future" after every Intel release. Intel's bla, and Nvidia's bla, when AMD is releasing bla, and bla bla bla. Some people get hyped and some don't. I am one of those who doesn't really get hyped, ever, but I can't pass up any opportunity to speculate on something. What is disturbing is people who get offended by the fact that other people are hyped or not. Don't be one of those... 

EDIT: I say this ^^ because it's best for your sanity; you could start whining about these hyped people, next you would take on people who vote in elections hoping for change, then religion, and you'd end up yelling at children in the street, "Santa does not exist, you prick! It's your parents".


----------



## bear jesus (Oct 15, 2010)

the54thvoid said:


> This is just silly.
> 
> Everyone is now speculating about what NV's next big card is going to be...... Even though they don't even have it yet.
> 
> ...



I have to admit, though, it is nice to stop for a second to think about what Nvidia has got coming up in the future. I only expect to be choosing between AMD 5xxx, 6xxx or Nvidia 4xx cards for my coming upgrade, but still, it's nice to stop thinking constantly about the 6870/50 for a moment, like most of us have recently. 



Benetanegia said:


> Sorry man, that is not correct, you have to re-read the thread mostly on its entirety. Most people are just whinning about this thread/news. *I am* (and pretty much only me) speculating and that's because I like speculating, just read any HD6000/Barts/Cayman thread...



True, most are complaining about, or almost mocking, the idea of the specs


----------



## derwin75 (Oct 15, 2010)

Naito said:


> It wont be possible. It would be more like 576 cuda cores.




When it comes to graphic technology, anything is possible. So nothing is impossible. Nvidia Company is always full of surprises.


----------



## stupido (Oct 15, 2010)

the54thvoid said:


> This is just silly.
> 
> Everyone is now speculating about what NV's next big card is going to be...... Even though they don't even have it yet.
> 
> ...


meh...

calling for common sense is a bit useless when fanboyism kicks in... 

however, I personally do not see a need for an upgrade... just yet...
I usually upgrade when a game comes out that I can not play with my current setup.
For example, I upgraded from an 8800 GTS to a GTX 280 because of Crysis...  

So far I do not see any game coming that will tax my GTX 280 (not even overclocked) so badly that I will want to cash out for something new... and when that moment comes, I will not be looking at whether it is NV or AMD, but at which one plays "Da game" for the least cash...


----------



## Yellow&Nerdy? (Oct 15, 2010)

Erm, am I the only one who remembers the article about the "full-fledged" 512SP GTX 480 a couple of months ago? http://en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html 204W extra power consumption and 5% more performance. Not to mention the triple-slot, triple-fan cooler.

For this "GTX 580" to be possible, Nvidia has to make major changes to the GF100 architecture. And I don't see Nvidia having the resources to do that, since they just got done releasing their entry-level desktop cards, are still missing a dual-GPU card, and have only just released their first notebook graphics.


----------



## the54thvoid (Oct 15, 2010)

Benetanegia said:


> EDIT: I say this ^^ because it's best for your sanity, you can start whinning about these hyped people, next you would take on people who vote on elections hoping for change, next religion and you'd end up yelling at children in the streets "Santa does not exist you prick! It's your parents".



I'm not whining.  Please, for God's sake, don't say I'm whining.  I'm just like, "oh ffs, here we go again."

I don't whine; I'm too long in the tooth for that.

And I often 'argue' with religious types.  It's fun.  And when people say "what's Santa getting you for Christmas?" I often reply, hopefully not sexually abused again.

I have no sanity


----------



## CDdude55 (Oct 15, 2010)

stupido said:


> however, I personally do not see a need for an upgrade... just yet...
> I usually upgrade when a game comes that I can not play with my current setup.
> for example I upgraded from 8800 GTS to GTX280 because of crysis...
> 
> so far I do not see any game coming that will tax my GTX280 (not even overclocked) so bad that I will wish to cash out for something new... and when that moment comes, I will not be looking if it is NV or AMD but will look which one plays "Da game" for least cash...



This is probably the most logical and well thought out post i have read so far...



			
				the54thvoid said:
			
		

> We really all need to realise both companies are out for our cash. And they'll both fight nasty to get it.
> 
> Please, let's stop falling for all this marketing shit.



Agreed.


----------



## Hayder_Master (Oct 15, 2010)

If it's like this, it will kick the 6980's ass, and it will be very close to the 6990


----------



## Atom_Anti (Oct 15, 2010)

Yellow&Nerdy? said:


> Erm, am I the only one who remembers the article about the "full fledged" 512SP GTX 480 a couple of months ago?http://en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html 204W extra power consumption and 5% more performance. Not to mention the triple slot, triple fan cooler.



I can also remember that.



hayder.master said:


> if like this it will be kick 6980 ass, and it will be very close to 6990



This miracle Nvidia card is almost impossible, so nobody has to worry about it. I would say Nvidia is in deep shit now, and there won't be an easy answer to Cayman.


----------



## crow1001 (Oct 15, 2010)

buggalugs said:


> If you look through all the games and 3D benchmarks its no where near 50%, average is 10-20%.You were selective in picking the only game with a decent gap, just like the other guy. For the extra 10-20% you have a blast furnace in your comp and paid an extra $100-$200.
> 
> Nvidia is still trying to appeal to the lowest common denominator by making the fastest card any cost. Sales of the 5XXX series should have taught them that most people are a bit more discerning and will weigh up heat/temps/computer noise and cost.
> 
> This thing with a 512 bit bus is going to cost more than the 480 and will most likely be over $800 outside the US. Looks like another fail.



I had a 5870; it was a great card, but the 480 owns it. Minimum fps is way better, and yeah, max 80C temps. Don't worry, I'm sure your card will play the latest titles at max IQ for a good few months yet.

Regarding selective benchmarks: 50% faster in an AMD-endorsed DX11 game meant to show off the 5xxx series, 45% faster in DX11 Metro 2033. Nvidia: DX11 done right.


----------



## SNICK (Oct 15, 2010)

Much like the GTX 380.


----------



## CDdude55 (Oct 15, 2010)

Atom_Anti said:


> This miracle Nvidia card is almost impossible, so nobody have to worry about it. I would say Nvidia is in deep shit now and won't be easy answer to Cayman.



We know absolutely nothing about this card; how can someone rationally argue that "nobody has to worry about it" without just being a blatant fanboy?


----------



## LAN_deRf_HA (Oct 15, 2010)

This could be a horrendous failure for Nvidia. 512 shaders equated to only a 5% speed increase in the reviews I've seen of those one-off boards, and the bus will add what? 5% more if they're lucky? The 6970 is projected to be 10% faster than a 480, so it would have the same performance. Where Nvidia is going to lose massively is price. 2 GB of memory? Twice the bus width? Just to break even with a cheaper AMD card? That's going to have an atrocious profit margin, and if AMD puts any real price pressure on them, the card's sales could come to a complete halt. I'd estimate a $100-150 price discrepancy if Nvidia intends to make a profit, and they still won't claim the top title. Matching the 480's power draw means it too will be unfit for a dual-GPU board.


----------



## crow1001 (Oct 15, 2010)

CDdude55 said:


> We know absolutely nothing about this card, how can someone rationally argue that ''nobody has to worry about it'' without just being a blatant fanboy.



AMD fanboys like to make out they know more than the guys who design the Nvidia GPUs; fact is they know shit and are just out to troll NV threads.


----------



## wolf (Oct 15, 2010)

wow, for 3 pages, I think this thread has the most poop-talkers and nay-sayers I've seen at once.

so many comments about how it's not possible, not going to happen, don't even try, it will suck if they do.

since when was competition a bad thing? And I don't see anything that Nvidia themselves have said here; it's not an official announcement from them about a new series or anything, sheesh.

calm down people, why don't you save ripping on the card for when it arrives.


----------



## LAN_deRf_HA (Oct 15, 2010)

wolf said:


> wow, for 3 pages, I think this thread has the most poop-talkers and nay-sayers I've seen at once.
> 
> so many comments about how it's not possible, not going to happen, don't even try, it will suck if they do.
> 
> ...



This competition is a bad thing. The 580 will need to be very expensive, and rather than start a price war, AMD will just jack up the price of the 6970 to match, like they did with the 58xx series. This is not good for the consumer, because neither company is willing to start a price war outside of the mainstream. I'd expect both the 5970 and the 580 to be between $450-550 shortly after launch. At least without the 580, maybe the 480 would have finally gotten cheaper. Now it will just be discontinued.


----------



## CDdude55 (Oct 15, 2010)

As I have always said, TPU is generally biased towards ATI/AMD cards. People can disagree, but I've been here for around 3 years and have seen this crap run rampant for a while now, unfortunately..

As stated above, how about we wait for the card until we at least see its specs confirmed and have some official announcements from Nvidia themselves. But something tells me this type of irrational thinking will just happen again in those threads too..


----------



## dir_d (Oct 15, 2010)

I think Nvidia could get a GF100b out by March, or ditch the GF100 and scale up the GF108 core.


----------



## bear jesus (Oct 15, 2010)

CDdude55 said:


> As I have always said, TPU is generally biased towards ATI/AMD cards. People can disagree, but I've been here for around 3 years and have seen this crap run rampant for a while now, unfortunately..
> 
> As stated above, how about we wait for the card until we at least see its specs confirmed and have some official announcements from Nvidia themselves. But something tells me this type of irrational thinking will just happen again in those threads too..



Personally I think every tech site seems to be mainly filled with fanboys of every brand, but to be honest I expect nothing less, as the world seems to be mainly filled with people lacking common sense and logic.  All I can hope for is that some day people learn that everything should be judged on a product-by-product basis, not on the logo slapped on it (:laugh: never going to happen).


----------



## Atom_Anti (Oct 15, 2010)

CDdude55 said:


> We know absolutely nothing about this card, how can someone rationally argue that ''nobody has to worry about it'' without just being a blatant fanboy.



We actually know a lot about it, even more than about the upcoming HD 6900. GF110 = 512+ CUDA cores, 128 TMUs, 512-bit 2 GB memory. That means roughly a 700+ mm² chip, which is too big for TSMC's production line; the TDP would be a lot more than 300 W, and I could go on with technical problems, heat and price. No way! Nvidia is just trying to keep their fans hanging on somehow.


----------



## cis278 (Oct 15, 2010)

*what happened to nvidia*

Look at the products nvidia is putting out: their TDP is much higher than AMD's, and nvidia is cutting ROPs from their GPUs yet shipping the same or more CUDA cores than some last-gen DX10 cards (GTS 250, GT 240) while performing the same or worse. Luckily nvidia has a lock on PhysX, which still leaves much to be desired, or who knows where they would be sitting.


----------



## inferKNOX (Oct 15, 2010)

CDdude55 said:


> I'm just saying:


Ha ha ha, mine is not a complaint, but a bit of a giggle at nVidia's expense. Silly me, I never read through all the specs properly, else I would have seen the new jumbo bus, etc. I just saw the 512 CUDA cores and was beside myself with laughter, thinking it was the original GTX 480.

And actually I do hope that nVidia gives AMD a good kick in the butt! The fact that they want the 6800s to be GPUs that perform like 5800s is a testament to AMD's newfound complacency!:shadedshu
And yes, I do know it's because they're apparently moving their mid-range up from the x700s, but where is the sense in that?! Where's the space for the high end then... x900? Rubbish, there's not enough space there for mid-high, high and ultra-high end!
It's an overpricing ploy and nothing more IMHO, so yes nVidia, make them look stupid by making their "best single GPU card" dominance short-lived! Serves them (AMD) right for trying to be lax!

*phew*... rant over... for now!


----------



## CDdude55 (Oct 15, 2010)

Atom_Anti said:


> We actually know a lot about it, even more than about the upcoming HD 6900. GF110 = 512+ CUDA cores, 128 TMUs, 512-bit 2 GB memory. That means roughly a 700+ mm² chip, which is too big for TSMC's production line; the TDP would be a lot more than 300 W, and I could go on with technical problems, heat and price. No way! Nvidia is just trying to keep their fans hanging on somehow.



Sources?, links?


----------



## erocker (Oct 15, 2010)

CDdude55 said:


> Sources?, links?



First post of this thread. The two small links at the end of bta's post.


----------



## Atom_Anti (Oct 15, 2010)

CDdude55 said:


> Sources?, links?







erocker said:


> First post of this thread. The two small links at the end of bta's post.



Thank You!


----------



## CDdude55 (Oct 15, 2010)

erocker said:


> First post of this thread. The two small links at the end of bta's post.



Any that are not in German?..


----------



## Benetanegia (Oct 15, 2010)

Yellow&Nerdy? said:


> Erm, am I the only one who remembers the article about the "full fledged" 512SP GTX 480 a couple of months ago?http://en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html 204W extra power consumption and 5% more performance. Not to mention the triple slot, triple fan cooler.
> 
> For this "GTX 580" to be possible, Nvidia has to make major changes to the GF100 architecture. And I don't see Nvidia having resources to do that, since they just got done releasing their entry-level desktop cards, are still missing a dual-GPU card and have only released their first notebook-graphics.



That was not a card from Nvidia and was most probably FAKE. Or maybe not fake, but the pictures had the portion of the chip that states the revision number blurred. That card was probably a prototype using A1 silicon, and yes, we know A1 was not very good from the fact that A2 and A3 silicon were created. Pff. Only a clueless person can believe that enabling 5% more shaders (32 SPs) is going to consume 200 W more. By that *rule of dumb*, the GTX 465 with 352 SPs consumes nearly 200 W, so the GTX 480 with 128 SPs more would consume 1000 W... suuuure.
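The reductio here can be spelled out numerically; a sketch of the post's own arithmetic (the wattages are the rough figures quoted in the thread):

```python
# The claim being mocked: enabling 32 extra SPs (480 -> 512) costs ~200 W.
watts_per_sp = 200 / 32            # the implied 6.25 W per shader
gtx465_draw = 200                  # W, roughly what a GTX 465 (352 SPs) pulls
# Applying the same "rule of dumb" to the 465 -> 480 gap of 128 shaders:
gtx480_est = gtx465_draw + (480 - 352) * watts_per_sp
print(gtx480_est)                  # 1000.0 -- the absurd figure the post arrives at
```

A real GTX 480 is rated at around 250 W, so the linear-per-shader assumption overshoots by a factor of four, which is exactly the point being made.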



Atom_Anti said:


> We actually know a lot about it, even more than about the upcoming HD 6900. GF110 = 512+ CUDA cores, 128 TMUs, 512-bit 2 GB memory. That means roughly a 700+ mm² chip, which is too big for TSMC's production line; the TDP would be a lot more than 300 W, and I could go on with technical problems, heat and price. No way! Nvidia is just trying to keep their fans hanging on somehow.



We don't know anything. Those are fake specs coming from sites that are actually mentioning 4 different possibilities, each one making less sense than the next.



erocker said:


> First post of this thread. The two small links at the end of bta's post.



Yeah, sure, based on that we know everything about GF110...

So I follow the first link http://www.3dcenter.org/news/2010-10-13 and which of the specs mentioned there are true exactly?

1- Fully enabled GF100?
2- 512 SP, 128 TMU, 512 bit?
3- 576 SP, 96 TMU, 384 bit?
4- Two GF104s in a die? 768 SP, 128 TMU, 512 bit

The truth is they are completely clueless. And it's obvious they have no source of any kind, which becomes apparent from the fact that they are posting 4 different possible configs. If you had a source, or if Nvidia PR were behind it, there would be *one* spec posted, wrong or not, fake or not, hype or not. But 4? Come on...


----------



## Atom_Anti (Oct 15, 2010)

Whatever it is, Nvidia can only answer with the next production stepping. That means next summer or fall.


----------



## the54thvoid (Oct 15, 2010)

Ya, neine, ich bein ein...something and all that.

One of the scenarios points to a 768-shader GTX 580 with about 50% more performance than the GTX 480.

Excuse me for laughing out my cornflakes.

We're talking about the 40nm process here.  A GF100 variant (unlike the super GF104) isn't very practical.

It took many months to release a crippled GTX 480 (480 cores, not 512).  They've yet to manage a 512-core variant.  Yes, they probably will.  But surely it requires a redesign, which would be a bit odd seeing as they are working on Kepler now.

Or will Kepler be a different design team, as the Fermi design team managed to miss a few things in the design-to-manufacture process (just saying what JSH said).

Can I also state that I think it's absolute bollocks that AMD have kept 58xx series prices so high.  If NV can make a chip like the nonsense we're all talking about, it'll cost a fortune. Just like the 69xx surely will.


----------



## the54thvoid (Oct 15, 2010)

Benetanegia said:


> The truth is they are completely clueless. And it's obvious they have no source of any kind, which becomes apparent from the fact that they are posting 4 different possible configs. If you had a source, or if Nvidia PR were behind it, there would be one spec posted, wrong or not, fake or not, hype or not. But 4? Come on...



Completely with you there dude.

Physical products and real info is the mana of tech, not rumour and superstition.


----------



## Benetanegia (Oct 15, 2010)

GF100 failed because of a very specific problem that has already been fixed (the fabric). All this "Nvidia couldn't do 512 SPs, much less more of them" BS needs to stop (I'm not talking to anyone specifically). There was a problem and it has been fixed. (2 problems actually, if you count the bad TSMC process as well, and that was something even AMD suffered.) AMD couldn't make R600 work well; it was a 720 million transistor behemoth (at the time) that was creamed by the 680 million transistor competitor (8800 Ultra) by as much as a 40% performance lead. Months later the 959 million transistor and 2.25x more horsepower RV770 was born. Stop saying stupid things please...



Atom_Anti said:


> Whatever it is, but Nvidia can answer only with the next production stepping. That means next summer/fall times.



Erm, no. There's one that can be released already, and that is the 3rd one they mentioned, or the first one I mentioned. That one is 3/2 of a GF104, and it could have been in the making since the first GF104 silicon came back from TSMC 6 months ago and they realized how much better than GF100 it was, meaning full production chips could be on their way already. You could bet that Nvidia *could* be making this, since Nvidia once again allocated 70%+ of TSMC's 40nm capacity, and capacity now is about 4x higher than it was a year ago. Only for GF106 and GF108? Most probably, but is there certainty that there's nothing more? Methinks not.

That one is not only doable, but a lot more doable than GF100 and better than GF100 in every possible way, and it doesn't require Nvidia going back to the drawing board at all. 50% more GPCs/shaders means less than 50% more silicon, because the PCIe interface, the video decoder portion of the chip and many other things are already there. In any case, if we take the most pessimistic number of a 50% increase in silicon, we end up with a 2.8 billion transistor, 480 mm² chip. Likewise, 50% more power consumption means 225-275 W.

Basically, "GF110" vs GF100:

transistors: 2.8 vs 3.1 billion
die area: 480 mm² vs 530 mm²
shaders: 576 vs 480
TMUs: 96 vs 64
memory bus: 384-bit vs 384-bit
power: 275 W vs 320 W
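The 3/2-of-GF104 estimate can be sanity-checked with a trivial scaling sketch (the inputs are the full-GF104 numbers from the thread; the flat 1.5x factor is the post's pessimistic assumption, since shared blocks would not really be duplicated):

```python
# Hypothetical "GF110" built as 3/2 of a full GF104, per the post.
gf104 = {"transistors_bn": 1.95, "shaders": 384, "tmus": 64}

# Pessimistic case: scale everything by the full 1.5x, even though shared
# blocks (PCIe interface, display, video decode) exist only once per chip.
scale = 1.5
gf110_est = {k: v * scale for k, v in gf104.items()}
print(gf110_est["shaders"], gf110_est["tmus"])  # 576.0 96.0 -- the rumored config
```

Even this worst case lands near the post's ~2.8 billion transistor figure, which is why the config is argued to be feasible on 40nm.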


----------



## HalfAHertz (Oct 15, 2010)

Benetanegia said:


> GF100 failed because of a very specific problem that has already been fixed (the fabric). All this "Nvidia couldn't do 512 SPs, much less more of them" BS needs to stop (I'm not talking to anyone specifically). There was a problem and it has been fixed. (2 problems actually, if you count the bad TSMC process as well, and that was something even AMD suffered.) AMD couldn't make R600 work well; it was a 720 million transistor behemoth (at the time) that was creamed by the 680 million transistor competitor (8800 Ultra) by as much as a 40% performance lead. Months later the 959 million transistor and 2.25x more horsepower RV770 was born. Stop saying stupid things please...
> 
> 
> 
> ...



If only these things worked so linearly...

Ok, let's think logically: all these new shaders have to be connected to the L2 cache, and those connections take some space. Then you'll probably need a larger (and/or faster) L2, unless you want to leave all those shiny new cores starved for data. Then the back-end with 50% more TMUs will need to be rewired, and all of these changes will need to be tested over and over and over... I dunno man, you make it sound a lot easier than it actually is.

I still think that if they play around with the existing cores and release fully enabled ones, Nvidia should pretty much be in the clear. Get rid of the 465 because it sucks donkey a$$, release a 466 based on the full GF104 core, get rid of the 470 because the 466 will eat it up, move the 480 down to a 475 and have a 512-core 485 at the top of the line. Play with prices and voltages a bit and et voila, problem solved.


----------



## Athlon2K15 (Oct 15, 2010)

The fact that this thread is pure speculation makes me


----------



## cadaveca (Oct 15, 2010)

AthlonX2 said:


> The fact that this thread is pure speculation makes me



Just some nV soul-sucking on the 6-series launch. I'd be more concerned if it DIDN'T happen.


----------



## Athlon2K15 (Oct 15, 2010)

nvidia isn't going to release anything until the 2nd quarter of next year; they have no reason to.


----------



## Benetanegia (Oct 15, 2010)

HalfAHertz said:


> If only these things worked so linearly...
> 
> Ok, let's think logically: all these new shaders have to be connected to the L2 cache, and those connections take some space. Then you'll probably need a larger (and/or faster) L2, unless you want to leave all those shiny new cores starved for data. Then the back-end with 50% more TMUs will need to be rewired, and all of these changes will need to be tested over and over and over... I dunno man, you make it sound a lot easier than it actually is.



Nothing is wired the way you say. Fermi is 100% modular and every step was designed so that you can add anything in LEGO fashion, from SIMDs, to SMs, to complete GPCs. Buffers and pooled buses are placed between every step for that purpose, and the performance penalty that Fermi suffers in per-SP performance compared with G80/G92/GT200 supposedly comes from this re-alignment. The trade-off was made (just like when Ati created R600); now it's time to add the components that actually do the work.

Yeah, maybe it's not as easy as I made it out to be, but it certainly isn't as difficult as a completely new chip. It's been 6+ months since GF104 was finished (not released), and 6 months is more than enough to make that thing and then some. Besides, forget about release times; it's internal timelines we have to look at, and those are unknown. Release dates for GF104, 106 and 108 were not based on when the design was finished, but on when enough of them could be made for a proper release, *without eating into production of the chips that make the most money*, that is, the higher-end ones. Bottom line: the Fermi derivatives were probably almost finished even before GF100 cards were released. Enough time for anything.

EDIT: And no, you don't need more L2. Fermi has much more L2 than any GPU will ever need. The reason is GPGPU (GF100 is and will always be the GPGPU chip, just like G80 was always the GPGPU part; G92 only ever existed as a gaming chip). Is GF104 showing any decrease in performance due to less L2 per SP? No, not a single 1%. And that's with 50% more SPs per SM added. Adding another 16 SPs, a 33% increase, is not going to change that either.

Also:



> If only these things worked so linearly...



They don't indeed, but it's actually the other way around from what you are suggesting, and in absolute favor of the "3/2 GF104" GF110:

- Doubling execution units almost never doubles transistor count or die area, especially die area. And you waste less area in "margins" (I know there's a term for that). e.g.:

Ati

Redwood = 627 million
Juniper = 1040 million
Cypress = 2150 million; more than twice, yes, but it doesn't count because it has at least one massive difference: it supports 64-bit, while Juniper and below don't.

RV730 = 514 million (remember, 320 SPs)
RV740 = 826 million (640 SP)
RV770 = 956 million (800 SP)

Nvidia

GF108 = 585 million
GF106 = 1170 million
GF104 = 1950 million

- Power requirement increases are almost always lower than the actual active-transistor increase.
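The sub-linear scaling claimed here is easy to check against the numbers in the list; a sketch computing transistors per shader across the RV7xx line (counts as given in the post):

```python
# Transistors per SP fall as the chip grows: fixed-function blocks (PCIe,
# display, video decode) are paid for once, not per shader.
chips = {                 # name: (transistors in millions, stream processors)
    "RV730": (514, 320),
    "RV740": (826, 640),
    "RV770": (956, 800),
}
for name, (xtors, sps) in chips.items():
    print(name, xtors / sps)   # ratio falls from ~1.61 down to ~1.20
```

The per-shader transistor cost drops steadily as the shader count rises, which is the "less than 50% more silicon for 50% more units" argument in miniature.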


----------



## CDdude55 (Oct 15, 2010)

Like any other company, they are trying to build hype for themselves and to dampen the hype of competitors who are prepping a product launch; nothing new.


----------



## the_pharaoh (Oct 15, 2010)

Total fail, not news. The two sites linked as "sources" cite each other as sources. Give me a break...

As far as I'm concerned until cards are on shelves it's all a bunch of hot air. NVIDIA has zero credibility left with me after the Fermi "launch".


----------



## CDdude55 (Oct 15, 2010)

the_pharaoh said:


> Total fail, not news. The two sites linked as "sources" cite each other as sources  give me a break...
> 
> As far as I'm concerned until cards are on shelves it's all a bunch of hot air. NVIDIA has zero credibility left with me after the Fermi "launch".



The second site actually lists more than one source, including that article from 3DCenter.

3DCenter says this in that article about its source:





> ''However, the source of the new information on the GF110 chip can be trusted; it is the same one that previously gave us the specifications of the AMD chips Barts and Cayman, which have now apparently turned out to be correct.''



So who knows..

And as for the Fermi comment.... i like my GTX 470.


----------



## v12dock (Oct 15, 2010)

Are they going to be able to make the chip this time....


----------



## HalfAHertz (Oct 15, 2010)

Benetanegia said:


> Nothing is wired the way you say. Fermi is 100% modular and every step was designed so that you can add anything in LEGO fashion, from SIMDs, to SMs, to complete GPCs. Buffers and pooled buses are placed between every step for that purpose, and the performance penalty that Fermi suffers in per-SP performance compared with G80/G92/GT200 supposedly comes from this re-alignment. The trade-off was made (just like when Ati created R600); now it's time to add the components that actually do the work.
> 
> Yeah, maybe it's not as easy as I made it out to be, but it certainly isn't as difficult as a completely new chip. It's been 6+ months since GF104 was finished (not released), and 6 months is more than enough to make that thing and then some. Besides, forget about release times; it's internal timelines we have to look at, and those are unknown. Release dates for GF104, 106 and 108 were not based on when the design was finished, but on when enough of them could be made for a proper release, *without eating into production of the chips that make the most money*, that is, the higher-end ones. Bottom line: the Fermi derivatives were probably almost finished even before GF100 cards were released. Enough time for anything.
> 
> ...



Well I guess we'll have to wait and see... a while... has anyone read any good books lately?


----------



## erocker (Oct 15, 2010)

I don't expect to see this "GTX 580" for a while. I don't see how a 512-core GTX 480 is going to happen, as it has already happened and failed (6-12% performance increase and horrible power usage). Nvidia is going to need to be creative for the next half-year or so. They need to work with the GF104 chip as much as they can until 28nm is ready. It's freaking sad. I hope they can get something competitive out, otherwise prices aren't going to be very competitive at all, all around.


----------



## bear jesus (Oct 15, 2010)

Really, all I expected from nvidia in the near future was a refresh of the current cores with everything enabled, possibly from binned chips, or possibly a refresh similar to the 280-to-285, although I thought that refresh was basically a drop from 65nm to 55nm, which is why I was not expecting anything much past full-fat GF100 and GF104 chips.
I was expecting nothing special from them until they move on to 28nm. I would have said the same for AMD, but it seems the 6xxx cards may prove to be quite nice; only time will tell.



erocker said:


> I don't see how a 512-core GTX 480 is going to happen, as it has already happened and failed (6-12% performance increase and horrible power usage).



Do you think well-binned chips could possibly have everything enabled and not chow down a massive amount more power?


----------



## erocker (Oct 15, 2010)

bear jesus said:


> Really, all I expected from nvidia in the near future was a refresh of the current cores with everything enabled, possibly from binned chips, or possibly a refresh similar to the 280-to-285, although I thought that refresh was basically a drop from 65nm to 55nm, which is why I was not expecting anything much past full-fat GF100 and GF104 chips.
> I was expecting nothing special from them until they move on to 28nm. I would have said the same for AMD, but it seems the 6xxx cards may prove to be quite nice; only time will tell.
> 
> 
> ...



It's simply not cost efficient. This is what a "well binned" chip gets you: http://en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html/1


----------



## bear jesus (Oct 15, 2010)

erocker said:


> It's simply not cost efficient. This is what a "well binned" chip gets you: http://en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html/1



Ok then, agreed: they need to work with the GF104 cores and try to make something pretty special. I wonder how a full-fat (preferably overclocked) GF104 core would do on a dual-chip card.


----------



## Steevo (Oct 15, 2010)

At 1050 MHz core my 5870 is an excellent chip, and it kicks ass for the less than $300 I paid for it. Unfortunately for Nvidia, when ATI releases their next set, the price drop on the 5XXX series will widen the performance-per-dollar gap back out in ATI's favor, and AMD will also have the fastest set of cards while consuming less power.


So, AMD / NOM NOM NOM /Nvidia profits......


----------



## dj-electric (Oct 15, 2010)

I can't believe Nvidia is actually doing this; it's so pathetic. It just shows how desperate Nvidia is to make a move against the HD 6000 series.


----------



## mtosev (Oct 15, 2010)

nvidia is full of crap.start making good graphic cards or STFU


----------



## NC37 (Oct 15, 2010)

H82LUZ73 said:


> he must be thinking that the re-branding is for the dual card.When in fact it is single gpu and will be about 15-20% faster then the 480.heck if nvidia does release this 580 that makes me happy ,price drops on the AMD cards.



Yeah, just like we all saw happen when NV released the 4 series. Heh, if that had happened, I'd prolly have bought a Radeon 5 series instead of a GTX 460. ATI kept prices high instead, and NV gave me the price/performance I wanted, which ATI couldn't do.

ATI has the lead now. The need for them to be competitive with NV on price isn't very high until NV can show them up. NV, on the other hand, needs to cut prices. I don't think they'll be retaking the performance title with a rushed board like the 580; the thing is liable to run extremely hot and power hungry.


----------



## fullinfusion (Oct 15, 2010)

God knows I love this VIDEO 
Next....


----------



## EastCoasthandle (Oct 15, 2010)

Unless they do a major rework, their counter would be nothing more than a refresh, i.e. 480 to 485 (even though they may not call it that).


----------



## jasper1605 (Oct 15, 2010)

fullinfusion said:


> God knows I love this VIDEO
> Next....



I can't tell you how many times I've watched that video lol.

And to the haters out there: I'm glad nvidia is trying to release something to counter AMD. Why? Competition is good. Who cares if their card produces more heat than the sun; if it's out there, AMD will need to take note of it and price more aggressively, so that's a win in my book.


----------



## CDdude55 (Oct 15, 2010)

Dj-ElectriC said:


> I can't believe Nvidia is actually doing this; it's so pathetic. It just shows how desperate Nvidia is to make a move against the HD 6000 series.





mtosev said:


> nvidia is full of crap.start making good graphic cards or STFU





fullinfusion said:


> God knows I love this VIDEO
> Next....



:shadedshu


----------



## bear jesus (Oct 15, 2010)

fullinfusion said:


> God knows I love this VIDEO
> Next....



I almost spat my coffee out over my htpc at the "CUDA will fix everything" line.  I hadn't watched that one before. Thanks for the link, but damn you at the same time, as my htpc is out of its case, so you almost made me kill it.



CDdude55 said:


> :shadedshu


Aww, come on, you have to be able to laugh at that video even if it's not exactly true.  But the one about them not making good cards is a flat-out lie, as nvidia is making good cards; the 460 is tempting me back to the green side with some SLI action, and that's really saying something, as I'm against multi-card setups for my own use.


----------



## fullinfusion (Oct 15, 2010)

EastCoasthandle said:


> Unless they do a major rework, their counter would be nothing more than a refresh, i.e. 480 to 485 (even though they may not call it that).


Umm, you mean like this VIDEO


----------



## KainXS (Oct 15, 2010)

512-bit in 2010, with 512 SPs and 128 TMUs, when the GTX 480 has 480 SPs and 60 TMUs already . . . . "cough cough bullshit"


All they're doing is trying to sway people away from buying a 69XX when it comes out, with the same bullshit they pulled with the GTX 480 when the 5870 dropped.

rumors can't be trusted, we all know that lol


----------



## bear jesus (Oct 15, 2010)

KainXS said:


> All they're doing is trying to sway people away from buying a 69XX when it comes out, with the same bullshit they pulled with the GTX 480 when the 5870 dropped.



And ATI/AMD has never tried to steal some of Nvidia's thunder when they've released a card before them in the past? This is business as usual; both companies just want our money.


----------



## KainXS (Oct 15, 2010)

well that's how you make money


----------



## CDdude55 (Oct 15, 2010)

bear jesus said:


> Aww come on you have to be able to laugh at that video even if its not exactly true  but the one about them not making good cards is a flat out lie as nvidia is making good cards, the 460 is tempting me back to the green side with some sli action and thats really saying something as i'm against multi card setups for my own use.



I've seen that video; it's definitely funny, and there's no problem picking on manufacturers for their faults a bit. I'm just getting tired of the constant ignorance I'm seeing. I'm probably gonna have to stop looking at the comments on these threads (mainly the Nvidia ones) and just read the news. Very discouraging.


----------



## fullinfusion (Oct 15, 2010)

CDdude55 said:


> I've seen that video; it's definitely funny, and there's no problem picking on manufacturers for their faults a bit. I'm just getting tired of the constant ignorance I'm seeing. I'm probably gonna have to stop looking at the comments on these threads (mainly the Nvidia ones) and just read the news. Very discouraging.


Hey CD, I'm just having fun atm. But really, if the shoe was on the other foot...

I do hope, however, that Nvidia really puts a kick-ass GPU together to compete with the new-gen AMD GPU. But whatever; the thing I love is knowing that both companies are pushing things forward, and I can't wait to see what the cards will be like now and in another 3-5 years.


----------



## bear jesus (Oct 15, 2010)

CDdude55 said:


> I've seen that video; it's definitely funny, and there's no problem picking on manufacturers for their faults a bit. I'm just getting tired of the constant ignorance I'm seeing. I'm probably gonna have to stop looking at the comments on these threads (mainly the Nvidia ones) and just read the news. Very discouraging.



I understand. To me this all seems that much more stupid, as half the stuff in the nvidia threads is exactly what was being said about ATI back in the days of the 2900.  It seems most people have a memory that doesn't last more than a couple of years.



fullinfusion said:


> I do hope, however, that Nvidia really puts a kick-ass GPU together to compete with the new-gen AMD GPU. But whatever; the thing I love is knowing that both companies are pushing things forward, and I can't wait to see what the cards will be like now and in another 3-5 years.



Right now I'm really interested to see what both companies do with 28nm chips, preferably in about a year.


----------



## OneCool (Oct 15, 2010)

Just in time for Home Depots wood screw sale :shadedshu


----------



## CDdude55 (Oct 15, 2010)

You have to let the performance do the talking. Nvidia and AMD are just names; they're companies that don't give a shit about us beyond the green paper in our wallets. I'm very surprised to see such energy and negative enthusiasm for something Nvidia hasn't even acknowledged exists yet. I'm always skeptical and have my doubts, but I never complain or whine about something that's still up in the air. Even if its predecessor was a bit inefficient, to me that means they have a chance to make sure their game is tight this time around, and if they claim they can pack in a 512-bit bus with 2 GB of memory and 512 SPs and all the other goodies, then have at it, because I want to see it.

You need competition to get innovation, but counterproductive, nonsensical, uninformed statements like ''nvidia is full of crap.start making good graphic cards or STFU'' running rampant slowly bring down the high standards I have for a lot of the visitors here at TPU and the site in general.

But maybe that's just me...


----------



## Bundy (Oct 15, 2010)

CDdude55 said:


> Seen that video, definitely funny and there is no problem picking on manufacturers for their faults a bit. I'm just getting tired of the constant ignorance im seeing. I'm probably gonna have to stop looking at the comments of these threads(mainly Nvidia ones) and just read the news for them. Very discouraging.



This stuff discourages me too. I come here to read about tech, and I can't ignore a thread like this; it's too attractive. Unfortunately for me, trolls like these threads too.

On topic: based on Benetanegia's most useful speculation, this card does indeed seem feasible and has a lot of wow factor.

I agree with other posters in questioning the need for such a card right now. Within the current lineup of GPUs from both AMD and Nvidia, one does not need to purchase the highest-performing unit in order to max out the settings in most games. Increasing performance even more will just shift gaming customers into a lower relative market position.

Both Nvidia and AMD had better pray for a new generation 'crysis', otherwise they will be selling gamers $100-$150  low end cards instead of $250 mid range cards. Are they hoping 3D will do this for them? maybe but what else? I cant see the commercial payback for these guys if they get too far in front of the market. Otherwise, sometime between now and years end, a new game will come out that can stop all our current rigs dead in their tracks and have us ordering a new $400 card for kicks.


----------



## erixx (Oct 15, 2010)

I second that entirely CD and BUNDY, and..... JUST PREORDERED THE GTX580 CARD


----------



## Bundy (Oct 15, 2010)

erixx said:


> I second that entirely CD and BUNDY, and..... JUST PREORDERED THE GTX580 CARD



lol now for the Crysis II 3D FTW benchmark thread started by erixx


----------



## erocker (Oct 15, 2010)

CDdude55 said:


> You have to let the performance do the talking. Nvidia and AMD are just names; they're companies that don't give a shit about us beyond the green paper in our wallets. I'm very surprised to see such energy and negative enthusiasm for something Nvidia hasn't even acknowledged exists yet. I'm always skeptical and have my doubts, but I never complain or whine about something that's still up in the air, even if its predecessor was a bit inefficient. To me that means they have a chance to make sure their game is tight this time around, and if they claim they can pack in a 512-bit bus with 2 GB of memory, 512 SPs, and all the other goodies, then have at it, because I want to see it.
> 
> You need competition to get innovation, but when counterproductive, nonsensical, uninformed statements like ''nvidia is full of crap. start making good graphic cards or STFU'' run rampant, it slowly brings down the high standards I have for a lot of the visitors here at TPU and the site in general.
> 
> But maybe that's just me...



...and if it's not full of people making counterproductive remarks about articles that hold little fact and/or are based on rumors, it's full of people defending a company because of a purchase from that company. Welcome to internet tech forums. Don't feel that you need to defend a corporation; they have the money and resources to do it themselves. Be happy with whatever you purchase from whatever brand you decide to go with. Let's face it: Fermi was a disaster for Nvidia. Delays, BS press releases, wood screws, etc. They saved face, disabled some features/shaders/whatever, and brought out a late product that competes with the competition on a performance level but not at a business level. The GTX 460 is a start for Nvidia, but they really need to progress further.


----------



## dalekdukesboy (Oct 15, 2010)

buggalugs said:


> For the extra 10-20% you get a blast furnace in your comp and pay an extra $100-$200.



Um, at this point only an idiot would pay more than $100 extra for a 480 over a 5870... the cheapest 5870 is still in the $380 range, with plenty going into the mid $400s, and you can get a 480 now for $440-$500 at most... so even the extreme difference is barely over $100, and most of the cards are within $50 or so...


----------



## CDdude55 (Oct 15, 2010)

erocker said:


> ...and if it's not full of people making counterproductive remarks about articles that hold little fact and/or are based on rumors, it's full of people defending a company because of a purchase from that company. Welcome to internet tech forums. Don't feel that you need to defend a corporation; they have the money and resources to do it themselves. Be happy with whatever you purchase from whatever brand you decide to go with.



Not defending anyone; as I said, they're just names to me, with little to no bearing on the products that get put out, and I think that's what people are forgetting. I agree: if you prefer a brand, have at it, it's your money. And if you do have a preference and can articulate why in a thread like this, go for it. 



			
erocker said:


> Let's face it: Fermi was a disaster for Nvidia. Delays, BS press releases, wood screws, etc. They saved face, disabled some features/shaders/whatever, and brought out a late product that competes with the competition on a performance level but not at a business level. The GTX 460 is a start for Nvidia, but they really need to progress further.



Meh, shit happens, you have to hope they learned from their mistakes and will make it better to the end user the next round.


----------



## cadaveca (Oct 15, 2010)

CDdude55 said:


> Meh, shit happens, you have to hope they learned from their mistakes and will make it better to the end user the next round.




Considering that J.H. Huang admitted that mismanagement of the Fermi development process was truly the problem that led to all the issues pre-release, this is a crucial point in time for nV right now, from an investor perspective. Huang screwed up and admitted as much, and investors will be paying very close attention to nV's actual business success this time, rather than product performance.

Really, while people like us are the common end users of nV's products, Fermi was a very poor showing of business, and any issues must be fixed, controlled, and turned around in a very efficient manner, or investors, who truly motivate nV as a company, will pull out even more than they have over the past 18 months.

"Shit happens" isn't gonna fly this time. Huang said basically the same thing, so that excuse has dried up at the well, and they filled in the hole, too.

OMG the commas kill me.


----------



## CDdude55 (Oct 15, 2010)

cadaveca said:


> Considering that J.H. Huang admitted that mismanagement of the Fermi development process was truly the problem that led to all the issues pre-release, this is a crucial point in time for nV right now, from an investor perspective. Huang screwed up and admitted as much, and investors will be paying very close attention to nV's actual business success this time, rather than product performance.
> 
> Really, while people like us are the common end users of nV's products, Fermi was a very poor showing of business, and any issues must be fixed, controlled, and turned around in a very efficient manner, or investors, who truly motivate nV as a company, will pull out even more than they have over the past 18 months.
> 
> ...



Yes, I agree. I'm not saying it's OK to screw up multiple times in a row.


----------



## Benetanegia (Oct 16, 2010)

cadaveca said:


> Considering that J.H. Huang admitted that mismanagement of the Fermi development process was truly the problem that led to all the issues pre-release, this is a crucial point in time for nV right now, from an investor perspective. Huang screwed up and admitted as much, and investors will be paying very close attention to nV's actual business success this time, rather than product performance.
> 
> Really, while people like us are the common end users of nV's products, Fermi was a very poor showing of business, and any issues must be fixed, controlled, and turned around in a very efficient manner, or investors, who truly motivate nV as a company, will pull out even more than they have over the past 18 months.
> 
> ...



I think that investors, at least the clever ones, should be happy. Fermi had one goal: to open up a new and profitable market while staying as competitive as possible in the consumer space. Well, there IS a new market that Fermi has opened up, and honestly it's doing quite well. All the big names making servers have at least one product incorporating Fermi cards, and that says a lot***.

There is also real interest in the product and its capabilities. If anyone watched GTC, there are a lot of actual companies using Fermi for amazing things, and you can see they are getting tangible benefits from it****.

So I think that's something that can only grow, at least for the next 2-5 years, and it's a good way to become relevant in something beyond the volatile gaming crowd. I think that's something investors care about, so they should be at least semi-happy. 

I say semi-happy because, yes, it could have been better, but it's been good so far. Also, the new Quadro cards kick the FirePro cards in the arse big time (50%-100% faster), and the professional GPU market has been the only segment of the semiconductor market to see an increase since the recession, I believe, so any superiority in that market should put at least half a smile on investors' faces IMO.


Reminder for fanboys:

*** It took Intel and AMD, I think, 15 years of attempts and 3-4 different chips to enter this same market, so think twice before underestimating this achievement.

**** We are talking about CEOs of companies with really big names, whose revenues are 10 to 50 times higher than Nvidia's and who would eat them for lunch if they were hungry. So please, Ati fanboys and general trolls, refrain from the typical "he's been paid to say that" that seems so unavoidable for some of you whenever a developer of any kind praises anything Nvidia does.


----------



## Wiselnvestor (Oct 16, 2010)

I think that investors, at least the clever ones, should GTFO while you still can.

I can smell lowered 4Q revenue coming.


----------



## cadaveca (Oct 16, 2010)

Wiselnvestor said:


> I think that investors, at least the clever ones, should GTFO while you still can.
> 
> I can smell lowered 4Q revenue coming.





AMD's release is critical. WTF, why do you think there's no real concrete info out yet, so close to release?




As if there is any other reason.


----------



## lism (Oct 16, 2010)

This one is for Ati.  Much better performance, TDPs, and extras; Nvidia is just re-releasing their 490 and rebranding it as the 580.


----------



## fullinfusion (Oct 16, 2010)

cadaveca said:


> AMD's release is critical. WTF, why do you think there's no real concrete info out yet, so close to release?
> 
> 
> 
> ...


Cad, with all your problems, I believe this GPU is going to be the card that stops you from having all those issues... 

I can't say much... I already have, but like I said 2 yrs ago, I have blood working in Toronto.

Troy tells me there aren't going to be driver issues like the 5 series has had... that's all


----------



## Benetanegia (Oct 16, 2010)

Wiselnvestor said:


> I think that investors, at least the clever ones, should GTFO while you still can.
> 
> I can smell lowered 4Q revenue coming.



Aaah, how many clueless AMD investors said the same 2 years ago and sold their shares for $2-4... and how easy the path was made for the clever ones, who bought more AMD shares instead and doubled their investment in just 6 months...


----------



## fullinfusion (Oct 16, 2010)

Benetanegia said:


> Aaah, how many clueless AMD investors said the same 2 years ago and sold their shares for $2-4... and how easy the path was made for the clever ones, who bought more AMD shares instead and doubled their investment in just 6 months...


Really? Are you one of them? Who cares about stock? Do you buy AMD hardware because of the stock prices? I doubt it... you buy because of the results!  

Wow


----------



## Benetanegia (Oct 16, 2010)

fullinfusion said:


> Really? Are you one of them? Who cares about stock? Do you buy AMD hardware because of the stock prices? I doubt it... you buy because of the results!
> 
> Wow



This has nothing to do with hardware. Read above: they are saying that everything about Fermi was bad for investors. I'm saying that not everything was bad for the clever ones, because there are good prospects for the near future. Clever investors know that Fermi being late is going to make shares fall; they know better than anyone that their shares will probably depreciate over the next couple of quarters, but they also know that instead of selling to try to cut losses, it's better to buy from those who are peeing themselves and wait for the profits in the not-so-distant future.

One thing is clear: the most clever investors (though of a different type) sold their Nvidia shares MONTHS ago, just after the shares topped out and started falling. Anyone who still has Nvidia shares and sells them now is plainly stupid. Period.


----------



## erocker (Oct 16, 2010)

Benetanegia said:


> Aaah, how many clueless AMD investors said the same 2 years ago and sold their shares for $2-4... and how easy the path was made for the clever ones, who bought more AMD shares instead and doubled their investment in just 6 months...



Lol. I bought my shares at $11 a pop. Then Phenom came out.... I just don't understand some of these fools who sell so low. If you have stock and it drops that low, it should then be considered a (very) long-term investment.


----------



## Benetanegia (Oct 16, 2010)

erocker said:


> Lol. I bought my shares at $11 a pop. Then Phenom came out.... I just don't understand some of these fools who sell so low. If you have stock and it drops that low, it should then be considered a (very) long-term investment.



Exactly. Either you sell as soon as they start to fall, or you wait. Selling while they're falling is the most stupid thing you can do.


----------



## AsRock (Oct 16, 2010)

CDdude55 said:


> As I have always said, TPU is generally biased towards ATI/AMD cards. People can disagree, but I've been here for around 3 years and have seen this crap run rampant for a while now, unfortunately..
> 
> As stated above, how about we wait until the card at least has its specs confirmed and we have some official announcements from Nvidia themselves? But something tells me this type of irrational thinking will just happen again in those threads too..



This shit happens all the time, and fans on both sides, ATI or Nvidia, are just as bad.  It's a way to make people wait until both companies have their shit sorted, so there's a better chance of you buying the later card and not the first.


----------



## phanbuey (Oct 16, 2010)

Benetanegia said:


> Exactly. Either you sell as soon as they start to fall, or you wait. Selling while they're falling is the most stupid thing you can do.



Not really.  That's a common misconception, especially if there are better investments out there.  Just because a stock fell and you lost money on it, you shouldn't hold the shares in hopes of "making it up", especially if there is a better investment you could put your money into.  

A fallen stock is a sunk cost.  People want to hold on to stocks that fell in hopes of earning their money back; in this case it paid off, but in most cases it doesn't.


----------



## erocker (Oct 16, 2010)

phanbuey said:


> Not really. That's a common misconception, especially if there are better investments out there. Just because a stock fell and you lost money on it, you shouldn't hold the shares in hopes of "making it up", especially if there is a better investment you could put your money into.
> 
> A fallen stock is a sunk cost. People want to hold on to stocks that fell in hopes of earning their money back; in this case it paid off, but in most cases it doesn't.



Perhaps in a bull market where many stocks are doing well. Buying high and selling low in the current market is not wise; doing so would be trading one long-term investment for another. Though it does depend on how much loss you have and how much of a "sure deal" the alternative investment is. I feel sorry for those who bought AMD stock when it was over $20 a share.

I have lots of investments in stock. The mentality you describe is what a lot of recent "Wall Street losers" have followed. The parameters of logic in the market change; they are not constant, and going by one set of logic at a time when the market doesn't follow that logic is a bad move. When the market is flat like it is right now, one should concentrate on long-term and safe stocks. It's a bad time to sell like the rest of the sheeple have been doing; there is no quick money to be made right now.

In a short while it may be a good time to buy Nvidia stock.  Now is a great time to buy AMD stock.


----------



## Makaveli (Oct 16, 2010)

This thread was an interesting read even with all the BS and speculation.


----------



## Benetanegia (Oct 16, 2010)

phanbuey said:


> Not really. That's a common misconception, especially if there are better investments out there. Just because a stock fell and you lost money on it, you shouldn't hold the shares in hopes of "making it up", especially if there is a better investment you could put your money into.
> 
> A fallen stock is a sunk cost. People want to hold on to stocks that fell in hopes of earning their money back; in this case it paid off, but in most cases it doesn't.



Yes, if you have something that you know 100% for sure is going to make you money, then maybe you could jump to that thing. But that never happens (legally ), or only in very few cases, and most people who try end up falling into a loop: sell -> buy another thing -> sell when it falls (loss) -> buy another thing -> sell (loss), and so on.

EDIT: Or, in the best case, you end up like Erocker said, exchanging long term for long term. The reason the loop above happens is actually psychological. People who sell because shares have been falling for several months also typically only buy shares that have been rising for several months, so there's nothing to be made there; they are always 2 or 3 steps behind. Usually they buy or sell 2 months after the trend changes, so they lose either money or their time and effort.

In any case, I never said it's not best to sell when shares start falling, but after many months of falling shares the only smart thing is to hold on, again unless you are 100% sure you'll profit from another one, which is never the case. You should never put all your eggs in one basket anyway, so having some shares not profiting for a few months should never be a problem. If it is, well, you are a bad investor to begin with.

The best investors on the Forbes list almost ALWAYS sell when shares are still growing (just before the fall) and buy shares when they are low and still falling. Investing is all about having balls: balls to say, this is enough profit, time to invest in the next "sinking boat".


----------



## Imsochobo (Oct 16, 2010)

Benetanegia said:


> Aaah, how many clueless AMD investors said the same 2 years ago and sold their shares for $2-4... and how easy the path was made for the clever ones, who bought more AMD shares instead and doubled their investment in just 6 months...



I sold my Nvidia stock, went over to AMD at $2, and sold again at $8.


----------



## DaedalusHelios (Oct 16, 2010)

CDdude55 said:


> Hmm, it seems that when it's Nvidia news, everybody complains..:shadedshu
> 
> This is good news, and I hope for the best so we can get some solid competition going. Let's get some of these prices down!!!!



I agree.

AMD fanboys scream the loudest. I just want to see GPU tech progressing. TBH, architectural changes like we see in Fermi display much better innovation, especially for use in scientific research. I know most don't think of this, but curing diseases through GPU tech is much more important than FPS or power consumption. Nvidia is doing fine with FPS but needs to go a little greener if possible. I like both companies. Hate is never a good thing, guys.


----------



## wolf (Oct 16, 2010)

Personally, I can't wait till Nvidia's next GPUs are out and about.


----------



## bear jesus (Oct 16, 2010)

DaedalusHelios said:


> I like both companies. Hate is never a good thing guys.



I agree; even fanboys should like the other company, as it helps push their favourite company to make even better products.

For the past decade or more, since I stopped using 3dfx after my 3500-tv (yes, I was a bit of a 3dfx fanboy back then, but I was a teenage bear, give me a break ), I have been constantly swapping between Nvidia and ATI depending on what they can give me for the money I can spend. Everyone should love both companies, because even if they both do things that annoy people (renaming, rebranding, failing with software now and then, etc.), in the end we win: we keep getting insane amounts more compute power each generation recently, and is that not what we all want?

I'm always looking forward to every new release. Right now I am excited about the next year: Kepler and Southern Islands, Bulldozer and Sandy Bridge. I will never really care who makes them; it's all about what they can do. I just wish everyone else could think in a similar way (yes, I know things like support and drivers affect why people like specific companies, but those are things that constantly change with new product cycles).

*end early-morning, coffee-lacking rant*


----------



## mdsx1950 (Oct 16, 2010)

Not everyone who dislikes nVidia's decisions is a fanboy. I personally have lost hope of getting a powerhouse nVidia card, as their GTX 480 was not up to my standard, and they never did release any other cards (GTX 485, etc.). The GTX 4xx series also took way too long to come out. So instead I went and bought a GTX 260 for PhysX. nVidia was awesome. Not anymore, IMO.


----------



## bear jesus (Oct 16, 2010)

mdsx1950 said:


> Not everyone who dislikes nVidia's decisions is a fanboy. I personally have lost hope of getting a powerhouse nVidia card, as their GTX 480 was not up to my standard, and they never did release any other cards (GTX 485, etc.). The GTX 4xx series also took way too long to come out. So instead I went and bought a GTX 260 for PhysX. nVidia was awesome. Not anymore, IMO.



I can understand not being very impressed with them this round. Out of all the 4xx cards, the only one that's interested me is the 460, and going by your current and past rigs, a couple of 1 GB 460s would be a step backwards. But I'm sure that if they came out with a new card that thrashed AMD's and didn't suck up crazy amounts of power, you would happily get one?... or four


----------



## mdsx1950 (Oct 16, 2010)

bear jesus said:


> I can understand not being very impressed with them this round. Out of all the 4xx cards, the only one that's interested me is the 460, and going by your current and past rigs, a couple of 1 GB 460s would be a step backwards. But I'm sure that if they came out with a new card that thrashed AMD's and didn't suck up crazy amounts of power, you would happily get one?... or four



Well, before I got the GTX 260, I was waiting and waiting for their flagship card. Still no sign of it. :'( But yes, I would have happily got four  xD


----------



## bear jesus (Oct 16, 2010)

mdsx1950 said:


> Well, before I got the GTX 260, I was waiting and waiting for their flagship card. Still no sign of it. :'(



I know the feeling. Around this time last year I was expecting both ATI and Nvidia to release some awesome cards for me to choose between, but then Fermi got delayed, so I thought I would wait to see what they offered, and by the time I worked out that I would not be seeing a full-fat card, the rumors of the new ATI cards were coming out... I'm still waiting for my GPU upgrade after nearly a year , so right now I really don't care what Nvidia brings out in whatever time frame it's supposed to arrive, as I very much doubt it will be before the end of the year. I'm just going to see what's around by year's end and treat myself to 3 new monitors and a new GPU or 2... but I will always be jealous of your rig 

I don't expect Nvidia to bring out anything that awesome until they start on 28nm chips. I just hope they can release them close to AMD's 28nm GPUs, so that there is no stupidly long wait to see what both are like and make a good choice.


----------



## cheezburger (Oct 16, 2010)

It seems that spec would crush Cayman for sure. ROPs and bus width are far more important than most people think, and I say AMD will have to learn a hard lesson for being cheap. I mean, come on! 64 ROPs vs 48 or even 32 ROPs, and a 512-bit bus vs a narrow 256-bit bus or a crippled 384-bit bus? The GTX 580 will eat Cayman alive, and everything will go back to the good old 38xx days!


----------



## a_ump (Oct 16, 2010)

cheezburger said:


> It seems that spec would crush Cayman for sure. ROPs and bus width are far more important than most people think, and I say AMD will have to learn a hard lesson for being cheap. I mean, come on! 64 ROPs vs 48 or even 32 ROPs, and a 512-bit bus vs a narrow 256-bit bus or a crippled 384-bit bus? The GTX 580 will eat Cayman alive, and everything will go back to the good old 38xx days!



Haha, well, if those thoughts make you happy, sweet. But that's a statement based on flimsy rumors right now; sure, the GTX 580 will come eventually, but who knows when, and what it'll actually be. 

And in my honest opinion, your statement about AMD being cheap for using a 256-bit bus + 32 ROPs vs the GTX 480's 384-bit bus and 48 ROPs really only shows that AMD's design is more efficient. The GTX 480 has 50% more bus width and 50% more ROPs, not to mention transistor count and die size, yet the HD 5870 is only 11% slower overall. Let's not get started on the GTX 470, which has 25% more bus width and 25% more ROPs... and falls below HD 5870 performance. 

I must say I completely disagree with your logic. After all, when there are sources like this showing AMD's new 960-SP card beating AMD's 1440-SP HD 5850, and going almost neck and neck with the 1600-SP HD 5870... I really fail to see how you figure Nvidia is going to bounce back. 

This also means (if all the rumors of the past 2 months are true) that AMD has increased its shader efficiency by roughly 34%. So imagine the performance of one of AMD's new chips at 1440 SPs. It's possible for Nvidia to make a comeback (which they will, with time), but it won't be this gen IMHO


----------



## wolf (Oct 16, 2010)

a_ump said:


> ...And in my honest opinion, your statement about AMD being cheap for using a 256-bit bus + 32 ROPs vs the GTX 480's 384-bit bus and 48 ROPs really only shows that AMD's design is more efficient. The GTX 480 has 50% more bus width and 50% more ROPs, not to mention transistor count and die size, yet the HD 5870 is only 11% slower overall. Let's not get started on the GTX 470, which has 25% more bus width and 25% more ROPs... and falls below HD 5870 performance...



It's not quite as linear as you make it sound. The Fermi chips do carry more bus width and more ROPs, but at a decent whack slower clock speeds on core and memory than their AMD rivals, making the real difference smaller than 50% and 25% respectively.

I agree to an extent in terms of transistor count and die size, and it physically does have more grunt; ATi just plain did freakin' well with the 5000 series. They started off fast and just kept getting faster for the following 6 months or so with better and better drivers.
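For what it's worth, the clock-speed point is easy to sanity-check with a quick back-of-the-envelope script. The reference specs below (GTX 480: 384-bit bus at 3696 MT/s effective memory, 48 ROPs at a 700 MHz core; HD 5870: 256-bit bus at 4800 MT/s, 32 ROPs at 850 MHz) are the stock numbers as I recall them, so treat the exact figures as assumptions:

```python
# Back-of-the-envelope check: paper-spec advantages shrink once clocks are factored in.
# Assumed reference specs: GTX 480 = 384-bit bus, 3696 MT/s memory, 48 ROPs @ 700 MHz;
# HD 5870 = 256-bit bus, 4800 MT/s memory, 32 ROPs @ 850 MHz.

def bandwidth_gbs(bus_bits: int, mt_per_s: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfer rate."""
    return bus_bits / 8 * mt_per_s / 1000

gtx480_bw = bandwidth_gbs(384, 3696)  # ~177.4 GB/s
hd5870_bw = bandwidth_gbs(256, 4800)  # ~153.6 GB/s
print(f"bandwidth advantage: {gtx480_bw / hd5870_bw - 1:.1%}")  # ~15.5%, not 50%

# Peak pixel fill rate scales with ROP count times core clock (Mpixels/s).
gtx480_fill = 48 * 700
hd5870_fill = 32 * 850
print(f"fill-rate advantage: {gtx480_fill / hd5870_fill - 1:.1%}")  # ~23.5%, not 50%
```

Peak numbers aren't delivered performance, of course, but they show why the raw 50% bus-width figure overstates the real gap.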


----------



## a_ump (Oct 16, 2010)

wolf said:


> It's not quite as linear as you make it sound. The Fermi chips do carry more bus width and more ROPs, but at a decent whack slower clock speeds on core and memory than their AMD rivals, making the real difference smaller than 50% and 25% respectively.
> 
> I agree to an extent in terms of transistor count and die size, and it physically does have more grunt; ATi just plain did freakin' well with the 5000 series. They started off fast and just kept getting faster for the following 6 months or so with better and better drivers.



Oh, I fully realized that , lol. I was just trying, in a sense, to show what a moot point it was for cheezburger to call ATI cheap hardware-wise when they, as you stated, make up for it with better memory controllers and clock speeds. Plus, they're in the lead. And no pun intended, cheezburger


----------



## motasim (Oct 16, 2010)

cheezburger said:


> ... The GTX 580 will eat Cayman alive, and everything will go back to the good old 38xx days!



... I'm sorry to break it to you, mate, but it'll be some time before nVidia can produce a GPU that will "eat alive" ATI's top-of-the-line GPU ... I believe nVidia needs to minimize its losses for the time being until its 28nm chip sees the light of day, and at that point that chip must beat ATI's 28nm chip, or I believe Intel will finally get the chance to humiliate and acquire nVidia  ... I personally hope, for our sake as gamers, that nVidia recovers and beats ATI in the next generation of GPUs, but this definitely won't happen with the GTX 580 ...


----------



## Kantastic (Oct 16, 2010)

Kantastic will counter Nvidia's GTX 580 with his VE 9000+.


----------



## wolf (Oct 16, 2010)

Kantastic said:


> Kantastic will counter Nvidia's GTX 580 with his VE 9000+.



AMD I choose you! use a splash attack!


----------



## motasim (Oct 16, 2010)

Kantastic said:


> Kantastic will counter Nvidia's GTX 580 with his VE 9000+.



... I'll take two, and use them in cross-kantastic setup ...


----------



## DaedalusHelios (Oct 16, 2010)

cheezburger said:


> It seems that spec would crush Cayman for sure. ROPs and bus width are far more important than most people think, and I say AMD will have to learn a hard lesson for being cheap. I mean, come on! 64 ROPs vs 48 or even 32 ROPs, and a 512-bit bus vs a narrow 256-bit bus or a crippled 384-bit bus? The GTX 580 will eat Cayman alive, and everything will go back to the good old 38xx days!



I would think Nvidia will have the more powerful single GPU again, but will counter late, like before, probably with slightly lower power consumption than the GTX 480, or about the same. 

AMD will most likely get a little more energy efficient and deliver a little more performance than before at a lower price point, probably also featuring a 5770 rebadge with no 6-pin power requirement for the HTPC crowd.

Probably a repeat for Nvidia without being quite as late, and what we thought Fermi would be all along will be found in the GTX 580. AMD will most likely gain a little more market share on the GPU front and continue the trend of trailing Intel by a long shot in CPU tech and sales.


----------



## a_ump (Oct 16, 2010)

With how badly AMD is doing in the CPU market... it makes me wonder if they're purposely investing a good bit more resources in their GPU department, running with what they've got going well, and slowly catching up CPU-wise.


----------



## wolf (Oct 16, 2010)

I don't get why we are even hearing about this rather than some decent respins, namely ones focusing on the potent GF104 chip, like a GTX 475 and GTX 495.

A fully unlocked dual GF104 has the potential to be 50% faster than a GTX 480; it just seems logical to me to go down that path first and get a new GPU completely right.


----------



## Kantastic (Oct 16, 2010)

a_ump said:


> With how badly AMD is doing in the CPU market... it makes me wonder if they're purposely investing a good bit more resources in their GPU department, running with what they've got going well, and slowly catching up CPU-wise.



Probably has a little to do with the fact that they make more revenue in the GPU market, money they can put towards R&D for CPUs.


----------



## CDdude55 (Oct 16, 2010)

It would have been nice to see a dual-GPU Fermi card; like wolf said, dual GF104 chips would have been a sweet combination.


----------



## AKlass (Oct 16, 2010)

I'm guessing that to make the 580 chips, they're going to use the chips that were actually supposed to be the 480, and whatever chips are defective will get their shaders disabled and become the 480.


----------



## wahdangun (Oct 16, 2010)

Yeah, it's quite confusing. Why didn't Nvidia release a full GTX 460? I think the architecture can tolerate more heat for better performance.


----------



## phanbuey (Oct 17, 2010)

wahdangun said:


> Yeah, it's quite confusing. Why didn't Nvidia release a full GTX 460? I think the architecture can tolerate more heat for better performance.



DING! 


I think a dual-GPU full 460 will be competing with the HD 6900 on the high end; the 490 might be just that. I think the 580 is a silly rumor, but then again, who knows.


----------



## Black Panther (Oct 17, 2010)

CDdude55 said:


> It would have been nice to see a dual GPU Fermi card; like wolf said, dual GF104 chips would have been a sweet combination.



+1, I was expecting something to compete with the 5970. As things stand right now ATI's got a good setup with the dual GPUs on one card. I find myself always comparing against the 5970, 'forgetting' that it's not a single GPU when I do.


----------



## Fourstaff (Oct 17, 2010)

Let's get the nutters at Asus to get the dual GF104 out and running, then.


----------



## lism (Oct 17, 2010)

a_ump said:


> With how badly AMD is doing in the CPU market... makes me wonder if they're purposely investing a good bit more resources in their GPU department, running with what they've got going well, and slowly catching up CPU-wise.



This is going to change when their APU is in full production and on every motherboard out there meant for office use.


----------



## alexsubri (Oct 17, 2010)

GTX 580 = [image]

nuff' said


----------



## 3volvedcombat (Oct 17, 2010)

I'm not trying to be an Nvidia fanboy, but if I were Nvidia I wouldn't start a new series of cards. I would keep the GTX 400 name, with all that advertising and development behind it, and release the higher-end competition under GTX 400 series names. Then, like the rumors going around about Nvidia lowering the SKU prices of all their GPUs, make perfect competition for the ATI 6000 series.

WIN!

512-core GTX 485 on quick marketing, here we come!


----------



## cheezburger (Oct 17, 2010)

a_ump said:


> Haha, well if those thoughts make you happy, sweet. But this is a statement based on flimsy rumors right now; sure, the GTX 580 will come eventually, but who knows when and what it'll actually be.
> 
> And in my honest opinion, your statement that AMD are cheap-asses for using a 256-bit bus + 32 ROPs vs the GTX 480's 384-bit bus and 48 ROPs really only shows that AMD's design is more efficient. The GTX 480 has 50% more bus width and 50% more ROPs, not to mention transistor count and die size, yet the HD 5870 is only 11% slower overall. Let's not get started on the GTX 470, which has 25% more bus width and 25% more ROPs... and falls below HD 5870 performance.
> 
> ...



The 960sp part was a prototype; the real tape-out is 1280sp.


Shaders don't directly affect frame rate, and they aren't the fill unit. A GPU with 10,000 shaders but just 16 ROPs will most definitely be stuck at a low maximum frame rate, while one with 128 shaders and 64 ROPs will have 4x the theoretical frame rate at the same clock. More or stronger shaders only help with graphical quality and heavy detail loads. The reason I criticize them for being cheap is that they sell a low-spec GPU for a higher price, and all they do is clock it up, stuff in more shaders, and slap on a new price tag... it doesn't feel worth spending $300+ on something you can just BIOS-flash and overclock at home, while Nvidia offers a richer feature set and spec (CUDA/PhysX/3D Vision/tessellation, 40/48 ROPs, and a wider bus). At the same price point the GTX 470 offers far better performance and features than Cypress can, but costs the same! The only criticism is the power consumption, but who cares? I want my money spent the right way before I worry about this dirty, ugly planet. Want me to buy an AMD card to save the planet? I'd rather watch the planet die than pay for an overpriced, low-spec card like this.

Unless AMD can bring out a seriously specced card at a decent price, I won't spend a penny just for the sake of power efficiency and saving the goddamn planet...
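The ROP arithmetic in the post above can be sketched like this (illustrative, hypothetical numbers; this computes theoretical fill rate only, assuming one pixel per ROP per clock, and ignores everything else that limits real frame rates):

```python
# Theoretical pixel fill rate: ROPs x core clock. A deliberate simplification;
# real frame rates also depend on shaders, memory bandwidth, and the workload.
def fill_rate_gpix(rops, clock_mhz):
    """Gigapixels per second, assuming one pixel per ROP per clock."""
    return rops * clock_mhz / 1000.0

# 16 ROPs vs 64 ROPs at the same hypothetical 700 MHz core clock:
low = fill_rate_gpix(16, 700)    # 11.2 GPix/s
high = fill_rate_gpix(64, 700)   # 44.8 GPix/s
print(high / low)                # 4x the theoretical fill rate at the same clock
```

This only backs up the narrow "4x theoretical frame rate" ratio claimed in the post; whether a real game is ROP-bound is a separate question.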


----------



## bear jesus (Oct 17, 2010)

cheezburger said:


> the only criticism is the power consumption but WHO CARE? i just want my money to be spend in the right way then worry about this dirty ugly planet.




Screw the planet, I just don't want my card choice to cost me an extra $300 to power over its lifetime, nearly doubling its cost to me, and of course I don't want something that will cook me alive in my tiny closet of a computer room


----------



## dir_d (Oct 17, 2010)

That is your opinion, and others probably care about the things you do not. You've got to remember that overclocking is a bonus, not something vendors are required to provide. For the average user, which is the majority of the market, AMD is meeting demands and succeeding.


----------



## CDdude55 (Oct 17, 2010)

bear jesus said:


> Screw the planet, I just don't want my card choice to cost me an extra $300 to power over its lifetime, nearly doubling its cost to me, and of course I don't want something that will cook me alive in my tiny closet of a computer room



But if you have enough to afford the card, wouldn't you have enough to at least power and cool the damn thing? lol

That's what I have yet to understand: these are top-tier cards you're paying for. Of course that doesn't excuse inefficiency, but if you're considering a card that costs $350 and up, at least pay for a sufficient amount of power and cooling for it.

On a different note, AMD has definitely got a good performance-per-watt thing going with the 5 series, and I hope it continues to get even better (and I hope Nvidia follows suit in that regard).


----------



## bear jesus (Oct 17, 2010)

CDdude55 said:


> But if you have enough to afford the card, wouldn't you have enough to at least power and cool the damn thing? lol
> 
> That's what I have yet to understand: these are top-tier cards you're paying for. Of course that doesn't excuse inefficiency, but if you're considering a card that costs $350 and up, at least pay for a sufficient amount of power and cooling for it.
> 
> On a different note, AMD has definitely got a good performance-per-watt thing going with the 5 series, and I hope it continues to get even better (and I hope Nvidia follows suit in that regard).



It's not that I can't afford it... I just don't like paying for it. The lower my bills, the more money I have to spend on other things, and if my GPU choice can save me nearly $150 in a year, that's probably 8 or 10 games on sale on Steam (my PC is on 24/7, so power adds up pretty quickly at $0.22 per kWh).

But I do see your point. Cooling is a pain in the ass, though: no air con, and it's a tiny room with no window (don't ask), so heat output means a lot to me.
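The running-cost math here checks out as a rough sketch (the 75 W figure is an assumed example of the draw difference between two cards; the $0.22/kWh rate is the one quoted above):

```python
# Yearly electricity cost of a GPU's extra power draw, assuming a constant
# wattage difference and a PC that runs 24/7 (a deliberate simplification).
def yearly_cost(extra_watts, price_per_kwh, hours=24 * 365):
    kwh = extra_watts * hours / 1000.0  # watt-hours -> kilowatt-hours
    return kwh * price_per_kwh

# A card drawing ~75 W more than an alternative, at $0.22/kWh:
print(round(yearly_cost(75, 0.22), 2))  # 144.54, roughly the $150/year above
```

Real savings would be lower if the machine idles most of the day, since the idle draw gap between cards is much smaller than the load gap.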


----------



## CDdude55 (Oct 17, 2010)

bear jesus said:


> It's not that i can't afford it... i just don't like paying for it
> the lower my bills the more money i have to spend on other things, and if my gpu choice can save me near $150 in a year thats proberley 8 or 10 games that are on sale on steam (my pc is on 24/7 so power adds up pretty quick at $0.22 per kw/h).
> 
> But i do see your point, but cooling is a pain in the ass as no air con and its a tiny room with no window  don't ask  so heat output means a lot to me.



That's a good point...

Depends a lot on the circumstance I'd expect.


----------



## Benetanegia (Oct 17, 2010)

bear jesus said:


> (my pc is on 24/7 so power adds up pretty quick at $0.22 per kw/h).



24/7, but doing what? A GTX 470 and HD 5870 consume about the same at idle, and a GTX 460 consumes far less than both. Screw the power consumption shown in reviews, because the power consumption of HD 5000 cards is not that anymore. Remember that idle clocks had to be increased in order to avoid all the issues, and idle consumption has increased accordingly.

Idle consumption is similar in both cases, and it's only at full load that a GTX 470 consumes so much more, full load meaning Furmark-type full load, not Crysis-type full load.


----------



## CDdude55 (Oct 17, 2010)

Benetanegia said:


> 24/7, but doing what? A GTX 470 and HD 5870 consume about the same at idle, and a GTX 460 consumes far less than both. Screw the power consumption shown in reviews, because the power consumption of HD 5000 cards is not that anymore. Remember that idle clocks had to be increased in order to avoid all the issues, and idle consumption has increased accordingly.
> 
> Idle consumption is similar in both cases, and it's only at full load that a GTX 470 consumes so much more, full load meaning Furmark-type full load, not Crysis-type full load.



This is true..


----------



## bear jesus (Oct 17, 2010)

Benetanegia said:


> 24/7, but doing what? A GTX 470 and HD 5870 consume about the same at idle, and a GTX 460 consumes far less than both. Screw the power consumption shown in reviews, because the power consumption of HD 5000 cards is not that anymore. Remember that idle clocks had to be increased in order to avoid all the issues, and idle consumption has increased accordingly.
> 
> Idle consumption is similar in both cases, and it's only at full load that a GTX 470 consumes so much more, full load meaning Furmark-type full load, not Crysis-type full load.



It varies a lot. Sometimes I fold, but with ATI's performance I'm not folding much now until I get an Nvidia card in my main rig or HTPC. I do some CPU folding as well, but not 24/7, as I stop folding to game most of the time; I can spend up to 6 or 8 hours in-game in a day. At times when I know I won't be doing much, I clock my 4870 down to 500 MHz core and 500 MHz memory and drop my CPU down to one core at 1.8 GHz; then again, if the extra speed helps, I bump my GPU voltage up and clock to 870 MHz core and 1100 MHz RAM (4.4 GHz effective).

My power usage varies a lot from week to week depending on settings and usage, even more so because sometimes I push everything to the max and then forget once I'm in Windows, causing a much higher idle draw than there should be.


----------



## cheezburger (Oct 17, 2010)

bear jesus said:


> Screw the planet, i just don't want my card choice to cost me an extra $300 more to power over its lifetime thus near doubling its cost to me and of corse i don't want something that will cook me alive in my tiny closet like computer room



Get proper cooling so you won't cook yourself alive. Electric bill? Work more or ask your boss for a raise, or else find a new job and earn more money so you can pay the additional power bill.

Problem solved!


----------



## CDdude55 (Oct 17, 2010)

cheezburger said:


> Get proper cooling so you won't cook yourself alive. Electric bill? Work more or ask your boss for a raise, or else find a new job and earn more money so you can pay the additional power bill.
> 
> Problem solved!



That's a bit much to go through just to power your system.

If he can already pay for what he has now but doesn't want to pay extra by going with GF100, then I think it's a smart decision to pick something with less power consumption for the sake of saving money.


----------



## Kantastic (Oct 17, 2010)

cheezburger said:


> Get proper cooling so you won't cook yourself alive. Electric bill? Work more or ask your boss for a raise, or else find a new job and earn more money so you can pay the additional power bill.
> 
> Problem solved!



So he gets "proper cooling" and drops the temps of his GPU, does that somehow lower the amount of heat being dissipated into his room? 

Work more, ask for a raise, or find a new job? You make it sound so easy, might as well have him rub a lamp and ask the genie for a billion dollars.


----------



## DaedalusHelios (Oct 17, 2010)

Kantastic said:


> So he gets "proper cooling" and drops the temps of his GPU, does that somehow lower the amount of heat being dissipated into his room?
> 
> Work more, ask for a raise, or find a new job? *You make it sound so easy, might as well have him rub a lamp and ask the genie for a billion dollars*.



Computers don't cost that much. 

We are most likely talking about unemployed people or college kids who cannot afford a decent computer even when built from parts on the cheap. Hell, BSTs let you get a good amount of bang per $ (or respective currency). 

The room without a window part sounds kind of funny though.


----------



## crow1001 (Oct 18, 2010)

LOL some speculative info on Nvidia cards gets a lot more interest than news on the new AMD cards.


----------



## bear jesus (Oct 18, 2010)

DaedalusHelios said:


> The room without a window part sounds kind of funny though.



Haha, it's not funny in the middle of summer; with just a Phenom 965 and an ATI 4870 I have to use a ducting fan to bring cool air into the room to avoid baking to death. An i7 980X and 4-way 480 SLI would be suicide.

I have to laugh at the "get proper cooling" thing; I live in the UK, where it really doesn't get hot enough to need any form of cooling... unless you count crazy people locking themselves in tiny rooms with electronics.

Really, I assume many people here would agree: if your power bill is already $2100 a year, you don't really want to pay even more when you could be using that money for more fun things.



cheezburger said:


> Work more or ask your boss for a raise, or else find a new job and earn more money so you can pay the additional power bill.
> 
> Problem solved!



I think I'm quite happy with my current job, which gives me more freedom than most people I've ever met and more money than every last one of my friends. Also, no boss to ask for a raise.


----------



## CDdude55 (Oct 18, 2010)

crow1001 said:


> LOL some speculative info on Nvidia cards gets a lot more *negative* interest than news on the new AMD cards.



fixed.. lol


----------



## bear jesus (Oct 18, 2010)

CDdude55 said:


> fixed.. lol




Nvidia needs to make another 8800, as in a card that thrashes anything by AMD, for people to stop complaining. Most people loved Nvidia during the 8800 era... well, except die-hard ATI fanboys lol


----------



## CDdude55 (Oct 18, 2010)

bear jesus said:


> Nvidia needs to make another 8800, as in a card that thrashes anything by AMD, for people to stop complaining. Most people loved Nvidia during the 8800 era... well, except die-hard ATI fanboys lol



Ya, the 8800s were really a shining series for Nvidia. Even today an 8800 GTX can play most games at medium spec and a decent res, largely because most devs are unfortunately making crappy console-ported games :shadedshu, but ya, it was an awesome line of cards.

People can criticize Fermi in many regards, but it gets taken too far when a thread on a new lineup of cards that aren't even officially announced yet (hell, we don't even know if it's going to be called the ''580'' for sure) still has people complaining and whining over speculative rumors.

I just hope AMD and Nvidia bring some awesome cards to market at decent prices; that's all I want.


----------



## bear jesus (Oct 18, 2010)

CDdude55 said:


> Ya, the 8800s were really a shining series for Nvidia. Even today an 8800 GTX can play most games at medium spec and a decent res, largely because most devs are unfortunately making crappy console-ported games :shadedshu, but ya, it was an awesome line of cards.
> 
> People can criticize Fermi in many regards, but it gets taken too far when a thread on a new lineup of cards that aren't even officially announced yet (hell, we don't even know if it's going to be called the ''580'' for sure) still has people complaining and whining over speculative rumors.
> 
> I just hope AMD and Nvidia bring some awesome cards to market at decent prices; that's all I want.



I think people somehow assume that because Fermi (mainly GF100) was far from perfect, the next cards will be worse; really, that just defies logic. After all the bad press, I'm sure Nvidia has been working hard on making the next high-end cards as good as they possibly can.

I'm with you on wanting both companies to bring some awesome, well-priced cards to the market. I really think the 28nm chips are where we will see something special from both companies; anything new before then just feels like filling the gap until the 28nm chips can be made.


----------



## jasper1605 (Oct 18, 2010)

Kantastic said:


> might as well have him rub a lamp and ask the genie for a billion dollars.



It doesn't work. I've tried.


----------



## nt300 (Oct 18, 2010)

Nvidia plans on cooking burgers with the new card. If it's anything like the Fermi design, they're in big trouble unless they fix it. It will take at least 2 years to fix the issues Fermi has. Good luck NV, we need you for good competition to help drive prices down. Hurry, I want a new XFX, yes, the one that doesn't make GeForce anymore, just ATI Radeon


----------



## bear jesus (Oct 18, 2010)

nt300 said:


> Nvidia plans on cooking burgers with the new card. If it's anything like the Fermi design, they're in big trouble unless they fix it. It will take at least 2 years to fix the issues Fermi has. Good luck NV, we need you for good competition to help drive prices down. Hurry, I want a new XFX, yes, the one that doesn't make GeForce anymore, just ATI Radeon



Maybe if the new card is powerful enough, this kind of thing will work out safe to eat 

*edit*
Red_Machine is right, 2 years is a little over the top. They fixed some of the problems when making the GF104, so not even a year for some issues, and by late next year they will be using 28nm chips.


----------



## Red_Machine (Oct 18, 2010)

nt300 said:


> It will take at least 2 years to fix the issues Fermi has.



No it won't.  They'll be fine once they drop down to 28nm, which is slated for late next year.


----------



## Mr McC (Oct 18, 2010)

crow1001 said:


> LOL some speculative info on Nvidia cards gets a lot more interest than news on the new AMD cards.





CDdude55 said:


> fixed.. lol



I think crow has a point: every time I've checked in to the site over the last couple of days, this thread has been at the top of the list and many of the comments are by no means negative towards Nvidia, indeed, a suspicious mind might see a design to ensure that the thread is constantly bumped to the top. The articles cited at the start are simply marketing, not directly released by Nvidia, specifically designed to draw attention away from ATI and, given the response here, they seem to be working. This thread is mainly characterised by wishful thinking and whilst I think speculation is fine and inevitable, I fail to see how a couple of articles about an entirely theoretical future card can provoke more interest than AMD's (almost said ATI) imminent release of the 6000 series, but that's just me. 

Incidentally, I have inadvertently bumped the thread again


----------



## bear jesus (Oct 18, 2010)

Mr McC said:


> I think crow has a point: every time I've checked in to the site over the last couple of days, this thread has been at the top of the list and many of the comments are by no means negative towards Nvidia, indeed, a suspicious mind might see a design to ensure that the thread is constantly bumped to the top. The articles cited at the start are simply marketing, not directly released by Nvidia, specifically designed to draw attention away from ATI and, given the response here, they seem to be working. This thread is mainly characterised by wishful thinking and whilst I think speculation is fine and inevitable, I fail to see how a couple of articles about an entirely theoretical future card can provoke more interest than AMD's (almost said ATI) imminent release of the 6000 series, but that's just me.
> 
> Incidentally, I have inadvertently bumped the thread again



You are falling into the Nvidia marketing department's evil trap. Really, all this probably is, is a well-timed exercise in marketing. I was willing to wait for Fermi after the 5870's release, so it worked on me last time... though it failed in a way, as I just ended up not buying anything for nearly a year.

This time I'm interested in what Nvidia is working on, but I'm almost sure 460 SLI is the only Nvidia option for me, so once the 6970 is out I will pull the trigger on whatever I feel suits me best.


----------

