# Dual GeForce GTX 260 to be Officially Named GeForce GTX 295



## btarunr (Dec 2, 2008)

NVIDIA will be giving its flagship consumer graphics processor, the G200, a refresh on the newer 55nm silicon fabrication process. With it, the company plans to carve out new SKUs that take advantage of the updated core's improved thermal and electrical properties. In the pipeline is a dual-GPU card based on two GeForce GTX 260 GPUs.

Expreview learned that the new graphics card is to be named GeForce GTX 295. NVIDIA is creating the card to regain the performance crown from the ATI Radeon HD 4870 X2, the fastest single graphics card on the market. The card will sport two G200b cores in the 216 SP configuration, although not much is known about the memory configuration and clock speeds at this point. The card has already passed the design phase and is awaiting trial production and testing. It is expected to be released in January 2009.

*View at TechPowerUp Main Site*


----------



## Tatty_One (Dec 2, 2008)

Jan 2009 seems a very optimistic release date if they are only now looking at trial production.


----------



## Binge (Dec 2, 2008)

That card is going to pwn hard.  Hopefully they'll give it a nice beefy cooler, but none of that triple-slot BS.


----------



## smartali89 (Dec 2, 2008)

Hopefully this will bring graphics card prices down further.


----------



## newbielives (Dec 2, 2008)

I can finally play pong at 10,000 fps


----------



## newtekie1 (Dec 2, 2008)

Only 216 SPs per core?  Seems to me like nVidia is holding back, or they couldn't manage to cool the thing with the full 240.


----------



## spearman914 (Dec 2, 2008)

newtekie1 said:


> Only 216 SPs per core?  Seems to me like nVidia is holding back, or they couldn't manage to cool the thing with the full 240.



What do you expect? It's the same core technology.


----------



## SystemViper (Dec 2, 2008)

Got to love the GPU wars, lets hope for lower prices, so i can get my second GTX280


----------



## SteelSix (Dec 2, 2008)

spearman914 said:


> What do you expect? It's the same core technology.



He was probably thinking: why not a 240 SP dual card? Could be heat; more likely it's all that's needed to beat the X2, leaving more breathing room for the next release. Can't keep playing that game though, nV. ATI caught you with your pants down...


----------



## Analog_Manner (Dec 2, 2008)

By calling it the GTX 295, should we expect it to be 5.3% faster than a GTX 280? _Or_ about as good as two GTX 260's in SLI?


----------



## spearman914 (Dec 2, 2008)

Analog_Manner said:


> By calling it the GTX 295, should we expect it to be 5.3% faster than a GTX 280? _Or_ about as good as two GTX 260's in SLI?



2 GTX 260 will pwn the 295 itself actually. And for the math, we need some benchies first.


----------



## CStylen (Dec 2, 2008)

spearman914 said:


> 2 GTX 260 will pwn the 295 itself actually. And for the math, we need some benchies first.



What do you base that on?  The 295 will basically include two 260 cores with a die shrink, which could lead to higher clocks and/or less heat.  Theoretically, the 295 will "pwn" two 260's in SLI any day...

My very uneducated guess is that it will be slightly faster than a 4870x2 overall, by 1-3%...


----------



## WhiteLotus (Dec 2, 2008)

Let's all just wait and see. But if it reduces the prices on the current 260's, then whoop whoop!


----------



## TooFast (Dec 2, 2008)

I'm sure by then ATI will have a refresh of the 4870 X2.


----------



## mdm-adph (Dec 2, 2008)

I must applaud Nvidia for picking a name that's not stupid.  I still wish ATI would've called the 4870X2 the "4880" or something -- sounds more professional.


----------



## EarlZ (Dec 2, 2008)

And the usual question: will it have more or less micro-stuttering than the 4870 X2?


----------



## Darkrealms (Dec 2, 2008)

I think if they do at least a decent job with it, it should beat the 4870x2 by a decent margin.  Again, this is if they do at least a decent job with their drivers.  Although the 9800GX2 is still doing fairly well, so maybe . . .

I'm all for it, it brings my GTX260 prices down : )

The only real mistake I can see (other than drivers) will be price.  If they put this card over $500 it won't do well at all.  If they don't have it down below $450 within 2-3 months after launch, I think people will be mad.

Cooling, Nvidia has done alright with.
OCing, Nvidia has done well with.
One GTX260 is still a very formidable card, and prices aren't bad right now (not great, but not bad).


----------



## Tatty_One (Dec 2, 2008)

Darkrealms said:


> I think if they do at least a decent job with it, it should beat the 4870x2 by a decent margin.  Again, this is if they do at least a decent job with their drivers.  Although the 9800GX2 is still doing fairly well, so maybe . . .
> 
> I'm all for it, it brings my GTX260 prices down : )
> 
> ...



Agreed, I think it has to be realistically priced in comparison to the 4870x2..... i.e. 15% better performance, no more than 10% more expensive, as an example......... although I very much doubt that!


----------



## newtekie1 (Dec 2, 2008)

It is most likely going to be the top performer, so we all know it is going to be way overpriced, why even hope it isn't?  I just hope they don't overprice it as badly as ATI  has overpriced the HD4870x2.


----------



## SteelSix (Dec 2, 2008)

Scaling will be what makes one better than the other by decent margins. Still debated by some, but it's evident which offers better scaling...


----------



## Tatty_One (Dec 3, 2008)

SteelSix said:


> Scaling will be what makes one better than the other by decent margins. Still debated by some, but it's evident which offers better scaling...



It is, but that still didn't stop the GX2 topping the 3870x2. That was not scaling, it was GPU power, but hey, things change.


----------



## Solaris17 (Dec 3, 2008)

btarunr said:


> NVIDIA will be giving its flagship consumer graphics processor, the G200, a refresh on the newer 55nm silicon fabrication process. With it, the company plans to carve out new SKUs that take advantage of the updated core's improved thermal and electrical properties. In the pipeline is a dual-GPU card based on two GeForce GTX 260 GPUs.
> 
> Expreview learned that the new graphics card is to be named GeForce GTX 295. NVIDIA is creating the card to regain the performance crown from the ATI Radeon HD 4870 X2, the fastest single graphics card on the market. The card will sport two G200b cores in the 216 SP configuration, although not much is known about the memory configuration and clock speeds at this point. The card has already passed the design phase and is awaiting trial production and testing. It is expected to be released in January 2009.
> 
> Source: Expreview




Sandbagging.


----------



## zithe (Dec 3, 2008)

GTX 295? Does this mean there's a GTX 290 in development? D:


----------



## DarkMatter (Dec 3, 2008)

Of course: the 55nm GTX 280.


----------



## lilkiduno (Dec 3, 2008)

Wow, I would love to see prices fall, because when that does happen the 9800 GTX+'s will be even cheaper and I'll probably finally tri-SLI my system. That would be nice. If anything, it would knock down the price of the basic 260 to a point where I can use my EVGA Step-Up program!!!! So rock out to the release of this card!


----------



## PCpraiser100 (Dec 3, 2008)

lilkiduno said:


> Wow, I would love to see prices fall, because when that does happen the 9800 GTX+'s will be even cheaper and I'll probably finally tri-SLI my system. That would be nice. If anything, it would knock down the price of the basic 260 to a point where I can use my EVGA Step-Up program!!!! So rock out to the release of this card!



If so, how are you going to take care of those upcoming OpenCL applications? Consider a new platform.


----------



## Zubasa (Dec 3, 2008)

newtekie1 said:


> It is most likely going to be the top performer, so we all know it is going to be way overpriced, why even hope it isn't?  I just hope they don't overprice it as badly as ATI  has overpriced the HD4870x2.


Given how overpriced the GTX 280 was... :shadedshu It doesn't look good.
And the 8800 Ultra... oh snap.


----------



## lilkiduno (Dec 3, 2008)

PCpraiser100 said:


> If so, how are you going to take care of those upcoming OpenCL applications? Consider a new platform.



eh???
I'm a noob... and really don't know much about the apps I run. I just like playing my games.


----------



## phanbuey (Dec 3, 2008)

PCpraiser100 said:


> If so, how are you going to take care of those upcoming OpenCL applications? Consider a new platform.



nvidia is compatible with OpenCL... as far as I have read, anyway.  

This will be a monster card... I'm guessing around the $360-$400 range when it does come out.  The 8800 Ultra and GTX 280 came out when there was no competition.  Thank god for ATI (that's why my 260's were so cheap).


----------



## DrunkenMafia (Dec 3, 2008)

I wonder if you can tri SLI these!!!


----------



## lilkiduno (Dec 3, 2008)

DrunkenMafia said:


> I wonder if you can tri SLI these!!!



talk about having the SICKEST set-up EVER!


----------



## OzzmanFloyd120 (Dec 3, 2008)

I doubt you would see enough gain to justify the price in any way.


----------



## lilkiduno (Dec 3, 2008)

No, I doubt it, but then again, people buy shit when they have the money, just because they can!


----------



## OzzmanFloyd120 (Dec 3, 2008)

Nobody has money right now though, did you forget about the economy?


----------



## 3870x2 (Dec 3, 2008)

I have a feeling it's not going to perform much better than the 9800GX2, but we'll see.


----------



## OzzmanFloyd120 (Dec 3, 2008)

3870x2 said:


> I have a feeling it's not going to perform much better than the 9800GX2, but we'll see.



I agree, it would be interesting though to see someone bench two GTX260s in SLi vs a GX2 or 9800GTX/+s in SLi so we can get a sneak preview.
W1zz?


----------



## lilkiduno (Dec 3, 2008)

OzzmanFloyd120 said:


> Nobody has money right now though, did you forget about the economy?



No, I didn't forget about the economy... I also didn't forget about the CEOs of the large corporations walking away with millions as their companies went bankrupt, either.


----------



## OzzmanFloyd120 (Dec 3, 2008)

lilkiduno said:


> No, I didn't forget about the economy... I also didn't forget about the CEOs of the large corporations walking away with millions as their companies went bankrupt, either.



They don't build PCs though, they just rape and pillage. Modern day Vikings in a sense.


----------



## btarunr (Dec 3, 2008)

There will always be a market for a $500~$700 card. When people can buy two to three of those these days, the same people can jolly well afford a single card. No more economy discussion.


----------



## DrunkenMafia (Dec 3, 2008)

OzzmanFloyd120 said:


> I doubt you would see enough gain to justify the price in any way.




That's what everyone said about tri-SLI....... :shadedshu


And then the i7 came out and changed all that.


----------



## OzzmanFloyd120 (Dec 3, 2008)

DrunkenMafia said:


> That's what everyone said about tri-SLI....... :shadedshu
> 
> 
> And then the i7 came out and changed all that.



I just don't see Six-L-I being realistic.


----------



## Solaris17 (Dec 3, 2008)

OzzmanFloyd120 said:


> I just don't see Six-L-I being realistic.



I'm going to do it with my GX2's.


----------



## wolf (Dec 3, 2008)

I would gladly pay 400-500 for one, but in the last few months the exchange rate has gone to poop and one Aussie dollar now buys only ~62 US cents, as opposed to the 90+ cents we were getting for a loooooooong time; it was even at 98 cents for a while.

Right now in Aus, a 4870X2 is 800-900 AUD and a GTX280 is 650-750.....


----------



## OzzmanFloyd120 (Dec 3, 2008)

wolf said:


> I would gladly pay 400-500 for one, but in the last few months the exchange rate has gone to poop and one Aussie dollar now buys only ~62 US cents, as opposed to the 90+ cents we were getting for a loooooooong time; it was even at 98 cents for a while.
> 
> Right now in Aus, a 4870X2 is 800-900 AUD and a GTX280 is 650-750.....



Aren't you always telling me how you don't like multi-gpu solutions and would rather have a 300nm fab with enough power to make Christopher Reeve walk again?


----------



## wolf (Dec 3, 2008)

OzzmanFloyd120 said:


> Aren't you always telling me how you don't like multi-gpu solutions and would rather have a 300nm fab with enough power to make Christopher Reeve walk again?



Traditionally, yeah. I'm a believer in one kickass GPU and, unlike some, believe multi-GPU is not the way of the future...

If a true multi-core approach can be made, with a single, large, shared framebuffer, I may reconsider....

However, it's a given I like to have the best of the best; my thoughts are this will be THE card.

That, and for the time being I'm done with my 4870. I dislike the driver control panel, and in all honesty the 4870 appears cheaply made compared to every nvidia card I've ever owned, bar the FX5600XT.

Also, I dove right in on 2x 512MB models, which I regret given 1GB models came out VERY soon after.


----------



## OzzmanFloyd120 (Dec 3, 2008)

wolf said:


> ...in all honesty the 4870 appears cheaply made compared to every nvidia card I've ever owned, bar the FX5600XT



I know what you mean; my GX2 feels sturdy enough that I feel like I could beat my roommate to death with it and still have a working card.


----------



## eidairaman1 (Dec 3, 2008)

It's all a cooler, nothing more, nothing less.


----------



## wolf (Dec 3, 2008)

eidairaman1 said:


> It's all a cooler, nothing more, nothing less.



You'd be surprised; there's two nvidia 9800's in there.


----------



## Bjorn_Of_Iceland (Dec 3, 2008)

Hope the price of this thing goes down as fast as the 9800GX2's did, though. hehe. 

Anyway, quad SLI still has kinks. A single GTX295 would be good though.

But I somehow get the new nVidia naming scheme now... you can somehow get a glimpse of what kind of performance you will get when you compare them. Look at the GTX 260, 265, 280... and now the GTX 295. I'm seeing that its performance gap from the GTX 280 is not that large compared to the GTX 260 vs the GTX 280...


----------



## OzzmanFloyd120 (Dec 3, 2008)

I'd like to see the beastly cooler they plan on putting on this thing. The GT200 is a friggen hot, sweaty hog; I can't imagine they like being near each other very much.
Come to think of it, I'd like to see the power consumption too.


----------



## btarunr (Dec 3, 2008)

OzzmanFloyd120 said:


> Come to think of it I'd like to see the power consumption too.



It should be lower than that of an HD 4870 X2. Below is a chart showing that a 65nm GTX 260 Core 216 already has lower average power consumption than an HD 4870. You can expect that to fall even further with the newer 55nm process.

http://i4.techpowerup.com/reviews/Leadtek/GeForce_GTX_260_Extreme_Plus/images/power_average.gif

http://www.techpowerup.com/reviews/Leadtek/GeForce_GTX_260_Extreme_Plus/24.html


----------



## DarkMatter (Dec 3, 2008)

Bjorn_Of_Iceland said:


> Hope the price of this thing goes down as fast as the 9800GX2's did, though. hehe.
> 
> Anyway, quad SLI still has kinks. A single GTX295 would be good though.
> 
> But I somehow get the new nVidia naming scheme now... you can somehow "feel" what kind of performance you will get when you compare them. Look at the GTX 260, 265, 280... and now the GTX 295. I'm seeing that its performance gap from the GTX 280 is not that large compared to the GTX 260 vs the GTX 280...



Hmm, I don't think it works that way. I think it's more like this: they have three segments (GT, GTS and GTX) and probably 2-4 models per segment, choosing between x20, x40, x60 and x80 (where x is the generation, i.e. GTX *2*60). Here the chosen number does indicate, more or less, the kind of performance you can expect relative to the others. Then later come the x30, x50, x70 and x90, which are refreshes of the others, with some probable improvements but similar in performance and price, or simply the same specs sans clock speeds, which could change. Just like the 7800 and 7900 series. Then I think they have decided to use the +5 increment for dual cards, as opposed to using it as a further refresh like in the 7950. Why? Because they don't need to (explained below). A clear example is the new name for the 8800 GT, or 9800 GT, or GT 150 (AFAIK):

- *GT* 150 because it will be mainstream.
- GT *1*50 because it's one generation older than GT200 cards. I mean 2 comes after 1, not that the 1 means "1 generation older".
- GT 1*5*0 because it's in the middle of the segment. EDIT: oh, I forgot to mention: 5 and not 4 or 6 because it's a refresh.
- GT 15*0* because it's a single GPU.

Rumors say that the 55nm GT200 cards will get the GTX 290, GTX 270 and GTX 295 names, and they match what I said. Now what happens if they want to release another refresh of GT200? They don't have more numbers... or do they? Yes they do. By the time they'd make another refresh of the cards, G300 will probably be out, so GT200 cards will no longer be high-end. So what would the names be? Easy: GTS 240, GTS 270, GTS 295, or whatever other second numbers they choose, should they change the specs of the cards.

All in all, the difference between the GTX 295 and GTX 280 will be bigger than between the 280 and 260. The overall difference to the HD4870 X2 is already bigger, AFAIK, and the GTX 295 WILL be quite a bit faster than the X2. The reason is simple: apparently the card will be released, so it must be faster, because Nvidia can't afford the bad publicity they would get from releasing a dual-GPU card that doesn't outperform the X2. They have surely tested dual GT200 cards already; albeit with 65nm chips, impossible to implement in a consumer card, they surely have made some. The fact that they are releasing the GTX 295 is already enough proof that it will be considerably faster, IMO.
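For what it's worth, the digit-by-digit reading above can be written down as a toy decoder. This is purely a sketch of the scheme as speculated in this post; `decode_name` and its field names are invented for illustration, not anything NVIDIA publishes:

```python
# Toy decoder for the speculated naming scheme: segment prefix (GT/GTS/GTX),
# then a generation digit, a tier digit, and a final digit where 5 marks a
# refresh/dual card and 0 a plain single-GPU model. Purely illustrative.

def decode_name(name: str) -> dict:
    """Split a name like 'GTX 295' into the parts the scheme implies."""
    segment, number = name.split()
    return {
        "segment": segment,              # GT / GTS / GTX
        "generation": int(number[0]),    # 2 -> GT200-era card
        "tier": int(number[1]),          # 9 -> top of the segment
        "refresh_or_dual": number[2] == "5",
    }

print(decode_name("GTX 295"))
# {'segment': 'GTX', 'generation': 2, 'tier': 9, 'refresh_or_dual': True}
```

By this reading, GTX 280 decodes as a plain single-GPU top model, while GTX 295 lands one tier above it with the refresh/dual digit set.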


----------



## Bjorn_Of_Iceland (Dec 3, 2008)

Makes sense. Let's just wait and see.


----------



## GFC (Dec 3, 2008)

Hm... I'm just wondering what kind of cooler it will have. I mean, if they use two PCBs they don't really have a lot of space. I hope they don't make the same mistake as with the 9800GX2, which had problems with its NF200 chip overheating...


----------



## Hayder_Master (Dec 3, 2008)

216 SP? Surely they can't beat the 4870x2.


----------



## DrunkenMafia (Dec 3, 2008)

OzzmanFloyd120 said:


> I just don't see Six-L-I being realistic.



Is that the official name for it? What source did you get that from?


----------



## InnocentCriminal (Dec 3, 2008)

I look forward to reviews.


----------



## HossHuge (Dec 3, 2008)

newbielives said:


> I can finally play pong at 10,000 fps



Tee Hee


----------



## newtekie1 (Dec 3, 2008)

hayder.master said:


> 216 SP? Surely they can't beat the 4870x2.



Why wouldn't it?  The 216SP GTX260 beats a single 4870, why wouldn't two of them beat two 4870s?


----------



## TooFast (Dec 3, 2008)

newtekie1 said:


> Why wouldn't it?  The 216SP GTX260 beats a single 4870, why wouldn't two of them beat two 4870s?



Because CrossFire scales better than SLI. Don't forget ATI will have a refresh of the X2, for sure.


----------



## OzzmanFloyd120 (Dec 3, 2008)

DrunkenMafia said:


> Is that the official name for it? What source did you get that from?



Sure is, this is the hierarchy of sli.

Single card
SLi
S-L-Tri
Different type of spider-L-i
S-L-five (yes, it works. Trust me )
Six-L-i
Seven-L-i
Holy shit where did you get a capable PSU-L-i

Sheesh, I thought everyone knew that!


----------



## btarunr (Dec 3, 2008)

TooFast said:


> Because CrossFire scales better than SLI. Don't forget ATI will have a refresh of the X2, for sure.



Even an SLI of two 192 SP cards goes neck and neck with an R700.


----------



## newtekie1 (Dec 3, 2008)

TooFast said:


> Because CrossFire scales better than SLI. Don't forget ATI will have a refresh of the X2, for sure.



How will they?  What will they do to it?  The HD4870x2 is already pretty much the best ATi has.  All they can do is raise clock speeds, which isn't likely to improve performance all that much.



btarunr said:


> Even an SLI of two 192 SP cards goes neck and neck with an R700.



Exactly.


----------



## Skillz (Dec 3, 2008)

Knowing these morons with their price exploits, it's going to cost an arm and two legs plus a penis just to get one... so I won't even bother getting my hopes up, because I'm not losing a penis over a damn vid card.


----------



## Megasty (Dec 3, 2008)

I haven't even put my 4870x2 in a rig yet and NV is already coming out with the impossible, although it's a little watered down. It's still great to see that they overcame that hump. Hopefully it won't cost $600+. Hell, I'm still only using one 4870 anyway, so I probably won't get one (lying terribly )


----------



## Binge (Dec 3, 2008)

The only thing that might stand a chance is CrossFired 4870 1GBs that are overclocked to perfection.  This is assuming the 295 isn't a watered-down version of the 260 216, like the 4870x2 is two watered-down 4870 1GBs.  The 4870x2 uses RAM that won't clock as high as the everyday 4870 1GB cards, and they can't clock nearly as high on the cores either.


----------



## mdm-adph (Dec 3, 2008)

newtekie1 said:


> How will they?  What will they do to it?  The HD4870x2 is already pretty much the best ATi has.  All they can do is raise clock speeds, which isn't likely to improve performance all that much.



Nope -- four chips on one card.  It's been done before.


----------



## DarkMatter (Dec 3, 2008)

TooFast said:


> Because CrossFire scales better than SLI. Don't forget ATI will have a refresh of the X2, for sure.



That's the most fallacious meme to appear in recent IT history. Crossfire scaled better somewhere between the G80 and G94 launches, primarily based on the fact that Crossfired R6xx cards scaled better than SLI back then. Ever since the driver adjustments made for the G94, aka 9600 GT, SLI scaling has been as efficient as Crossfire and sometimes faster (and sometimes slower, of course). To the point where 9600 GT SLI is significantly faster than HD3870 Crossfire, even though the single cards are comparable. Today we have the 9800 GT dangerously close to Crossfired HD4850's in many cases, which is a clear sign SLI is scaling better there. On the other hand, 9800GTX SLI seems to lose ground and is usually not faster than the HDs. AFAIK GTX260 SLI is again faster than Crossfired HD4870's, again showing SLI being superior in that particular case, as the single HDs are faster. The GTX280 scales much better than the X2, and so on.

Anyway, all of the above pretty much only applies to Core2 systems. X58 and/or Nehalem has made SLI much, much faster than Crossfire. X58 has clearly destroyed some bottlenecks, and even Tri-SLI GTX280's scale almost to perfection. Whether it's because of the drivers that this does not happen with Crossfire remains to be seen. Up until now, two driver releases and many hotfixes haven't changed that, but we'll see.

Part of the problem for SLI in the past was that no Nvidia chipset (nor ATI's, anyway) was as fast as Intel chipsets to begin with, so SLI already started at a disadvantage against Crossfire on Intel chipsets. X58 has evened the field, and that's why SLI is doing better now, IMHO.
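The "scaling" being argued over here boils down to one number: how much of the ideal 2x speedup a dual-GPU setup actually delivers. A minimal sketch, with made-up frame rates standing in for real benchmark results:

```python
# Scaling efficiency: fraction of the ideal N-GPU speedup actually achieved.
# The frame rates below are invented placeholders, not benchmark results.

def scaling_efficiency(fps_single: float, fps_multi: float, gpus: int = 2) -> float:
    """fps_multi relative to 'gpus' perfect copies of the single card."""
    return fps_multi / (gpus * fps_single)

# Invented example: one card at 40 fps, the pair at 72 fps
eff = scaling_efficiency(40.0, 72.0)
print(f"{eff:.0%} of the ideal 2x speedup")  # 90% of the ideal 2x speedup
```

Comparing that figure for an SLI pair against the same figure for a CrossFire pair (on the same platform) is the apples-to-apples way to settle which setup scales better.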


----------



## Megasty (Dec 3, 2008)

mdm-adph said:


> Nope -- four chips on one card.  It's been done before.



Bah, I have that card; the 3dfx whatchamacallit. Once I cleaned all the dust off it, it looked like a museum piece. Just thinking of something like that made with today's high-end technology: the thing would need 1000 watts just to power up. I wouldn't mind seeing that, but I know my PSU would.


----------



## mdm-adph (Dec 3, 2008)

Megasty said:


> Bah, I have that card; the 3dfx whatchamacallit. Once I cleaned all the dust off it, it looked like a museum piece. Just thinking of something like that made with today's high-end technology: the thing would need 1000 watts just to power up. I wouldn't mind seeing that, but I know my PSU would.



Mark my words -- it will be done.  

Say... two dual-core R800's.


----------



## Solaris17 (Dec 3, 2008)

DarkMatter said:


> That's the most fallacious meme to appear in recent IT history. Crossfire scaled better somewhere between the G80 and G94 launches, primarily based on the fact that Crossfired R6xx cards scaled better than SLI back then. Ever since the driver adjustments made for the G94, aka 9600 GT, SLI scaling has been as efficient as Crossfire and sometimes faster (and sometimes slower, of course). To the point where 9600 GT SLI is significantly faster than HD3870 Crossfire, even though the single cards are comparable. Today we have the 9800 GT dangerously close to Crossfired HD4850's in many cases, which is a clear sign SLI is scaling better there. On the other hand, 9800GTX SLI seems to lose ground and is usually not faster than the HDs. AFAIK GTX260 SLI is again faster than Crossfired HD4870's, again showing SLI being superior in that particular case, as the single HDs are faster. The GTX280 scales much better than the X2, and so on.
> 
> Anyway, all of the above pretty much only applies to Core2 systems. X58 and/or Nehalem has made SLI much, much faster than Crossfire. X58 has clearly destroyed some bottlenecks, and even Tri-SLI GTX280's scale almost to perfection. Whether it's because of the drivers that this does not happen with Crossfire remains to be seen. Up until now, two driver releases and many hotfixes haven't changed that, but we'll see.
> 
> Part of the problem for SLI in the past was that no Nvidia chipset (nor ATI's, anyway) was as fast as Intel chipsets to begin with, so SLI already started at a disadvantage against Crossfire on Intel chipsets. X58 has evened the field, and that's why SLI is doing better now, IMHO.





I owned them and I can support this. The 8600's and 9600GT's, specifically the 9600GT's, are IMO by far the best-scaling nvidia cards to date... the 8600 started it, IMO, but the 9600GT's were terribly efficient.


----------



## wolf (Dec 3, 2008)

Nvidia SLI scaling has come a long way, +1. I had 8600GT's in SLI and they scaled B-E-A-utifully, and we've all seen what 9600GT's can do. GTX260's compete very well, so it will be interesting.

Remember, I'm not saying that either one is better, because neither can decisively be called better.

And as for it not beating an X2, I think DarkMatter has a point: the fact that nvidia are definitely going ahead means it has to take the crown; they wouldn't release it if it didn't.

That doesn't mean ATI won't counter right back with a newer revision R700 with faster cores and faster GDDR5... but I believe, for at least a short time, this new card will be the king.


----------



## OzzmanFloyd120 (Dec 4, 2008)

wolf said:


> Nvidia SLI scaling has come a long way, +1. I had 8600GT's in SLI and they scaled B-E-A-utifully, and we've all seen what 9600GT's can do. GTX260's compete very well, so it will be interesting.
> 
> Remember, I'm not saying that either one is better, because neither can decisively be called better.
> 
> ...



This is what the market needs; the back-and-forth battle keeps the market competitive, and us consumers walk away winning.

I'm happy ATI found a way to stop NV from just re-releasing the G92 over and over with different names.


----------



## wolf (Dec 4, 2008)

OzzmanFloyd120 said:


> This is what the market needs; the back-and-forth battle keeps the market competitive, and us consumers walk away winning.
> 
> I'm happy ATI found a way to stop NV from just re-releasing the G92 over and over with different names.



Amen to that; the RV770 has been a godsend even if you didn't buy one.

Having said that, 45nm processes aren't far away. What's the bet GT200 AND G9x are both re-released in this form too?


----------



## Binge (Dec 4, 2008)

nV only really has to do a die shrink.  The dual-GPU card is just a solution to stir the market a little.  I really wanted a 55nm GTX280, but it is coming too slowly.


----------



## wolf (Dec 4, 2008)

Binge said:


> nV only really has to do a die shrink.  The dual-GPU card is just a solution to stir the market a little.  I really wanted a 55nm GTX280, but it is coming too slowly.



in your sig it says GTX280 192, is that a 260?


----------



## DarkMatter (Dec 4, 2008)

wolf said:


> Amen to that; the RV770 has been a godsend even if you didn't buy one.
> 
> Having said that, 45nm processes aren't far away. What's the bet GT200 AND G9x are both re-released in this form too?



+1 to RV770. Although I don't agree with the strategy and strongly believe it's not self-sustainable, this particular implementation of it was like a godsend.

About the rehashes, I don't have the smallest issue with them as long as they are still competitive in the segment they are released into. When I buy a graphics card I look for much more than just the name; more and more people have started to do so, and everyone SHOULD. I have temporarily worked in stores now and then, and TBH I have yet to see anyone buy a dishwasher, a microwave, a drill or anything similar without asking about every single aspect of that machine. Same with cars, MP3 players, cellphones and many other things; people care to learn and ask about those things. BUT when it comes to hardware, they come to the store and ask for "a powerful gaming card for 50 euro": "-the Nvidia GeForce. -Yeah, but which one? -Huh?", etc. Personally, I couldn't care less if that kind of person is confused by the name changes: people who spend 10 minutes asking about every property of a $50 drill and then shell out $300 for a graphics card without caring one bit about what they are getting, except the name (though they always find the time to come back later claiming it was not what they needed, or that it doesn't fit in their slot, etc.). Sorry, but I don't care. Just as I don't care if the 8800 GT continues selling with that name, or as the 9800 GT, or the GT 150; it's the same card, and it has demonstrated it's still a worthwhile buy in the current market.

\end of rant lol.


----------



## wolf (Dec 4, 2008)

Agreed, it's really not that important what the name is; just know what you are buying.

All my gfx card purchases are based on (in no particular order):

1. My brand preference at the time; for example, I know my next card will be nvidia.

2. What the card has on offer in terms of tech specs, i.e. shaders, ROPs, memory, memory bus, etc. Very rarely will power consumption or heat be a swaying factor, as I have expendable income for a power bill and I aftermarket-cool just about every card I get.

3. The all-important benchmarks: how well does it stack up against what I own now, and against what else is on offer in the same and other segments.

4. Price/performance ratio. Not the biggest factor, but quite important nonetheless; that's how I justified buying an 8800GTX two years after release, I paid LESS than the going price/performance for it.

The name really doesn't matter jack to me; however, I do enjoy anything with GT in it, Ultra in it, or any amount of X's, but that's just wank factor really.


----------



## Darkrealms (Dec 4, 2008)

*DarkMatter*, I agree with you and *wolf* on that.  I see people all the time asking everything about a product, but when it comes to computer components they just ask "what's good" or "what's best for $XX" and take whatever they get.
I have a friend who thinks that if it costs more, it must be that much better . . .  He has a 9600GT now because I told him to shut up and take it, because that's what was best for him and his usage.

I bought my GTX260 because I wanted more than I could use and I wanted to play.  The GTX280 wouldn't make a difference for me because I couldn't make use of it (a 22" only goes to 1680x1050).

_Although I may SLI it with another one to get my F@H score higher . . .  ; )_


----------



## Skillz (Dec 4, 2008)

They need to stop this stupid cycle of adding more to the equation; they need to make the equation better. It will only get more and more complicated with all this chip-adding: now it's 2 chips on a card, pretty soon we'll see 4, and a move from 10" to 12" in card length. They're getting tunnel vision when it comes to new design.
Not only will this be a waste, it's also going to get more difficult for developers to write code that utilizes these dual chips at 100% efficiency. Take the 4870X2 for example: I know that card hasn't reached its full potential, and by the time it does it will be at its EOL, with faster cards out at half the original retail price.
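The efficiency worry above is essentially Amdahl's law applied to multi-GPU rendering: if only part of the frame workload actually scales across chips, each extra GPU buys less. A rough sketch, with the parallel fraction purely illustrative (not a measured figure for any real card):

```python
def multi_gpu_speedup(parallel_fraction: float, num_gpus: int) -> float:
    """Amdahl's law: overall speedup when only part of the work scales across GPUs."""
    serial = 1.0 - parallel_fraction          # work that stays single-GPU
    return 1.0 / (serial + parallel_fraction / num_gpus)

# Illustrative numbers only: if 80% of frame time scales across chips,
# a second GPU gives ~1.67x, and a hypothetical quad setup only ~2.5x.
for gpus in (1, 2, 4):
    print(gpus, round(multi_gpu_speedup(0.8, gpus), 2))
```

The diminishing returns fall straight out of the arithmetic: doubling chips never doubles frames unless the parallel fraction is 100%.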


----------



## OzzmanFloyd120 (Dec 4, 2008)

Skillz said:


> They need to stop this stupid cycle of adding more to the equation; they need to make the equation better. It will only get more and more complicated with all this chip-adding: now it's 2 chips on a card, pretty soon we'll see 4, and a move from 10" to 12" in card length. They're getting tunnel vision when it comes to new design.
> Not only will this be a waste, it's also going to get more difficult for developers to write code that utilizes these dual chips at 100% efficiency. Take the 4870X2 for example: I know that card hasn't reached its full potential, and by the time it does it will be at its EOL, with faster cards out at half the original retail price.



I disagree; by that logic they'd just end up making 100nm chips that burn red hot when they're running.
I'm not a silicon engineer, but I'll bet it's pretty hard to keep cramming transistors, SPs, and ROPs onto those dies.


----------



## Binge (Dec 4, 2008)

wolf said:


> in your sig it says GTX280 192, is that a 260?



U get an award for correcting my spEELing error.

@Ozzy:  SPs and ROPs etc. can always be refined; look at how far we've come from vacuum tubes.  Stuff can always get smaller and use less energy.  AMD did very little except make a fast chip, fast RAM, and so on... I've said it before and I'll say it again: there's a huge IQ difference in 3D between the nV and ATi cards, and at the moment nV holds the crown for best 3D picture.


----------



## Bjorn_Of_Iceland (Dec 4, 2008)

DarkMatter said:


> Part of the problem for SLI in the past was that no Nvidia chipset (nor Ati's anyway) was as fast as Intel chipsets to begin with, so SLI already started with a disadvantage against Crossfire in Intel chipsets. X58 has evened the field and that's why SLI is doing better now IMHO.


I do agree. In fact, the benchies shown on various sites led me to think that SLi was somewhat crippled by the nVidia chipset architecture: single-card performance on the new platform differs little or not at all from the previous one, whilst multi-GPU setups are REALLY scaling well. This only shows that nVidia SLi really was held back by its parents... now that it has moved out of the house, it's time for it to party like a rockstar.

Intel, with a more antiseptic grasp on the whole platform itself, has unleashed SLi to its full rabid potential.


----------



## eidairaman1 (Dec 4, 2008)

What's funny is that since Intel is selling the X58, Nvidia can't use SLI as their main selling point on Intel motherboards anymore; with AMD it's a different story, until the same fate befalls NV's SLI there too.


----------



## DarkMatter (Dec 4, 2008)

eidairaman1 said:


> What's funny is that since Intel is selling the X58, Nvidia can't use SLI as their main selling point on Intel motherboards anymore; with AMD it's a different story, until the same fate befalls NV's SLI there too.



I don't understand what you mean.


----------



## eidairaman1 (Dec 4, 2008)

For quite some years NV used SLI as the main selling point for motherboards with their chipsets: you could only get SLI if you had an NV chipset motherboard. Now that has changed. That basically opens up the market to users who switch video cards when one side's offerings are better than the other's, etc.


----------



## DarkMatter (Dec 4, 2008)

eidairaman1 said:


> For quite some years NV used SLI as the main selling point for motherboards with their chipsets: you could only get SLI if you had an NV chipset motherboard. Now that has changed. That basically opens up the market to users who switch video cards when one side's offerings are better than the other's, etc.



Ah, that's what I thought, but I didn't quite understand the sentence "that Means that Nvidia Cant use SLI as their Main Selling point on Intel Motherboards". You meant on Nvidia motherboards for Intel processors, and that's what I didn't get.

You are partially right, but IMHO the thing has lately worked in completely the opposite way. Intel chipsets were much better, so anyone even considering multi-GPU in the future took an Intel board and an Ati graphics card; examples of people on TPU deciding between Nvidia/Ati who went Ati just because of that are extremely common. If anything, the change works to the benefit of Nvidia's GPU division. The chipset team, that's another matter; as you said, they still have AMD, and they could make good mainstream chipsets for Nehalem in the future, who knows; the 750i wasn't so bad after all (stability problems aside).


----------



## eidairaman1 (Dec 4, 2008)

Considering it was a Biostar T-Power that hit the 6 GHz barrier with a C2E8400, and it took Asus a higher-dollar board and a higher-dollar processor to get a little past 6 GHz.

But yeah, my personal preference: I'm unsure I'll ever use SLI or Crossfire, and I always think there should be excellent non-SLI/Crossfire chipsets out there. Look at my current combo; it works flawlessly (I know it's old tech, but it can hold its own in many games, and there's room for future tweaking; I want to reach into the 2.4-3.0GHz arena and run legacy 16-bit games).

For me, I wouldn't mind running an NV chipset/AMD CPU/ATI graphics card, as that is what my current machine is, and it has served me well for the last 5-6 years (one year I couldn't use it because I was overseas).


----------



## wolf (Dec 4, 2008)

Binge said:


> U get an award for correcting my spEELing error.



So you'll send me the GTX280 

plz plz plz  heheheheh


----------



## Skillz (Dec 4, 2008)

OzzmanFloyd120 said:


> I disagree; by that logic they'd just end up making 100nm chips that burn red hot when they're running.
> I'm not a silicon engineer, but I'll bet it's pretty hard to keep cramming transistors, SPs, and ROPs onto those dies.



No one said anything about cramming more transistors. I'm talking about a complete overhaul, just like they did with the Core i7; one main innovative move was to put the memory controller directly on the chip... a new design. GPUs have had the same architecture for years; even you, sir, seem to be thinking with tunnel vision.
At one point in time the only way to have multithreading in a computer was by having two processors; the problem was solved by putting multiple cores on a single chip. The CPU industry has made more leaps in innovation; all I'm saying is maybe the GPU industry should take notes and do the same.


----------



## Tatty_One (Dec 4, 2008)

eidairaman1 said:


> For quite some years NV used SLI as the main selling point for motherboards with their chipsets: you could only get SLI if you had an NV chipset motherboard. Now that has changed. That basically opens up the market to users who switch video cards when one side's offerings are better than the other's, etc.



It just strengthens Intel's position against AMD in the CPU wars. I am guessing that unless things change, AMD will lose on both fronts to a certain degree. On the CPU front, even more people will go for Intel chipset motherboards now, as they will have both GPU vendors to play with, mainly of course if they are seeking a multi-GPU platform. And on the GPU front, those people who only bought ATI cards because they wanted the multi-GPU option on an Intel chipset board can go back to NVidia. Intel wins, NVidia wins, so who loses? Obviously loss is subjective, but it's all good for the consumer.


----------



## OzzmanFloyd120 (Dec 5, 2008)

Skillz said:


> You are a fool for thinking so; no one said anything about cramming more transistors. I'm talking about a complete overhaul, just like they did with the Core i7; one main innovative move was to put the memory controller directly on the chip... a new design. GPUs have had the same architecture for years; even you, sir, seem to be thinking with tunnel vision.
> At one point in time the only way to have multithreading in a computer was by having two processors; the problem was solved by putting multiple cores on a single chip. The CPU industry has made more leaps in innovation; all I'm saying is maybe the GPU industry should take notes and do the same.



They are working on exactly that. It's called "Ray Tracing".
Also, you couldn't be more wrong about the architecture not having changed in all this time. It used to be that a GPU could only do one operation per SP; then they doubled that, and ATi doubled it again with some crazy change they made to the architecture (I'm too lazy to look up what it was called).
Another point is that a complete overhaul of the GPU architecture would cause havoc with older games, because they were designed to run on the current architecture. This problem is the reason that Ray Tracing hasn't hit the market full-force.
Finally, don't call names. It's not nice.


----------



## Skillz (Dec 5, 2008)

OzzmanFloyd120 said:


> They are working on exactly that. It's called "Ray Tracing".
> Also, you couldn't be more wrong about the architecture not having changed in all this time. It used to be that a GPU could only do one operation per SP; then they doubled that, and ATi doubled it again with some crazy change they made to the architecture (I'm too lazy to look up what it was called).
> Another point is that a complete overhaul of the GPU architecture would cause havoc with older games, because they were designed to run on the current architecture. This problem is the reason that Ray Tracing hasn't hit the market full-force.
> Finally, don't call names. It's not nice.



"then they doubled that and ATi again doubled that with some crazy change they made in architecture"

Their you go again with your tunnel vision, my point is now clarified, adding more to the equation does NOT over haul an architecture...second my core i7 is doing a great job running older programs, third where in the world does all that money go when you upgrade to a new 500 dollar GPU cause i sure as hell haven't seen anything innovative since not to mention DX10 is a fat joke, pointless to argue with someone of your intellect:shadedshu.


----------



## OzzmanFloyd120 (Dec 5, 2008)

Skillz said:


> "then they doubled that and ATi again doubled that with some crazy change they made in architecture"
> 
> Their you go again with your tunnel vision, my point is now clarified, adding more to the equation does NOT over haul an architecture...second my core i7 is doing a great job running older programs, third where in the world does all that money go when you upgrade to a new 500 dollar GPU cause i sure as hell haven't seen anything innovative since not to mention DX10 is a fat joke, pointless to argue with someone of your intellect:shadedshu.



For a start, comparing a GPU to a CPU is worlds apart. However, if I MUST explain: back when the X800 series came out, ATi released a major architecture change where each SP (or pixel pipeline, as they called it in those days) could drive double the ROP output, effectively creating four times the pixel grunt per operation. They called it the Quad Dispatch System. This is why the X800 series of GPUs was legendary.
But if that's not enough of an "architecture change" for you, then consider that in essence the last GPU to take a radically different approach was the NV2 core, which would generate a single polygon and then round its edges to make it appear 3D... Care to guess what happened to that "architecture"? It died, and it took the Sega Saturn along with it, and I'll tell you why: every single game had to be re-programmed to work with the way the GPU worked. This is the same reason that Ray Tracing hasn't taken a huge market share yet: they can't get it to work in conjunction with the current texture/shader system.
IN ADDITION, we don't need huge architectural changes thanks to software like OpenGL and DirectX, which in many cases makes just as much difference as the grunt of the GPU.
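As an aside, the raw pixel throughput being argued about here comes down to simple arithmetic: theoretical fill rate is just ROP count times core clock. A quick sketch using the commonly cited GTX 260 figures (28 ROPs at 576 MHz; treat them as approximate):

```python
def fill_rate_gpixels(rops: int, core_clock_mhz: float) -> float:
    """Theoretical pixel fill rate in gigapixels/second: ROPs * core clock."""
    return rops * core_clock_mhz * 1e6 / 1e9

# Commonly cited GTX 260 figures (approximate): 28 ROPs at 576 MHz
print(round(fill_rate_gpixels(28, 576), 2))  # -> 16.13
```

Real-world throughput sits below this ceiling, of course; the point is only that these headline numbers are multiplication, not architecture.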


----------



## btarunr (Dec 5, 2008)

Calm down people. Please stick to the topic.


----------



## erocker (Dec 5, 2008)

Everyone please tone down your language and do not insult other members while trying to get your point across.  Behaving like children doesn't make yourselves sound very credible, nor does it help get your point across any better.  Be good.


----------



## Wile E (Dec 5, 2008)

Binge said:


> There's a huge IQ difference in 3D between the nV and ATi cards, and at the moment nV holds the crown for best 3D picture.


I 100% disagree. I just came from an 8800GT to this 4850, and IQ is the same between both. A little different? Yes. One better than the other? No. Just different.


----------



## Tatty_One (Dec 5, 2008)

Wile E said:


> I 100% disagree. I just came from an 8800GT to this 4850, and IQ is the same between both. A little different? Yes. One better than the other? No. Just different.



Agreed, although at 48 my eyes are rather old. As you know, I just came from a GTX260 to two HD4850s, and my weary eyes can't see any difference... maybe I need specs.


----------



## Analog_Manner (Dec 5, 2008)

Skillz said:


> "then they doubled that and ATi again doubled that with some crazy change they made in architecture"
> 
> Their you go again with your tunnel vision, my point is now clarified, adding more to the equation does NOT over haul an architecture...second my core i7 is doing a great job running older programs, third where in the world does all that money go when you upgrade to a new 500 dollar GPU cause i sure as hell haven't seen anything innovative since not to mention DX10 is a fat joke, pointless to argue with someone of your intellect:shadedshu.



If you think DX10 is a joke, then I don't think you have ever played World in Conflict in DX10 and compared it to DX9.


----------

