# NVIDIA GeForce GTX 295 Spotted



## malware (Dec 10, 2008)

VR-Zone scored the first photo of the upcoming GeForce GTX 295, a card that's reported to make its first appearance at next year's Consumer Electronics Show (CES). Unlike previous reports suggested, the card features a sandwich design, like most dual-GPU cards released by NVIDIA. Two 55nm GT200 GPUs are incorporated in this card. From the picture we can also see two DVI ports and one DisplayPort. The source also reports the card uses an 8+6 pin connector combo to deliver external power. Pricing is yet to be disclosed, but card makers are speculating that NVIDIA will price it competitively against AMD's 4870X2.





*View at TechPowerUp Main Site*


----------



## Fitseries3 (Dec 10, 2008)

no word on SLI support?


----------



## Binge (Dec 10, 2008)

That just makes my day worse... That card looks like poo.  I hope they iron out the aesthetics before release.


----------



## Error 404 (Dec 10, 2008)

Binge said:


> That just makes my day worse... That card looks like poo.  I hope they iron out the aesthetics before release.



Of course it looks awful, it's a pre-release work-in-progress. I think. (I hope)
It's basically like the 9800 GX2 but without the metal cover, and much more power!


----------



## Binge (Dec 10, 2008)

But it looks like it will have less power than 2xGTX 260


----------



## GFC (Dec 10, 2008)

Binge said:


> But it looks like it will have less power than 2xGTX 260


How can it have less power? It's built on a 55nm process; unless they down-clock it because of the heat, it's going to be faster.
The main concern for me is: why the hell did they use a dual-PCB setup like last time? Because of that, there's barely any place for the fan, which means it's going to be a *really* hot card.


----------



## DarkMatter (Dec 10, 2008)

I don't know what makes you think it will draw more power than the 9800 GX2; they both have the same 6-pin + 8-pin connectors.

I don't know why it would be *really* hot either. The single-PCB design of the HD3870 X2 didn't help it much; the GX2 was cooler and quieter, and indeed the temps were about the same as on single 9800s. Everything else was memes created long before the card even launched that kept circulating thanks to internet pop culture.

Remember it's a 55nm GT200 being used; we don't know anything about it yet. I say it's going to draw less power than the HD4870 X2 and have lower temps. Speculating like everyone else, of course.


----------



## Error 404 (Dec 10, 2008)

Yeah, the GX2 uses exactly the same setup as this dual-PCB card, and considering how powerful the GX2 is it should be interesting to see how well this performs.
The real question will be whether this supports Tri-SLI: 6 cores of POWAH!


----------



## DarkMatter (Dec 10, 2008)

Error 404 said:


> Yeah, the GX2 uses exactly the same setup as this dual-PCB card, and considering how powerful the GX2 is it should be interesting to see how well this performs.
> The real question will be whether this supports Tri-SLI: 6 cores of POWAH!



I can almost answer that for you already. NO.

Just speculation based on past announcements, but IIRC nothing more than Quad SLI will ever be supported. Never say never, but...


----------



## Binge (Dec 10, 2008)

GFC said:


> How can it have less power? It's built on a 55nm process; unless they down-clock it because of the heat, it's going to be faster.
> The main concern for me is: why the hell did they use a dual-PCB setup like last time? Because of that, there's barely any place for the fan, which means it's going to be a *really* hot card.



You forget that they are selling the GTX260 as 55nm as well, so 2x GTX260 will be faster than this dual card solution.  Just look at 2x4870 vs 4870x2.


----------



## DarkMatter (Dec 10, 2008)

Binge said:


> Just look at 2x4870 vs 4870x2.



I wouldn't draw any conclusions based on that. Different architectures, and last I checked GTX260 SLI is slightly faster than 2xHD4870, while a single GTX260 is slower on its own. It's too soon to conclude anything IMO.


----------



## Bjorn_Of_Iceland (Dec 10, 2008)

> But it looks like it will have less power than 2xGTX 260


Doubt it. It's a 55nm 260 with a 448-bit memory interface on each, and a full *240* shader units on each core. Kinda like two beefed-up 260s.

Source:
http://en.expreview.com/2008/12/09/geforce-gtx295-with-480sp-surely-to-come-in-ces-09.html

The reason for it is obvious: they wanted to outperform the 4870X2 at a competitive price point. No sense in releasing a card that has the same performance as 2x GTX 260 216s, since 2x GTX 260 216 is more or less equivalent (on average) to a 4870X2.


----------



## DarkMatter (Dec 10, 2008)

Bjorn_Of_Iceland said:


> Doubt it. It's a 55nm 260 with a 448-bit memory interface on each, and a full *240* shader units on each core. Kinda like two beefed-up 260s.
> 
> Source:
> http://www.firingsquad.com/news/newsarticle.asp?searchid=21080



Interesting! Yeah, definitely interesting. It makes me happy when companies decide to do the things I think are the best option, and it's common sense TBH; it's not that I have bright ideas. You definitely don't need a 1024-bit interface, even in a 2x512-bit config, nor do you need 64 ROPs churning out pixels like mad, and this way you have much less routing on the PCBs. In my view the cards should have 2x384 bits and 1.5 GB of memory, enough IMO for stellar performance on a cheaper config.



> The reason for it is obvious: they wanted to outperform the 4870X2 at a competitive price point. No sense in releasing a card that has the same performance as 2x GTX 260 216s, since 2x GTX 260 216 is more or less equivalent (on average) to a 4870X2.



EH? Last time I saw a comparison, 2x260 were significantly faster than an X2, by a fair amount in fact, even prior to the 180.xx drivers that improved SLI a lot. 240 shaders can only improve upon that, IF the hypothetical 216SP 295 was to perform like 2x260, of course.

IMO if that ends up being true, they did it just because they have learned, halfway through, that they simply can. Maybe they had many chips with faulty ROPs but intact SP cores (considering that they moved to the GTX260 216 it makes sense*) that had to go to GTX260 cards, but now they decided this. They will never again try to only slightly beat the ATI cards. IMO they learned the lesson and won't make the same mistakes, at least for a while...

* Explanation for the uninitiated: GT200 has 8 raster processor partitions with 4 ROPs and a 64-bit memory controller each, for a total of 32 ROPs and 512 bits; it also has 10 SP/TMU clusters with 24 SP each for a total of 240. In order to improve yields, if one of the clusters fails they use that chip for the lesser card, the GTX260. Right now Nvidia has to sell as a GTX260 216 ANY chip that has either a defective ROP partition or a defective SP cluster, but not necessarily both at the same time. With this move they can use the ones with only 1 faulty raster partition, which by pure statistics can be more numerous than the ones with 1 SP cluster disabled. WIN WIN.
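That salvage logic can be sketched as a toy model in Python. The unit counts are the ones from the explanation above; the binning thresholds and product labels are my own simplification (a real GTX 260 also ships with a ROP partition disabled for its 448-bit bus, which this sketch ignores):

```python
# Toy model of the GT200 salvage binning described above.
# Unit counts are from the post; the binning rule itself is speculation.

SP_CLUSTERS, SP_PER_CLUSTER = 10, 24   # 10 x 24 = 240 shaders on a full die
ROP_PARTS, ROPS_PER_PART = 8, 4        # 8 x 4 = 32 ROPs, 8 x 64-bit = 512-bit

def bin_chip(bad_sp_clusters: int, bad_rop_parts: int):
    """Return (product, shaders, rops, bus_bits) for a die with defects."""
    sp = (SP_CLUSTERS - bad_sp_clusters) * SP_PER_CLUSTER
    rops = (ROP_PARTS - bad_rop_parts) * ROPS_PER_PART
    bus = (ROP_PARTS - bad_rop_parts) * 64
    if sp == 240 and rops == 32:
        return ("GTX 280", sp, rops, bus)                # fully intact die
    if sp == 240 and rops == 28:
        return ("GTX 295 (speculated)", sp, rops, bus)   # shaders intact, 1 bad ROP partition
    if sp >= 216 and rops >= 28:
        return ("GTX 260 Core 216", sp, rops, bus)       # 1 bad SP cluster tolerated
    return ("lower bin", sp, rops, bus)

print(bin_chip(0, 0))   # ('GTX 280', 240, 32, 512)
print(bin_chip(0, 1))   # ('GTX 295 (speculated)', 240, 28, 448)
print(bin_chip(1, 0))   # ('GTX 260 Core 216', 216, 32, 512)
```

Under this rule, dies with one bad ROP partition but all 240 shaders intact, which previously had to be sold as GTX 260s, can now go into the GTX 295 bin instead.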


----------



## Bjorn_Of_Iceland (Dec 10, 2008)

Kinda like that tortoise and hare race, eh? hehe.



> EH? Last time I saw a comparison, 2x260 were significantly faster than an X2, by a fair amount in fact, even prior to the 180.xx drivers that improved SLI a lot. 240 shaders can only improve upon that, IF the hypothetical 216SP 295 was to perform like 2x260, of course.


plus minus actually.. some games lean more on the 4870x2 and some on the GTX260 SLI.


----------



## human_error (Dec 10, 2008)

Is it me, or does the picture look like it's 3 slots wide? 
(There is enough space between the DisplayPort and the dual DVI ports for a single-slot-sized grill for heat output.)

If so we'll be needing new BFG ultra full towers FTW for the new GTXATX motherboard size spec to be able to run SLI


----------



## Binge (Dec 10, 2008)

DarkMatter said:


> I wouldn't draw any conclusions based on that. Different architectures, and last I checked GTX260 SLI is slightly faster than 2xHD4870, while a single GTX260 is slower on its own. It's too soon to conclude anything IMO.



Tell that to dark2099 



Bjorn_Of_Iceland said:


> Doubt it. It's a 55nm 260 with a 448-bit memory interface on each, and a full *240* shader units on each core. Kinda like two beefed-up 260s.
> 
> Source:
> http://en.expreview.com/2008/12/09/geforce-gtx295-with-480sp-surely-to-come-in-ces-09.html
> ...



And then why is it that Tatty's soft-volt-modded 260 owns any GTX280 under the sun? The GTX200 series is more or less bogged down by shader cores... it never needed more than, say, 192 or 216.


----------



## DarkMatter (Dec 10, 2008)

Binge said:


> Tell that to dark2099



WoOt? I don't get it. 



> And then why is it that Tatty's soft-volt-modded 260 owns any GTX280 under the sun? The GTX200 series is more or less bogged down by shader cores... it never needed more than, say, 192 or 216.



You can't dismiss the fact that GTX260 216 is faster than GTX260 192...

And almost by the same percentage as the SP increase, which is telling.


----------



## lemonadesoda (Dec 10, 2008)

SHOCK HORROR

This is a twin-PCB GPU in one slot. It's practically SLI pre-prepared for one socket.

They DIDN'T MANAGE to get two 206b GPUs onto one PCB. Shame.


----------



## DarkMatter (Dec 10, 2008)

lemonadesoda said:


> SHOCK HORROR
> 
> This is a twin-PCB GPU in one slot. It's practically SLI pre-prepared for one socket.
> 
> They DIDN'T MANAGE to get two 206b GPUs onto one PCB. Shame.



Oh my god. Are we going to return to the same debate once again??


----------



## HossHuge (Dec 10, 2008)

I hope it does beat a 4870x2 so maybe they will drop the price!!


----------



## cdawall (Dec 10, 2008)

Look at this picture: is that a 3rd PCB? And I found the quad SLI connector; they just don't have a plug for it on the PCB yet.


----------



## DarkMatter (Dec 10, 2008)

cdawall said:


> Look at this picture: is that a 3rd PCB? And I found the quad SLI connector; they just don't have a plug for it on the PCB yet.



Good call. Although I think it's just a little piece, not an entire PCB. IMO it's a dual-slot card; it might be the perspective, but I don't see enough space between the PCBs for it to be 3-slot, and I certainly don't see anything between them.

And no offense, but I'd think everybody had seen the connector already.


----------



## cdawall (Dec 10, 2008)

DarkMatter said:


> Good call. Although I think it's just a little piece, not an entire PCB. IMO it's a dual-slot card; it might be the perspective, but I don't see enough space between the PCBs for it to be 3-slot, and I certainly don't see anything between them.
> 
> And no offense, but I'd think everybody had seen the connector already.



I would think so too, but check towards the top: someone asks about SLI.


----------



## zOaib (Dec 10, 2008)

more sandwich, wooot..... jesus christ!!!!!


----------



## phanbuey (Dec 10, 2008)

This card will rock... the 4870X2 will get beat, it's just a question of by how much, and that depends on clock speed. Don't get your panties in a wad, that's just my speculation; hopefully someone in the Far East leaks a review soon.


----------



## lemonadesoda (Dec 10, 2008)

DarkMatter said:


> Oh my god. Are we going to return to the same debate once again??


Where's the debate? Did you declare it over? Hadn't realised you had been promoted to the discussion police.

This 295 monster isn't an evolutionary feat. It's a Frankenstein monster. By its design it's clear it will suffer from heat problems (trapped between the PCBs), limiting aftermarket coolers and OCability.

It's also going to be power hungry.

IMO, two GTX260s (55nm refresh) in SLI are a superior product combination. They will allow greater flexibility, and the heat distribution will probably allow for higher OCs and the use of larger, quieter coolers.

woot! this is the fastest card? it will rock? meh. This is mutton dressed as lamb


----------



## phanbuey (Dec 10, 2008)

Binge said:


> Tell that to dark2099
> 
> 
> 
> And then why is it that Tatty's soft-volt-modded 260 owns any GTX280 under the sun? The GTX200 series is more or less bogged down by shader cores... it never needed more than, say, 192 or 216.



Tatty's super-high-clocked 260 owned any 280 *at stock*... and the reason for that is that it had 25% fewer shaders but ran at 30% higher clocks on the same architecture. Once a 280 is overclocked to 700+ core and 1400+ shader, it would beat that 260, no question. 240 shaders is not even enough IMO; they wanted to add more but couldn't because of fab limitations. I like the twin-PCB idea; it worked for the 9800GX2 and it will work for this too...


----------



## TRIPTEX_CAN (Dec 10, 2008)

lemonadesoda said:


> This 295 monster isn't an evolutionary feat. It's a Frankenstein monster. By its design it's clear it will suffer from heat problems (trapped between the PCBs), limiting aftermarket coolers and OCability.
> woot! this is the fastest card? it will rock? meh. This is mutton dressed as lamb



I agree the design of this card looks dated, and it will have some issues aside from being wildly expensive to produce. Nvidia has the capital to invest in some better designs; rushing to take back the performance "crown" wouldn't be my product strategy, but this card will be fast and it will have a colorful box.


----------



## phanbuey (Dec 10, 2008)

lol... I love how people are like "it looks bad"... who cares if the cooler is stapled onto the cards and the whole assembly uses duct tape to hold it together? Bring on the benchmarks... On the other hand, it is a pretty ugly card... they'll probably put plastic around it to cover it up a bit.


----------



## TRIPTEX_CAN (Dec 10, 2008)

phanbuey said:


> lol... I love how people are like "it looks bad"... who cares if the cooler is stapled onto the cards and the whole assembly uses duct tape to hold it together? Bring on the benchmarks... On the other hand, it is a pretty ugly card... they'll probably put plastic around it to cover it up a bit.



Um I care if it's duct taped together. This card will cost close to $600-700 when released. I can staple a few 260s together and charge you double if you want.


----------



## phanbuey (Dec 10, 2008)

TRIPTEX_MTL said:


> Um I care if it's duct taped together. This card will cost close to $600-700 when released. I can staple a few 260s together and charge you double if you want.



1. If you could, you would have already... and you would have made tons of money.
2. You should put some artwork in front of your computer.

Point is: you're ultimately paying for performance, not looks; otherwise I have a really nifty-looking X1550 in my other rig...

If the only complaint people can come up with is "it looks bad" then Nvidia has a successful product.


----------



## TRIPTEX_CAN (Dec 10, 2008)

phanbuey said:


> 1. If you could, you would have already... and you would have made tons of money.
> 2. You should put some artwork in front of your computer.
> 
> Point is: you're ultimately paying for performance, not looks; otherwise I have a really nifty-looking X1550 in my other rig...
> ...



It's not the aesthetics that bother me with the Nvidia GX2 cards. I said it looks dated; I didn't say it was ugly. TBH they look fine (once the fan shroud is assembled). If the card doesn't suffer from heat issues and isn't ridiculously expensive to manufacture, it will be a winner.


----------



## mdm-adph (Dec 10, 2008)

phanbuey said:


> lol... I love how people are like "it looks bad"... who cares if the cooler is stapled onto the cards and the whole assembly uses duct tape to hold it together? Bring on the benchmarks... On the other hand, it is a pretty ugly card... they'll probably put plastic around it to cover it up a bit.



I would. If it looks like crap, that means it was hastily put together. And things that are hastily put together usually have high failure rates.

Now, it's neat to see Nvidia compete for the top spot again, but if the card's crappily made, it's going to be a damn shame, because the G200 is a pretty good chip; it could even be one of the last of its kind.


----------



## kysg (Dec 10, 2008)

well at least the boxart will look nice XD


----------



## ShadowFold (Dec 10, 2008)

Won't these have weaker G200 cores? I heard they have lower shader counts. I still can't believe they didn't get all of this on one PCB though.


----------



## LiveOrDie (Dec 10, 2008)

I'm just waiting for the 350; dual-GPU cards are just too up and down in games. Put it this way: some games run SLI well and some run it badly, lol. I had a 9800GX2 and it was OK, but I got the 280 and all my games ran better.


----------



## TRIPTEX_CAN (Dec 10, 2008)

Live OR Die said:


> I'm just waiting for the 350; dual-GPU cards are just too up and down in games. Put it this way: some games run SLI well and some run it badly, lol. I had a 9800GX2 and it was OK, but I got the 280 and all my games ran better.



A single GPU will always win for consistency in FPS. It doesn't matter if it's Crossfire or SLI. Until someone manages to create a seamless dual-GPU design that is recognized by the system as a single core, there will always be games that suffer from having an extra card.


----------



## springs113 (Dec 10, 2008)

On a side note, the new Cat 8.12 is out.


----------



## TRIPTEX_CAN (Dec 10, 2008)

springs113 said:


> On a side note, the new Cat 8.12 is out.



Side note? That came out of left field.


----------



## Selene (Dec 10, 2008)

OK, I'm going to try to remain calm.
It's a sample; it's not what it will look like, so chill about the looks.
The GTX260 Core 216 already beats the 4870 512/1GB 1v1 and in SLI vs Crossfire, and the SLI Core 216 beat the 4870X2, so there is no reason to be saying the 55nm SLI-in-one-box is not going to beat the 4870X2.
The only way ATI can pull this out again is a major price cut on the 4870s and X2s, which means we all win again.
Yes, we all know there are some games where the 4870X2 beats SLI GTX280s, but it's a small % of games and overall the SLI GTX260s are better, so no reason to beat that dead horse.


----------



## TRIPTEX_CAN (Dec 10, 2008)

If you have to make an effort to remain calm, you might be taking this all too seriously. 

The card will most likely look just like that, but with a fan shroud covering it, just like the 9800GX2. When pics of the 9800GX2 were "leaked", people said it wouldn't look the way it was pictured either... and it did, just with a fan shroud. 

This thread really doesn't need to be another ATI vs Nvidia thread. We're discussing the projected release of a product.


----------



## Solaris17 (Dec 10, 2008)

Interesting that that is the only picture they managed to get.


----------



## mdm-adph (Dec 10, 2008)

Selene said:


> OK, I'm going to try to remain calm.
> It's a sample; it's not what it will look like, so chill about the looks.
> The GTX260 Core 216 already beats the 4870 512/1GB 1v1 and in SLI vs Crossfire, and the SLI Core 216 beat the 4870X2, so there is no reason to be saying the 55nm SLI-in-one-box is not going to beat the 4870X2.
> The only way ATI can pull this out again is a major price cut on the 4870s and X2s, which means we all win again.
> Yes, we all know there are some games where the 4870X2 beats SLI GTX280s, but it's a small % of games and overall the SLI GTX260s are better, so no reason to beat that dead horse.



I don't know about all of that... got any stats to back it up? Just saw this review yesterday, and it looks like the GTX260/216 beats the 4870 by only a slight margin overall, and only in some of the benchmarks, and I don't see how it beats it in SLI vs Crossfire, since everything I've seen shows that Crossfire is more efficient than SLI.


----------



## MrMilli (Dec 10, 2008)

- Two PCBs with a 448-bit memory interface (I guess a minimum of 10 layers each)
- Four chips (2x GT200 + display chip + PCIe bridge); with a 55nm GT200 around 470mm², it's still almost twice the size of an RV770.
- 1792MB of RAM
...

I wonder if nVidia is going to make any money on these things. The manufacturing costs are just really high. ATI's margins are high at the moment, so they can slash their prices anytime.
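For what it's worth, the 1792MB figure follows directly from the 448-bit bus: fourteen 32-bit memory chips per GPU. A quick back-of-envelope check (the chip density and memory clock are assumptions, GTX 260-like 512Mbit GDDR3 parts at ~999MHz):

```python
# Back-of-envelope memory math for a dual-GPU card with a 448-bit bus per GPU.
# Chip density (512 Mbit) and memory clock (999 MHz GDDR3) are assumptions.

bus_bits = 448
chip_io_bits = 32                          # each GDDR3 chip has a 32-bit interface
chip_mb = 64                               # 512 Mbit = 64 MB per chip

chips_per_gpu = bus_bits // chip_io_bits   # 14 chips
mb_per_gpu = chips_per_gpu * chip_mb       # 896 MB
total_mb = 2 * mb_per_gpu                  # 1792 MB across both GPUs

# GDDR3 transfers twice per clock, so 999 MHz -> 1998 MT/s:
bandwidth_gb_s = bus_bits / 8 * 1.998      # ~111.9 GB/s per GPU

print(chips_per_gpu, mb_per_gpu, total_mb, round(bandwidth_gb_s, 1))
```

So the odd-looking 1792MB is just 2 x 896MB, a direct consequence of cutting one 64-bit controller per GPU.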


----------



## Solaris17 (Dec 10, 2008)

MrMilli said:


> - Two PCBs with a 448-bit memory interface (I guess a minimum of 10 layers each)
> - Four chips (2x GT200 + display chip + PCIe bridge); with a 55nm GT200 around 470mm², it's still almost twice the size of an RV770.
> - 1792MB of RAM
> ...
> ...




I think they will; the cost of making it won't be too much, as they know anything else wouldn't be cost effective. No, I think everything is already in place. It's basically going to be the same design as the GX2; I can tell you that just looking at that pic and remembering what mine looks like taken apart. The only thing they are going to do is move a couple of capacitors to fit the bigger core.


----------



## Bjorn_Of_Iceland (Dec 10, 2008)

> Won't these have weaker G200 cores? I heard they have lower shader counts. I still can't believe they didn't get all of this on one PCB though.


It has more shader units than a GTX 260 216... so theoretically, it's faster than SLI'd GTX 260 216s.


----------



## TRIPTEX_CAN (Dec 10, 2008)

MrMilli said:


> - Two PCBs with a 448-bit memory interface (I guess a minimum of 10 layers each)
> - Four chips (2x GT200 + display chip + PCIe bridge); with a 55nm GT200 around 470mm², it's still almost twice the size of an RV770.
> - 1792MB of RAM
> ...
> ...



If they can deliver enough performance to clearly take the performance crown (15-20% in every game), then selling massive quantities at a smaller margin will work for them. I don't really see this being a cost-effective solution.


----------



## AsRock (Dec 10, 2008)

cdawall said:


> Look at this picture: is that a 3rd PCB? And I found the quad SLI connector; they just don't have a plug for it on the PCB yet.



Yeah, that's a 3rd PCB, but it's only about 1 inch long, lol.


----------



## ZilverPhish (Dec 10, 2008)

This will kill alongside X58 and i7


----------



## phanbuey (Dec 10, 2008)

mdm-adph said:


> since everything I've seen shows that Crossfire is more efficient than SLI.



That was then and this is now... everything Crossfire was more efficient before the G200 architecture; now it's actually the other way around. And that review is using newer drivers and a GTX 216... he was referring to the GTX 260 192; those were the ones (with old drivers) that would get stomped by a 4870 1GB and then catch right up and beat the 4870s in CF...


----------



## TRIPTEX_CAN (Dec 10, 2008)

phanbuey said:


> That was then and this is now... everything Crossfire was more efficient before the G200 architecture; now it's actually the other way around. And that review is using newer drivers and a GTX 216... he was referring to the GTX 260 192; those were the ones (with old drivers) that would get stomped by a 4870 1GB and then catch right up and beat the 4870s in CF...



Also, that was then.... lol

If the claims ATI has made in the release notes for their latest drivers hold true, the lead the GTX 260 216 has just got much smaller, if not removed. This is all speculation, but they are claiming up to a 57% increase in FC2 on Crossfire systems, among other things.


----------



## SystemViper (Dec 10, 2008)

cdawall said:


> Look at this picture: is that a 3rd PCB? And I found the quad SLI connector; they just don't have a plug for it on the PCB yet.



If that is anything like the 9800GX2, there are 2 of those flat ribbon cables that connect both boards, so it looks like someone just forgot to connect that 2nd cable to the PCB.


----------



## MrMilli (Dec 10, 2008)

Solaris17 said:


> I think they will; the cost of making it won't be too much, as they know anything else wouldn't be cost effective. No, I think everything is already in place. It's basically going to be the same design as the GX2; I can tell you that just looking at that pic and remembering what mine looks like taken apart. The only thing they are going to do is move a couple of capacitors to fit the bigger core.



This thing will be twice as expensive to produce compared to a GX2, but they will sell it at the same price, I guess. nVidia knows it won't be cost effective, but they just want to be the fastest again.


----------



## cdawall (Dec 10, 2008)

SystemViper said:


> If that is anything like the 9800GX2, there are 2 of those flat ribbon cables that connect both boards, so it looks like someone just forgot to connect that 2nd cable to the PCB.



That's what I thought, but looking at their design there is no spot for it.


----------



## DarkMatter (Dec 10, 2008)

lemonadesoda said:


> Where's the debate? Did you declare it over? Hadn't realised you had been promoted to the discussion police.
> 
> This 295 monster isn't an evolutionary feat. It's a Frankenstein monster. By its design it's clear it will suffer from heat problems (trapped between the PCBs), limiting aftermarket coolers and OCability.
> 
> ...



Someone has to fight the increasing load of BS in the forums. Like everything in your post. BS. The same BS that was said again and again about the 9800GX2. The dual-PCB solution is BY FAR a better solution than the one present in the X2 in terms of cooling. More expensive, but for what a cooling solution costs ($15 cheap, $25 expensive) that means a card sells for $510 instead of $500; woohoo, big deal! FAR BETTER. Not only is it better in theory, but the actual cooling performance of the GX2 compared to the X2 proved it to be much better. 

Trapped between PCBs... Yeah, because the chips in the X2 aren't trapped between the PCB and the plastic protector that sits at exactly the same distance, right?? What is worse, in the X2 the second GPU receives hot and dirty* air, and because of that it's much, much hotter than the first one. With two PCBs both GPUs receive fresh and clean air and are cooled much better. The fan keeps moving the air, so it doesn't matter how hot the air gets just because there are 2 GPUs in the same space; the pace at which a fan moves the air is not related to how hot the air is in one spot.

* I don't know the exact term. Dirty air = convoluted, turbulent air. It is the worst enemy of air cooling; that's why cable management is so important.

The 9800GX2 overclocked wonderfully, better than the X2s in fact. Assuming the GTX295 will be hot and won't overclock well because it follows the same foundations as the GX2, when the GX2 was cooler and OC'd better than ATI's X2 cards, is stupid.

Power consumption: the GTX260 consumes much less than the HD4870, so why in hell would a 55nm GT200 consume more than the X2? It simply won't, and that is the card you have to compare it to. It will be power hungry of course, but considering what it will compete against, even mentioning the fact means you're trying to say it will consume more, which is FALSE, or in the best case for your defense, UNKNOWN.


----------



## SystemViper (Dec 10, 2008)

I loved my 9800GX2; that thing cranked over 20k in 3DMark06 without much coaxing.


----------



## phanbuey (Dec 10, 2008)

TRIPTEX_MTL said:


> Also, that was then.... lol
> 
> If the claims ATI has made in the release notes for their latest drivers hold true, the lead the GTX 260 216 has just got much smaller, if not removed. This is all speculation, but they are claiming up to a 57% increase in FC2 on Crossfire systems, among other things.



It was... it was then? I don't doubt it... I think a lot of Crossfire issues are caused by lag in AMD's driver department (which is located in the basement of the building, and their boss only goes down there to steal their red stapler)


----------



## Binge (Dec 10, 2008)

DarkMatter said:


> Someone has to fight the increasing load of BS in the forums. Like everything in your post. BS. The same BS that was said again and again about the 9800GX2. The dual-PCB solution is BY FAR a better solution than the one present in the X2 in terms of cooling. More expensive, but for what a cooling solution costs ($15 cheap, $25 expensive) that means a card sells for $510 instead of $500; woohoo, big deal! FAR BETTER. Not only is it better in theory, but the actual cooling performance of the GX2 compared to the X2 proved it to be much better.
> 
> Trapped between PCBs... Yeah, because the chips in the X2 aren't trapped between the PCB and the plastic protector that sits at exactly the same distance, right?? What is worse, in the X2 the second GPU receives hot and dirty* air, and because of that it's much, much hotter than the first one. With two PCBs both GPUs receive fresh and clean air and are cooled much better. The fan keeps moving the air, so it doesn't matter how hot the air gets just because there are 2 GPUs in the same space; the pace at which a fan moves the air is not related to how hot the air is in one spot.
> 
> ...



1. You've owned a 4870X2?  Mine, ON AIR with the STOCK COOLER, got better temps than my GTX260.  
2. Whoever said this won't overclock is a moron, because nVidia will cripple the stock speeds to allow a massive overhead.  It makes them look better.
3. The GTX260 does not consume less power than a correctly BIOS-modded 4870.
4. This is all speculation, and so is your retort.
5. Nobody interested in benchmarking/performance cares about the environment.
6. The thing that got nVidia into trouble with their sales was sticking to old designs and not pushing into the new era.  Why not really turn the tables and have a cooler on both sides of the card?  Wait, that'd kill slot SLI... lame.  Still, at least I'm THINKING.
7. A lot of what you've said about the 260 implies you have actual knowledge of the subject.  Have you owned one?
8. Don't point fingers and call BS when I see a ton of flaws in a number of your statements.
9. I'm going 100% behind the statement "Don't knock it until you've tried it," and I honestly can't see dual-card solutions beating SLI/Crossfire these days, when the synthetic and real-world tests prove that having two single cards with a bridge clip is less energy efficient but produces better results.  Go ahead, ask Fitseries, dark2099, and a bunch of other people here.


----------






## DarkMatter (Dec 11, 2008)

Binge said:


> 1. You've owned a 4870X2?  Mine got better temps than my GTX260 ON AIR, STOCK COOLER.
> 2. Whoever said this won't overclock is a moron, because nVidia will cripple the stock speeds to allow a massive overhead.  It makes them look better.
> 3. The GTX260 does not consume less power than a correctly BIOS-modded 4870.
> 4. This is all speculation, and so is your retort.
> ...



1- Well, that goes against nature; every benchmark out there proves the contrary. Ah, only on one GPU, as I said, but that's enough to cripple OCability.
3- Irrelevant. It's stock that matters; very few people will deal with BIOSes. Anyway, if it's so easy, why hasn't Ati solved that in ANY of their newer cards????
5- Graphics cards are for gaming. I don't care what people like doing with them. I'm very interested in performance, but I'm very concerned about the environment too, AND especially the price. 100W more on a card means $100 more per year of operation.
6- BS. Unless you are strictly talking about them sticking to 65nm.
7- I don't have to own every single card on the market. I have friends; they even live in the same city and all!! Incredible!! One of them owns a small store 50 m away from my home. He doesn't always have a spare card he doesn't have to sell, but many times he lets me play with some of the builds he has to assemble.
Whenever I can't test things myself or ask friends that I can see and touch (not by phone, email, forums, etc.), I rely on reviews and benchmarks, mostly those of Wizzard. With the increased load of BS in forums in general, I don't believe anything a forum member says unless he has something to back the info up; I don't care who he is. If it's not someone with some kind of legal or public responsibility (like reviewers, who are exposed), I don't care what he has to say.
8- Be specific.
9- The GX2 was faster than 9800GTX SLI considering its clocks are 30% lower (EDIT: sorry, 23%; I compared it to my brother's OC'd 9800). Faster clock for clock, that is. I don't have to ask to know SLI/Crossfire is usually faster than dual cards; I know. I didn't say the contrary.
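The $100-per-year claim in point 5 only works out under a worst-case assumption of 24/7 full load; here is a quick sanity check of the arithmetic (the electricity rate of $0.115/kWh is an assumed value, not from the post):

```python
# Rough yearly cost of an extra 100 W of draw, assuming the card runs
# at full load 24 hours a day, 365 days a year, at an assumed
# electricity price of $0.115/kWh.
extra_watts = 100
hours_per_year = 24 * 365                          # 8760 h
rate_usd_per_kwh = 0.115                           # assumed rate

extra_kwh = extra_watts * hours_per_year / 1000    # 876 kWh/year
yearly_cost = extra_kwh * rate_usd_per_kwh         # about $100/year

print(f"{extra_kwh:.0f} kWh/year -> ${yearly_cost:.2f}/year")
```

At a few hours of gaming per day instead of 24/7, the cost scales down proportionally, so the figure is an upper bound rather than a typical bill.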


----------



## Binge (Dec 11, 2008)

1. On review sites they do not tweak the fan controls of either card, making them both suck.
3. I don't care about what the average person does with their card because -I- have the ability to search on Google and find a number of like-minded people who are willing to put 20 minutes into solving a problem before sitting down and spanking off to Crysis.
5. See 3.
6. You're still living in the past along with nVidia's process...  I'm talking about 65nm, using an IHS to keep noobs from crushing their GPUs, and their design.
7. Running a few tests with cards you're borrowing and lacking the real time to play around with to maximize performance is a really silly way to respond to me when I've stated that I believe in thorough experience with all of these new cards.
8. Give me a break... I recognized and responded to your whole statement in a point by point analysis/breakdown.  I was very specific.
9. 9800s may not SLI as well as the GTX260; in fact I'm pretty sure they don't.


----------



## DarkMatter (Dec 11, 2008)

Binge said:


> 1. On review sites they do not tweak the fan controls of either card, making them both suck.
> 3. I don't care about what the average person does with their card because -I- have the ability to search on Google and find a number of like-minded people who are willing to put 20 minutes into solving a problem before sitting down and spanking off to Crysis.
> 5. See 3.
> 6. You're still living in the past along with nVidia's process...  I'm talking about 65nm, using a IHS to keep noobs from crushing their GPUs, and their design.
> ...



I find this way of posting funny, haha:

1. 3. 5. When speaking about how a card IS, how it ACTUALLY is for 95% of people is what matters. I don't care if you managed to make your Honda Civic faster than that other guy's Ferrari. The Ferrari IS faster.

6. As I said, 65nm yes. Everything else, BS. Are you a GPU engineer? How do you know those things are REALLY of the past? You know, we still use wheels, because it's still the best solution. CHANGE != evolution. e.g. RV670's ringbus was the OMFGL33T revolution, until copying Nvidia in RV770 became the revolution now. Funny.

7. Running a few tests (4-6 hours with each piece of hardware, full review runs, 3DM06, Crysis, COD4, Bioshock, UT3, TF2...) that bring similar results to the ones present in reviews, yes, I think that's enough to get an idea. I don't do those tests when the cards are brand new, so I already use the things that worked best for others. I only do one run per bench, just to check, rather than working everything out for myself. It's always been consistent with reviews until now, so I think I will continue believing in my "method", thank you very much. If something needs more than 5 hours to fix or find a solution for, see point 1.

9. AND because of that the GTX295 can be much faster too. The point is, who knows? It's not me saying the card WILL perform; I'm all the time speculating it CAN perform, in response to the guy who assured us it won't. There's a VERY BIG difference. And it's you debating it won't, again.


----------



## Binge (Dec 11, 2008)

DarkMatter said:


> I find it funny this way of posting, haha:
> 
> 1. 3. 5. When speaking about how a card IS, ACTUALLY HOW it is for 95% of the people is what matters. I don't care if you managed to make your Honda Civic faster than that other guy's Ferrari. Ferrari IS faster.
> 
> ...



1 & 7. Just to be fair, I'm comparing cars like Mustangs to Ferraris here... not an everyday four-banger to a high-end horsepower machine.  Be fair.  In terms of graphics, both are high-end cards.  That is a poor analogy, as I don't need to gut my graphics card to make it run faster.  Please be fair to the enthusiast here as well.  Some of us live, breathe, and eat tech for breakfast, and if it came completely bundled with all the answers and no mystery then we would be out of a hobby.  Just because you're unwilling to make something work doesn't mean someone else won't, and do it better than your lazy solution.

6. My father has worked for ATi.  I go to him with a lot of questions and random info, and we both love to look at new tech.  That aside, I don't need to be an engineer to know that the IHS on the GTX200 series of cards is causing ALL of the heat issues.  If they didn't have it then they would be pwning ATi's little arse in terms of heat.  Their design is last generation~  The difference between the core architecture of the RV6xx and RV7xx is insane.

9. I addressed that in point 4 a few posts back.  This is all speculation, but current history is leading me to believe that the power/architecture restrictions on dual cards are keeping them from reaching as high a potential as two singular cards in SLI/Crossfire.


----------



## DarkMatter (Dec 11, 2008)

Binge said:


> 1 & 7. Just to be fair I'm comparing cars like Mustangs to Ferraris here...  Not a every day 4 banger to a high end horsepower machine.  Be fair.  In terms of graphics, both are high end cards.  That is a poor analogy as I don't need to gut my graphics card to make it run faster.  Please be fair to the enthusiast here as well.  Some of us live, breathe, and eat tech for breakfast and if it came completely bundled with all the answers and no mystery then we would be out of a hobby.  Just because you're unwilling to make something work doesn't mean someone will and do it better than your lazy solution.
> 
> 6. My father has worked for ATi.  I go to him with a lot of questions and random info and we both love to look at new tech.  That aside I don't need to be an engineer to know that the IHS on the GTX200 series of cards is causing ALL of the heat issues.  If they didn't have it then they would be pwning ATi's little arse in terms of heat.  Their design is last generation~  The difference between core architecture of the RV6xx and RV7xx is insane.
> 
> 9. I addressed that in point 4 a few posts back.  This is all speculation, but current history is leading me to believe that the power/architecture restrictions on dual cards is keeping them from reaching as high of a potential as two singular cards in SLI/Crossfire.



1. I just wanted to be sure the point was caught. But I again emphasize that it's important, if you really want to be fair, to say things as they are and as they will be for most people (you can always add "but when..." after that). Enthusiasts don't need to be told what is faster than what and when; they learn it themselves. Newbies who enter these kinds of forums every time they need to buy their next card with their hard-earned money (or 18 months of mommy's pay) do. For every enthusiast who posts here, 100 visitors come in and read what we post. Those guys won't mod, probably won't even OC, so saying xxxx is better based on anything but what they can buy in stores is misleading, and because they know nothing they will follow what's being said here like a Bible. You want to be fair? BE fair then.

6. I studied 2 years of engineering; that doesn't make me an expert, but I do read a lot too. So I guess it just boils down to who has the bigger penis? There's no need IMHO, when I'm always saying that I'm not sure of anything; as nothing is certain, I think this point of view is the right one. On the other hand it is you and lemonadesoda saying HOW things ARE GOING to happen. Unless you have a crystal ball you should not have replied in the first place, because I just pointed out that we don't know anything.

On the other hand, given that your father worked for Ati, it's no surprise that you think what you think. It wouldn't matter if he was the lead engineer; you'd only have been told half the story. That's where the key is. Ati believes in one way of doing things, Nvidia in the completely opposite one. Both have the best experts in the world. So the ABSOLUTE truth is that Nvidia is wrong? Again, you were only told half the story, or better said, you just decided to believe half the story. I read both and I believe in both.

About the IHS: as you said, it's there so noobs don't break the card, more than anything, IMO. And it's not something to joke about; many friends broke their chips when changing the cooler. What are you going to say? That they were noobs and shouldn't have touched anything? They were noobs in fact, but they did take a lot of care. If you can't see the value an IHS can have for the masses, then your point 1 doesn't make sense at all; you are reducing even more the installed base of people who would run something other than stock. Oh, and what heat issues????


----------



## PCpraiser100 (Dec 11, 2008)

OMG its the 7900GX2's successor!!!!!! ROFL!!!


----------



## SteelSix (Dec 11, 2008)

AsRock said:


> Yeah thats a 3rd PCB but it's only about 1 inch long lol.



Looks like aluminium to me, part of the cooler..


----------



## Binge (Dec 11, 2008)

DarkMatter said:


> 1. I just wanted to be sure the point was caught. But I again emphasize that it's important, if you really want to be fair, to say things as they are and as they will be for most people (you can always add "but when..." after that). Enthusiasts don't need to be told what is faster than what and when; they learn it themselves. Newbies who enter these kinds of forums every time they need to buy their next card with their hard-earned money (or 18 months of mommy's pay) do. For every enthusiast who posts here, 100 visitors come in and read what we post. Those guys won't mod, probably won't even OC, so saying xxxx is better based on anything but what they can buy in stores is misleading, and because they know nothing they will follow what's being said here like a Bible. You want to be fair? BE fair then.
> 
> 6. I studied 2 years of engineering; that doesn't make me an expert, but I do read a lot too. So I guess it just boils down to who has the bigger penis? There's no need IMHO, when I'm always saying that I'm not sure of anything; as nothing is certain, I think this point of view is the right one. On the other hand it is you and lemonadesoda saying HOW things ARE GOING to happen. Unless you have a crystal ball you should not have replied in the first place, because I just pointed out that we don't know anything.
> 
> ...



Whoa whoa whoa... My father has worked for a bunch of companies, and don't get the wrong impression.  He worked with them back when the Rage 128 was being put together.  I'm using that as a reference to sources I have for information.  If you're saying you want a completely fair comparison, then a 4870 will wax and stomp the floor with 260s on the market that are running stock speeds.  Give the general population of gamers a bit more respect.  Not all of them, myself included, suck on mom's teat and buy a card only to learn nothing about it.

What heat issues???  Just Google "GTX280 IHS".  There you'll find people in strange discussions talking about how their GTX280 is overheating to 113C because of incorrect contact with the IHS causing an insulating effect on the GPU.  ATi does not have an IHS on their new chips and they are sold to the masses.  Get a grip...  I'm sorry your friends broke their chips.  There are unfortunate accidents every day, but that's no reason to put another thermal barrier between the die and the cooler.


----------



## Bjorn_Of_Iceland (Dec 11, 2008)

SteelSix said:


> Looks like aluminium to me, part of the cooler..


Yep, it's part of the cooler. Check the other side.


----------



## DarkMatter (Dec 11, 2008)

Binge said:


> Whoa whoa whoa... My father has worked for a bunch of companies, and don't get the wrong impression.  He worked with them back when the Rage 128 was being put together.  I'm using that as a reference to sources I have for information.  If you're saying you want a completely fair comparison, then a 4870 will wax and stomp the floor with 260s on the market that are running stock speeds.  Give the general population of gamers a bit more respect.  Not all of them, myself included, suck on mom's teat and buy a card only to learn nothing about it.
> 
> What heat issues???  Just Google "GTX280 IHS".  There you'll find people in strange discussions talking about how their GTX280 is overheating to 113C because of incorrect contact with the IHS causing an insulating effect on the GPU.  ATi does not have an IHS on their new chips and they are sold to the masses.  Get a grip...  I'm sorry your friends broke their chips.  There are unfortunate accidents every day, but that's no reason to put another thermal barrier between the die and the cooler.



Ha! Don't try to fool anyone; you won't. Search for HD4870 or HD4850 overheating issues, because there are many more of those. So why do they have the problems? There's no IHS there.  LOL. What a valid point you made, my friend. Anyway, which performance card is free of some overheating samples nowadays?

I didn't understand the first paragraph well; are you saying the HD4870 stomps the floor with GTX260s??  I wouldn't call a 2% difference a stomp... http://www.techpowerup.com/reviews/Leadtek/GeForce_GTX_260_Extreme_Plus/26.html

DON'T look at the Leadtek card and come with the typical claim that it's overclocked, etc. Look at the OTHER GTX260, and at higher resolutions if you want to see the HD4870 ahead. Aaaand... that's right, 2%, woohoo, Ati OWNS. WOOOAAaaaaHHhhHH!!

And the thing is that noobs can buy factory-overclocked Nvidia cards for the same price as those with stock clocks (that Leadtek, for example). Same with Ati, don't get me wrong, but the usual OCs on Nvidia cards are 15%, 20%, etc., while Ati cards rarely surpass the 10% mark.

Man, you just lost my respect. We were having a good discussion, but you ruined it for me with that one. Stomp. :shadedshu

Anyway, a little more focused on the topic:

http://www.guru3d.com/article/core-i7-multigpu-sli-crossfire-game-performance-review/6

I was curious about how the GTX260 and HD4870 did in dual-card setups, so I made a table in Excel with those results for a better/easier comparison. It will surprise you and many others, I'm sure. 

In short, I was right. It's 19% faster overall in the newest games, and up to 57% eek at 2560x1600. I'm sure 8.12 improves performance a bit, but until I see facts, that's what matters to me. Especially with Ati, which always "improves performance in newer driver releases", but I've been seeing driver comparisons throughout the year on many sites that proved that false as an overall improvement.
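The kind of Excel comparison described above boils down to a simple speedup ratio per setup. A minimal sketch of that calculation; the FPS values here are hypothetical placeholders, not the actual Guru3D numbers:

```python
# Compute multi-GPU scaling from single-card vs dual-card average FPS,
# as in the comparison table described in the post above.
def scaling(single_fps: float, dual_fps: float) -> float:
    """Return the dual-GPU speedup as a percentage over one card."""
    return (dual_fps / single_fps - 1.0) * 100.0

# setup -> (single-card FPS, dual-card FPS) at one resolution.
# These numbers are made up purely to illustrate the arithmetic.
results = {
    "GTX 260 SLI": (50.0, 88.0),
    "HD 4870 CF":  (52.0, 80.0),
}

for name, (one, two) in results.items():
    print(f"{name}: +{scaling(one, two):.0f}% over a single card")
```

Comparing the two percentages setup-by-setup, game-by-game, is exactly what the "19% overall / 57% at 2560x1600" summary condenses.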


----------



## Binge (Dec 11, 2008)

Alright.  You've hit a magic button and I'm actually hurt.  You've taken the point and twisted it so much you're even agreeing with me on a number of points!  Why are you attacking me in a way that makes you SEEM right in a non-existing argument?

I've never heard of anyone on these forums (not the schmucks you seem to cling to as your comparison for gamers/people who would use these cards) having an ATI card overheat on them at stock.  Hell!  One time, while I had my water cooling setup on my graphics cards (4870 in Crossfire), the motor in my pump melted and the cards hit TJMax and shut my PC down, but it never killed the cards.  After all was said and done I reattached the air cooling and they ran perfectly until I could RMA the pump.  No failure.  Why attack me about ATi cards overheating when you KNOW it's a fan speed fix away from good operating temperatures?

What I've been saying the WHOLE TIME has been that scaling between crossfire and SLI has become so good that single PCI-E slot solutions with dual GPU are less powerful than two single cards on their own.  You won't focus on the point I'm making.

While I am dispelling some of the crap you're spewing out about these cards without ever tweaking and maximizing a card for yourself, I am not here to argue about reviews!  I couldn't care less, because every review I've come across has been different.  Among other reviews I've seen, Guru3D.com has a ton of good reviews and sheets showing which card is the biggest and baddest.  The 4870 512s compare neck and neck with a GTX260 192, but the GTX260s always need a bit of an increase, as their stock clocks are 576/1242/2000.  You would change the values and you'd change the fan speeds to get the most out of your card, so stop being a hypocrite.  If someone can crank the crap out of two single cards and get higher numbers in frames or benches, that makes the dual-card solution appealing only for restrictive PCI-E configurations.  Give up the attitude and face facts.  You're showing that SLI/Crossfire is improving because the motherboard is improving, and I'm seeing through results with forum members that it's gone beyond the bridge chip!

Do me a huge favor and stick to the point.  I believe the GTX295 will not hold up against 2x GTX260 216 in clocks/temps/results.


----------



## Solaris17 (Dec 11, 2008)

Easy, boys... facts are facts and you can dispute those freely, but all of this sarcasm and slighting is getting a little too edgy, and as such you're both getting closer to severing your necks... so let's try to keep an amazing discussion a little more civil, kk?


----------



## DarkMatter (Dec 11, 2008)

Binge said:


> Alright.  You've hit a magic button and I'm actually hurt.  You've taken the point and twisted it so much you're even agreeing with me on a number of points!  Why are you attacking me in a way that makes you SEEM right in a non-existing argument?
> 
> I've never heard of anyone on these forums (not the schmucks you seem to cling to as your comparison for gamers/people who would use these cards) have an ATI card overheat on them at stock.  Hell!  One time while I had my water cooling setup on my graphics cards (4870 in crossfire) the motor in my pump melted and the cards hit TJMax and shut my PC down, but it never killed the cards.  After all was said and done I reattached the air cooling and they ran perfectly until I could RMA the pump.  No failure.  Why attack me about ATi cards overheating when you KNOW it's a fan speed fix away from good operating temperatures?
> 
> ...



And as I told you in the first reply, I KNOW it probably won't be faster!! It's a shame you are unable to read, if that's all you are arguing. It doesn't matter if it isn't; this is not a thread about GTX260 SLI, it's about the GTX295. 2x HD4870 are faster than a single X2 (not always, though) and way cheaper right now, but that doesn't mean the X2 isn't a worthwhile card, does it?? As I showed above, the GTX260 scales better, which goes directly against the opinions that, first, Crossfire scales better, and second, that the GTX295 WILL be slower. Maybe yes, maybe not, but with the facts we already have, the GTX295 has every chance of being a much better card. Assuming that the dual card won't scale as well, compared to what the X2 is capable of, is stupid. It doesn't need to be faster than GTX260 SLI, just the X2; that's what I've been arguing all the time. So far we know:

1- The chip in its 65nm incarnation already scales better in dual-card configs, beyond what the X2 and RV770 Crossfire can do, while consuming less.
2- News/rumors from respectable sources say that a 55nm, 240 SP GT200-based Quadro card with 4GB of RAM running at 650 MHz has a 160W TDP, compared to 234W for its 65nm daddy or 183W for the GTX260.
3- In the past generation both vendors' dual cards were "close" to their crossfired single cards, but the GX2 was underclocked (600 vs 675) and the X2 was clocked above its single-card cousin (825 vs 775), which suggests the G92 already scaled better in the GX2 than RV670 did in the X2, AT A TIME when Crossfire by itself scaled much better than SLI.

With those precedents the GTX295 has every chance to be a fast, cool and not-so-power-hungry card. Deliberately forgetting those facts, lemonade made his BS comments and you followed suit. I responded.

It doesn't matter that a multi-card solution is faster; it always has been and always will be (until they make a dual card viewable as a single card with a single frame buffer), but a single-card solution, even a dual-GPU one, has a lot of benefits and definitely deserves to exist. No more, no less. Crossfired HD4850s are as fast as the X2 and far cheaper, but many people here, who have to know that fact, chose to go with the X2. 2x GTX260 are probably going to be faster, but the GTX295 has a place in the market, right next to the X2, and it will probably be better on every front, as I explained. So why the hell does the X2 have the right to exist but not the 295? That's what I've been arguing. You should have read better if everything you were replying to was that 2x GTX260 will still be faster; I said so in my first reply after the one calling BS on lemonade. What a waste of time. Given the facts and the rumors, though, I have no doubt it will both run cooler and consume less than 2x GTX260.

About the overheating issues: I don't care what you heard or not, there have been a lot of those in these forums and you will find a lot just by googling. Prior to your post I had never heard of that problem with the GTX280s, and God knows there are far fewer entries when you google it than with the RV770. I guess that means we are tied. I know you seem to think you are the beholder of ABSOLUTE TRUTH, but that's not the case. Just as with the opinion that Nvidia is in the past, you are not right here. The RV770 has as many overheating issues as the GT200 and has no IHS, so couldn't it maybe be that, because of the increasing processing power, EVERY new generation of cards suffers from some overheating samples, and we don't need to find a culprit???? I ask.

EDIT: http://en.expreview.com/2008/12/09/geforce-gtx295-with-480sp-surely-to-come-in-ces-09.html

Maybe Expreview isn't the best source, but they've been pretty right lately. 289W TDP, so much less than 2x GTX260 and the HD4870 X2.


----------



## Crazy Buddhist (Dec 11, 2008)

*Two PCB's why?*



GFC said:


> How can it have less power ? It's built on 55nm process, unless they down-clock it because of the heat - it's going to be faster.
> The main concern for me is, why the hell did they use dual-PCB setup like last time? Because of that, there's barely any place for the fan, which means it's going to be a *really* hot card.



Maybe the dual PCB acts as two sides of a tunnel, if you consider fan placement and venting on these style cards, ducting the air past all the components on the two boards. The casing provides the rest of the tunnel enclosure. It seems to be a design that runs cool enough.

CB


----------



## Solaris17 (Dec 11, 2008)

AsRock said:


> Yeah thats a 3rd PCB but it's only about 1 inch long lol.



haha yes the 1 inch long dvi connector side.



cdawall said:


> thats what i thought but looking at there design there is no spot for it.



There is; they didn't connect it. The connector for that ribbon is under the PCB.



Crazy Buddhist said:


> Maybe the dual PCB acts as two sides of a tunnel, if you consider fan placement and venting on these style cards, ducting the air past all the components on the two boards. The casing provides the rest of the tunnel enclosure. It seems to be a design that runs cool enough.
> 
> CB



That's exactly what it does. However, as for "cool," I have some bad news: my cores will load at ~91C. But I have two GX2's side by side, so I suppose that might have a lot to do with it.


----------



## Solaris17 (Dec 11, 2008)

Drew up how they're supposed to connect, for you guys wondering or not getting it.


----------



## lemonadesoda (Dec 11, 2008)

DarkMatter said:


> Someone has to fight the increasing load of BS in the forums. Like everything in your post. BS.


Try to be less rude and less arrogant. You'll get along a lot better with people on this forum if you try harder.

PS. You wasted two years on your engineering degree if you think it is EASIER to cool 200W sandwiched within a 1 inch space (200W per inch) than it is to cool 100W over 1 inch then another 100W over another inch (100W per inch). Yes, you can do it. But it's not going to be as easy and it will be noisier.


----------



## Solaris17 (Dec 11, 2008)

lemonadesoda said:


> Try to be less rude and less arrogant. You'll get along a lot better with people on this forum if you try harder.
> 
> PS. You wasted two years on your engineering degree if you think it is EASIER to cool 200W sandwiched within a 1 inch space (200W per inch) than it is to cool 100W over 1 inch then another 100W over another inch (100W per inch). Yes, you can do it. But it's not going to be as easy and it will be noisier.



55nm or not, I will bank with you on that; the 260 cores will run hotter... they are bigger and have more internals, and they will load ridiculously hot... and if someone gets two GTX295's, I have some bad news for them about how hot that will run... However, regardless of fact or speculation, this thread needs to calm down, and I'm absolutely serious, in case no one bothered to heed it the first time.


----------



## btarunr (Dec 11, 2008)

Two G200b GPUs + dozens of memory chips + NVIO 2 + BR-03 (nForce 200) chip, all powered by a 6+8 pin combo... mighty impressive!


----------



## DarkMatter (Dec 11, 2008)

lemonadesoda said:


> Try to be less rude and less arrogant. You'll get along a lot better with people on this forum if you try harder.
> 
> PS. You wasted two years on your engineering degree if you think it is EASIER to cool 200W sandwiched within a 1 inch space (200W per inch) than it is to cool 100W over 1 inch then another 100W over another inch (100W per inch). Yes, you can do it. But it's not going to be as easy and it will be noisier.



Wrong. Look at reviews: both cores on the GX2 run cooler than the second GPU on any X2. That is the fact; no need for speculation. The reason I gave is why that occurs. I know the basics of thermodynamics, thanks, and btw it's all about the VOLUME of air that makes contact with the chip/fins and the SURFACES (of the chips and cooler fins in this case), not about the space between the PCBs, 1 inch or whatever. As long as you move enough air, and the GX2 does, it doesn't matter how much space you have. The amount of air that is moved into a case is much higher, there's more free space, and case fans move a hell of a lot more air than GPU or CPU coolers, yet the latter cool much better. Tell me by your theories why... I'll tell you: the overall VOLUME of air is bigger, but the VOLUME of air that makes contact with the SURFACES of the cooler is the "same" (relative to speed) in both cases, and case fans can't deliver cool, clean air to the chips. Just as important is HOW that volume of air gets to the chip or the fins. Clean, straight air cools a lot better. In the X2 the second GPU does not get clean, straight air. Next time check your facts before calling me out.

EDIT: I forgot to mention the most important one: the temperature of the air. It doesn't work exactly as in the example I'm going to give, but similarly, and you can get the idea. If with no air circulation a chip is at 100 C and our air temperature is 20 C, over time and with perfect heat transfer both will eventually end up at 60 C; on the contrary, if our air is already at 60 C, the next chip will only get down to 80 C. As I said, it doesn't really work that way, so linearly and without taking into account volumes, densities, thermal properties, etc., but the same effect does occur. With moving air it's the same, but quicker.

PS: Now, when you SLI two of them, on most mobos the upper one will be almost unable to get fresh air because the other one is in the way, and the second one is usually below the path that the air takes inside most cases, so the very first need I mentioned, fresh clean air delivery, is destroyed and the cards can get hot too.

EDIT2: There's yet another factor to take into account: the contact surface of the chip. Trying to cool 200W through a contact surface of 256 mm^2 is much harder than doing it through a 480 mm^2 one. And that's one of the reasons the GTX295 WILL be cooler (speculating, but I'm going to enjoy saying "I told you").
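The equilibrium example in the EDIT above can be written as a toy calculation: treat the chip and the air as two bodies of equal heat capacity, so with perfect transfer both settle at the average of their starting temperatures. This deliberately ignores volumes, densities and thermal properties, as the post itself notes:

```python
# Toy two-body equilibrium: with equal heat capacities and perfect
# heat transfer, both bodies end up at the mean of their start temps.
def equilibrium(chip_c: float, air_c: float) -> float:
    """Final temperature (C) of chip and air after they equalize."""
    return (chip_c + air_c) / 2.0

print(equilibrium(100.0, 20.0))  # fresh 20 C air  -> 60.0
print(equilibrium(100.0, 60.0))  # pre-heated 60 C air -> 80.0
```

The point being illustrated is only the trend: air that was pre-heated by the first GPU leaves the second GPU at a higher final temperature, exactly as in the X2 discussion.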


----------



## Hayder_Master (Dec 12, 2008)

i hope it is not another mistake like 9800gx2


----------



## Solaris17 (Dec 12, 2008)

hayder.master said:


> i hope it is not another mistake like 9800gx2



how was the GX2 a mistake?


----------



## TRIPTEX_CAN (Dec 12, 2008)

hayder.master said:


> i hope it is not another mistake like 9800gx2



I hope you're prepared for a 1000-word essay reply to that post.


----------



## Solaris17 (Dec 12, 2008)

TRIPTEX_MTL said:


> I hope you're prepared for a 1000-word essay reply to that post.



introduction

5 paragraphs

conclusion


----------



## TRIPTEX_CAN (Dec 12, 2008)

Solaris17 said:


> introduction
> 
> 5 paragraphs
> 
> conclusion



Nice of you to provide the layout for whoever writes the piece. I wasn't referring to you, but I can see a massive backlash to his claim.


----------



## Solaris17 (Dec 12, 2008)

TRIPTEX_MTL said:


> Nice of you to provide the layout for whoever writes the piece. I wasn't referring to you, but I can see a massive backlash to his claim.



I'm running dual GX2's, so I'm quite interested in this claim. That was more of a layout for myself.


----------



## TRIPTEX_CAN (Dec 12, 2008)

Solaris17 said:


> im running dual GX2's so im quite intrested in this claim that was more of a layout for myself



I did remember hearing the 9800GX2 had some scaling issues upon release mostly with min FPS and micro-stuttering in some games (probably fixed by now) but from what I understand the GTX200s have much better support for SLi so I don't see that being an issue. 

You'd better get started on that essay; we're not accepting submissions after 11:00 AM EST.


----------



## Solaris17 (Dec 12, 2008)

TRIPTEX_MTL said:


> I did remember hearing the 9800GX2 had some scaling issues upon release mostly with min FPS and micro-stuttering in some games (probably fixed by now) but from what I understand the GTX200s have much better support for SLi so I don't see that being an issue.
> 
> You'd better get started on that essay; we're not accepting submissions after 11:00 AM EST.



I need to work at 12 and I won't be home till 8. Right now I'm cleaning up a f#%ing mess because my GF's mom put all the presents in the room in the basement, and the basement floods every year... guess what, a foot of water this morning, and she didn't think to put anything up, so the 22" Acer, the $850 D-SLR, the 4GB memory kit, the $80 perfumes and stuff I got my GF are totally fucked..... yay......... So I don't know if I want to get started on the GX2. These things work fine; idk wtf people were doing to get micro-stuttering, but that definitely isn't an issue... if people think it still is, they have more on their rig to look at than the graphics card.

-sol out


----------



## Fitseries3 (Dec 12, 2008)

according to an article i just read this card is not the gtx295. the 295 is a 55nm dieshrink of a 280 but oced more for better performance. 

so wtf is this card called? are they just gonna call it the gtx260gx2? or since it has the new core would it be the gtx270gx2?


----------



## TRIPTEX_CAN (Dec 12, 2008)

Solaris17 said:


> I need to work at 12 and I won't be home till 8. Right now I'm cleaning up a f#%ing mess because my GF's mom put all the presents in the room in the basement, and the basement floods every year... guess what, a foot of water this morning, and she didn't think to put anything up, so the 22" Acer, the $850 D-SLR, the 4GB memory kit, the $80 perfumes and stuff I got my GF are totally fucked..... yay......... So I don't know if I want to get started on the GX2. These things work fine; idk wtf people were doing to get micro-stuttering, but that definitely isn't an issue... if people think it still is, they have more on their rig to look at than the graphics card.
> 
> -sol out



Ouch man, that would piss me off so much. I hope to hell you have insurance for water damage. :shadedshu 

Good luck with the cleanup...


----------



## Solaris17 (Dec 12, 2008)

fitseries3 said:


> According to an article I just read, this card is not the GTX 295; the 295 is a 55nm die-shrink of the 280 but OC'd more for better performance.
> 
> So WTF is this card called? Are they just gonna call it the GTX 260 GX2? Or, since it has the new core, would it be the GTX 270 GX2?






			
Techpowerup.com said:


> the company is also planning a *dual GPU* card named the GeForce GTX 295. Its single GPU flagship offering will be called GeForce GTX 285.



TRIPTEX_MTL said:


> Ouch man that would piss me off so much. I hope to hell you have insurance for water damage. :shadedshu
> 
> Good luck with the cleanup...



Oh, the only thing going through my head is death... I want to destroy things.


----------



## Fitseries3 (Dec 12, 2008)

dir..

my dyslexia translated the 285 into a 295.


----------



## Solaris17 (Dec 12, 2008)

fitseries3 said:


> dir..
> 
> my dyslexia translated the 285 into a 295.



hah


----------



## Hayder_Master (Dec 14, 2008)

Solaris17 said:


> how was the GX2 a mistake?



Hello my friend. I knew you'd reply to this, and I'm ready to answer.
1- I'm talking about the 9800GX2 when it released: an extremely high price for ordinary performance. Sure, you only win if you got it cheap, because two 9800GTXs cost about half the release price of one 9800GX2. It stayed stuck at a high price for 2-3 months, and then NVIDIA killed it when the GTX 200 series released. If you remember, this is the same story as the 7950GX2, so I, and most people, see it as a big mistake. If this new one launches at, say, $600, and two months later NVIDIA releases a new generation of cards with double the performance at the same price, anyone who bought this card will feel they made a big mistake.
There's no mistake in taking a 9800GX2 at $200 or an 8800 Ultra at $160; that's called a smart choice, and you made a smart choice for sure. If I had your mobo I'd do the same thing.


----------



## Pixelated (Dec 23, 2008)

DarkMatter said:


> I don't know what makes you think it will draw more power than the 9800 GX2, they have both the same 6-pin + 8-pin conectors.
> 
> I don't know why it would be *really* hot neither. The single PCB design of the HD3870 X2 didn't help out too much, the GX2 was cooler and quieter and indeed the temps were about the same as in single 9800s. Everything else were memes created long before the card even launched that continued existing thanks to the internets pop culture.
> 
> Rememer it's 55nm GT200 being used, we don't know anything about it yet. I say it's going to draw less power than the HD4870 X2 and have lower temps. Specilating like anyone else of course.



It's simple, really. Which card draws more power: a 55nm 9800GTX+ or a 55nm GTX 260? I'd say the GTX 260 draws more. So double its power requirements and you get close to 300W of draw at peak.
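That back-of-the-envelope math can be sketched in a few lines. The GTX 260 TDP figure and the 15% saving from the 55nm shrink are rough assumptions for illustration, not official numbers:

```python
# Rough peak-draw estimate for a dual-GPU card built from two GTX 260s.
# The TDP figure and shrink saving below are assumptions, not specs.
GTX260_TDP = 182  # watts, approximate board power of a 65nm GTX 260

def dual_gpu_estimate(single_tdp, shrink_savings=0.15):
    """Double one card's draw, minus an assumed saving from the
    55nm die-shrink and shared board components."""
    return round(2 * single_tdp * (1 - shrink_savings))

# What the reported 8-pin + 6-pin connector combo can legally deliver:
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150  # watts, per the PCIe spec

print(dual_gpu_estimate(GTX260_TDP))  # ~309 W estimate
print(SLOT + SIX_PIN + EIGHT_PIN)     # 300 W connector ceiling
```

The estimate lands right at the "close to 300W" figure, and also right at the 300W ceiling the 8+6 pin combo plus the slot can supply.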


----------



## DarkMatter (Jan 10, 2009)

DarkMatter said:


> I'm going to enjoy saying "I told you".



*I told you.*

Sorry, but I couldn't let this go.  100% effectiveness in my assumptions. As I speculated (based on facts and precedents):

1- The GTX 295 is much faster than the X2 and also significantly faster than GTX 260 SLI.

2- It consumes much less power than both the X2 and 260 SLI.

3- Temps are significantly better on the GTX 295 than on the second GPU of the X2, even in Quad SLI configs, which don't seem to raise its temps much.

There are tons of reviews on the front page of TPU, January 9th, so don't bother me about providing a link; do your homework.


----------



## wolf (Jan 11, 2009)

It's now all fact.

1x GT200 beats 1x RV770.

2x GT200 beats 2x RV770.

Naturally there are a FEW circumstances where the opposite is possible, but if you've read half the reviews on the home page, you'll see it's incontrovertible.

As for the cooling solution discussion, I think it's obvious the 295's cooler is better: both cores at the same low temps, not one core 6-10 degrees hotter. Not to mention the overclocking potential it has over a 4870X2, just in case you want to trounce one extra hard.

I really love and cherish RV770; it came at just the right time and was just what the gfx market needed.

But NVIDIA, again, holds the crown for fastest single GPU and fastest single card on the planet.

It's like that, and that's the way it is.


----------

