# Single-PCB GeForce GTX 295 Pictured



## D_o_S (May 26, 2009)

Zol.com.cn has managed to take some pictures of the upcoming single-PCB GeForce GTX 295. Expected to arrive within a month, the single-PCB GTX 295 features the same specs as the dual-PCB model: 2x 448-bit memory interface, 480 processing cores, 1792 MB of GDDR3 memory, and GPU/shader/memory clocks of 576/1242/1998 MHz respectively.
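For anyone who wants to sanity-check those numbers, the quoted bus width and effective memory clock are enough to work out the card's theoretical memory bandwidth. A rough back-of-envelope sketch (the derived GB/s figures are my arithmetic, not from the source):

```python
# Theoretical memory bandwidth of the GTX 295 from the quoted specs.
# 1998 MHz is the effective (double-data-rate) GDDR3 clock.
bus_width_bits = 448        # memory interface width per GPU
effective_clock_mhz = 1998  # effective memory clock in MHz

# Bytes transferred per clock, per GPU
bytes_per_clock = bus_width_bits // 8  # 56 bytes

# MB/s -> GB/s using decimal (1000-based) units, as spec sheets do
per_gpu_gb_s = bytes_per_clock * effective_clock_mhz / 1000
card_gb_s = 2 * per_gpu_gb_s  # two GPUs on one card

print(f"per GPU: {per_gpu_gb_s:.1f} GB/s")   # 111.9 GB/s
print(f"card total: {card_gb_s:.1f} GB/s")   # 223.8 GB/s
```

That total matches the 223.8 GB/s usually quoted for the dual-PCB GTX 295, consistent with the claim that only the construction changed, not the specs.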


*View at TechPowerUp Main Site*


----------



## tofu (May 26, 2009)

D_o_S said:


> http://www.techpowerup.com/img/09-05-26/geforce_gtx_295_single-pcb_06.jpg



This shows that board designers COULD make a really small GTX 275.


----------



## Bundy (May 26, 2009)

Kinda surprising they're not using a radial fan, given the airflow path. Must work OK, I guess.


----------



## stefanels (May 26, 2009)

Looking great, but that card must give off a lot of heat for that small fan to cool it...?!


----------



## florence (May 26, 2009)

*Galaxy single PCB GTX295 performance detailed*











This product looks identical to the leaked reference single-PCB GTX 295; the only difference is the Galaxy logo. There are two cores and 28 memory chips on one PCB. The core process has also been upgraded to 55 nm; the GPU works at the standard 576 MHz core clock, the shader clock is still 1242 MHz, and the memory works at 2000 MHz. It provides dual DVI outputs and an S/PDIF audio input interface.


----------



## alexp999 (May 26, 2009)

You missed the benchmarks, DoS.






And here are some comparison shots of it next to the dual-PCB version:






Finally, a few close-ups I found interesting:


----------



## btarunr (May 26, 2009)

Hypothetically they should perform the same, if they have the same clock speeds. Nothing has changed except for the construction. I also find it laughable that NVIDIA gagged one of its partners when we (and others) posted pictures of their GTX 295 1P accelerator. There's nothing to put an NDA on, really.


----------



## alexp999 (May 26, 2009)

Same thing with the XFX 850W; all the original pics at the source have been taken down.


----------



## haffey (May 26, 2009)

There's no way those heatsinks are going to cut it.


----------



## Studabaker (May 26, 2009)

haffey said:


> There's no way those heatsinks are going to cut it.



They will, because nVidia obviously doesn't care if our GPUs run at up to 90C under load.


----------



## PlanetCyborg (May 26, 2009)

It was about time NVIDIA made a single-PCB dual-chip monster!!!! I have to admit it looks better than the two-PCB model.


----------



## mabszy (May 26, 2009)

So this means less power, less expensive to buy, better performance?


----------



## Studabaker (May 26, 2009)

mabszy said:


> So this means less power, less expensive to buy, better performance?



1) The power difference will be negligible, 2) I sure hope so, 3) why would that be?


----------



## mabszy (May 26, 2009)

Studabaker said:


> 1) The power difference will be negligible, 2) I sure hope so, 3) why would that be?



3. faster communication?


----------



## jagass (May 26, 2009)

It looks nice...Thanks for the info...


----------



## Animalpak (May 26, 2009)

It should cost less than the first model, or this card becomes a FLOP.


----------



## MoonPig (May 26, 2009)

Should be cheaper, they're not making as much stuff. This IS one PCB less.... lol.

Not sure about it though; unless it's in the £275 mark, I'm not even going to consider it. No point spending that much when DX11 is soon (I hope so).


----------



## FreedomEclipse (May 26, 2009)

I think they should really revise the cooler design, because with the heat that it's gonna be throwing out, I doubt it's gonna last very long before parts of the PCB start to melt. This card will give the term "up in flames" a new meaning.


----------



## tkpenalty (May 26, 2009)

I want a GTX275 that small....


----------



## denice25 (May 26, 2009)

looks good.. but i'm with the idea of revising the cooler design! thanks for the info anyway...


----------



## tkpenalty (May 26, 2009)

Why couldn't they make the coolers lower but longer, and have three half-height 92 mm fans on top of them instead?


----------



## newtekie1 (May 26, 2009)

haffey said:


> There's no way those heatsinks are going to cut it.



I don't see why they wouldn't, 2 heatpipes per GPU leading to a lot more fins than the original GTX295 should give decent cooling performance.



Studabaker said:


> They will, because nVidia obviously doesn't care if our GPUs run at up to 90C under load.



Neither does ATi, and 90°C is fine for a GPU.


----------



## powerspec (May 26, 2009)

They took out the HDMI port? Why? That's one of the reasons I bought my GTX 295, for native HDMI with audio (and I needed an upgrade). AFAIK NVIDIA cards can't do audio over DVI, or has that changed?

They could have just done one DVI port and an HDMI port.


----------



## newtekie1 (May 26, 2009)

powerspec said:


> They took out the HDMI port? Why? That's one of the reasons I bought my GTX 295, for native HDMI with audio (and I needed an upgrade). AFAIK NVIDIA cards can't do audio over DVI, or has that changed?
> 
> They could have just done one DVI port and an HDMI port.



You can use a DVI-to-HDMI adapter to get HDMI with sound on this card and most nVidia cards on the market today. The audio is provided the same way in all of them, via an S/PDIF audio passthrough connector.


----------



## Disparia (May 26, 2009)

Excellent! Now make a dual PCB, quad GPU solution!!


----------



## FreedomEclipse (May 26, 2009)

Jizzler said:


> Excellent! Now make a dual PCB, quad GPU solution!!



Would you buy one if they did release a quad-GPU solution? Hell, I wouldn't even think about it.


----------



## haffey (May 26, 2009)

newtekie1 said:


> I don't see why they wouldn't, 2 heatpipes per GPU leading to a lot more fins than the original GTX295 should give decent cooling performance.


I don't know about the original GTX295 heatsink design, but this can't be enough.  2 heatpipes per GPU may be just barely sufficient, but those fins are small and densely packed.  With that fan design you won't be getting much air going through them, and it's going to be of a low pressure.


----------



## h3llb3nd4 (May 26, 2009)

haffey said:


> I don't know about the original GTX295 heatsink design, but this can't be enough.  2 heatpipes per GPU may be just barely sufficient, but those fins are small and densely packed.  With that fan design you won't be getting much air going through them, and it's going to be of a low pressure.



And that's where aftermarket coolers come into effect


----------



## zAAm (May 26, 2009)

Another problem is that a ton of heat is now going into the case... So I think the overall temps of the single gpu vs dual gpu would probably be slightly higher if you factor in case temperature rise and so on... Still, it'd be worth it if they can reduce the price of the card (even though they probably had to increase the amount of layers)...


----------



## h3llb3nd4 (May 26, 2009)

zAAm said:


> Another problem is that a ton of heat is now going into the case... So I think the overall temps of the single gpu vs dual gpu would probably be slightly higher if you factor in case temperature rise and so on... Still, it'd be worth it if they can reduce the price of the card (even though they probably had to increase the amount of layers)...



WHO NEEDS A CASE WHEN YOU HAVE A TABLE?

Just realised you're a fellow SA citizen(nice rig there)


----------



## zAAm (May 26, 2009)

h3llb3nd4 said:


> WHO NEEDS A CASE WHEN YOU HAVE A TABLE?
> 
> Just realised you're a fellow SA citizen(nice rig there)



Haha, I guess you're right... If I bought one of those I'd have nothing left to buy a case anyway so that problem would just sort itself out. 

SA FTW!


----------



## Disparia (May 26, 2009)

FreedomEclipse said:


> Would you buy one if they did release a quad-GPU solution? Hell, I wouldn't even think about it.



Sure. Might never have the money for one though...

Plan on water from the beginning; just have the two PCBs sandwich a double-sided water block.


----------



## ShadowFold (May 26, 2009)

I can see this thing bending without a support beam or something... Plus, what's the point of this, seriously? Why don't they work on GT300?


----------



## RadeonX2 (May 26, 2009)

ShadowFold said:


> I can see this thing bending without a support beam or something... Plus, what's the point of this, seriously? Why don't they work on GT300?



Yeah, GT300 would be DX11 I suppose? Can't wait to change my single-slot 9600GT, as it's barely playable in most of today's games :shadedshu


----------



## ShadowFold (May 26, 2009)

I can't wait for my GTX 275, my HD 3300 chugs in some games at 1920x1080


----------



## El_Mayo (May 26, 2009)

Jizzler said:


> Excellent! Now make a dual PCB, quad GPU solution!!



that DOES sound good..


----------



## a_ump (May 26, 2009)

El_Mayo said:


> that DOES sound good..



Eh, highly doubtful. And is scaling even any good with quad SLI, like two 9800GX2s or two GTX 295s?


----------



## El_Mayo (May 26, 2009)

Dunno, lol.
They could release a quad-SLI card for workstations, I guess.


----------



## h3llb3nd4 (May 26, 2009)

a_ump said:


> Eh, highly doubtful. And is scaling even any good with quad SLI, like two 9800GX2s or two GTX 295s?



who cares?
remember what CD said! _"A man's e-peen is determined entirely by how much hardware he doesn't need...yet has bolted onto his rig... "_


----------



## El_Mayo (May 26, 2009)

h3llb3nd4 said:


> who cares?
> remember what CD said! _"A man's e-peen is determined entirely by how much hardware he doesn't need...yet has bolted onto his rig... "_



WHO said that? rofl


----------



## h3llb3nd4 (May 26, 2009)

Cyber Druid


----------



## El_Mayo (May 26, 2009)

h3llb3nd4 said:


> Cyber Druid



haha. epic quote xD


----------



## RadeonX2 (May 26, 2009)

h3llb3nd4 said:


> who cares?
> remember what CD said! _"A man's e-peen is determined entirely by how much hardware he doesn't need...yet has bolted onto his rig... "_



Agreed with that. If I had the money to spend, heck, I'd put in every new piece of hardware, build a monster rig, and surely join TPU's folding farm.


----------



## El_Mayo (May 26, 2009)

RadeonX2 said:


> Agreed with that. If I had the money to spend, heck, I'd put in every new piece of hardware, build a monster rig, and surely join TPU's folding farm.



what's folding?


----------



## douglatins (May 26, 2009)

Animalpak said:


> It should cost less than the first model, or this card becomes a FLOP.



Hey same avatar


----------



## El_Mayo (May 26, 2009)

douglatins said:


> Hey same avatar



disturbed buddies


----------



## RadeonX2 (May 26, 2009)

El_Mayo said:


> what's folding?



Hey, the thing with molecules on it? F@H?

http://folding.stanford.edu/


----------



## El_Mayo (May 26, 2009)

Jizzler said:


> Excellent! Now make a dual PCB, quad GPU solution!!



They could do this... but with, say, 28nm GPUs, I guess.
Maybe 28nm versions of the 9600GT.
Four of those glued together, I guess.
I'm just spitballin' here =]


----------



## newtekie1 (May 26, 2009)

haffey said:


> I don't know about the original GTX295 heatsink design, but this can't be enough.  2 heatpipes per GPU may be just barely sufficient, but those fins are small and densely packed.  With that fan design you won't be getting much air going through them, and it's going to be of a low pressure.



The original only had a single flat heatpipe per GPU, and the increased number of fins will also help even more. Based on the size and design of the heatsinks on this card, I would bet it actually runs cooler than the original.



zAAm said:


> Another problem is that a ton of heat is now going into the case... So I think the overall temps of the single gpu vs dual gpu would probably be slightly higher if you factor in case temperature rise and so on... Still, it'd be worth it if they can reduce the price of the card (even though they probably had to increase the amount of layers)...



Exhausting some of the air into the case does suck, but at least half of it is still exhausted out the back of the case. Any case with decent airflow shouldn't have a problem with case temps rising. They might go up a couple of degrees, but not much beyond that, IMO.



ShadowFold said:


> I can see this thing bending without a support beam or something... Plus, what's the point of this, seriously? Why don't they work on GT300?



Why would it bend?  I think people really underestimate the strength of a PCB, especially one with as many layers as this thing...


----------



## CyberDruid (May 26, 2009)

DO want


----------



## Easo (May 26, 2009)

Why do I think it will cost exactly the same? Or am I just being a pessimistic realist?


----------



## Hayder_Master (May 26, 2009)

At last NVIDIA solves the single-PCB GTX 295 problem, just as ATI prepares to launch the 5870 X2.


----------



## lemonadesoda (May 27, 2009)

That must be one of the most inefficient poorly designed cooling concepts I've ever seen for a premium product.  "Stock cooling" designer should be fired.


----------



## RadeonX2 (May 27, 2009)

lemonadesoda said:


> That must be one of the most inefficient poorly designed cooling concepts I've ever seen for a premium product.  "Stock cooling" designer should be fired.



So true... they should've put dual fans, one on top of each heatsink, not one in the center :shadedshu or redesigned the heatsink for good.


----------



## CyberDruid (May 27, 2009)

Who would keep stock cooling on that thing anyway? I mean, it's a crazy expensive card to start with... so what's another $150 in waterblocks?


----------



## a_ump (May 27, 2009)

lemonadesoda said:


> That must be one of the most inefficient poorly designed cooling concepts I've ever seen for a premium product.  "Stock cooling" designer should be fired.



Yeah, it definitely doesn't look that inventive... looks rather poor, actually. I wonder why they didn't go with ATI's method of a single fan blowing air over both chips and out of the case. I suppose this will keep both chips cooler than ATI's way, though. Those heatsinks remind me of CPU sinks, lol. I would think it would be more efficient if one of the heatpipes on each sink were lower than the other; I realize it would obstruct airflow some, but hmm, I don't know, lol.


----------



## erocker (May 27, 2009)

I'll reserve judgment until I see performance/temps/price/etc...


----------



## ShadowFold (May 27, 2009)

It looks like the fan could push air to each sink, but two fans would've been so much better.


----------



## buggalugs (May 27, 2009)

Not a good time to buy a big powerful card; when DX11 comes out it will be worth nothing. But I guess if you're rich, who cares?


----------



## gumpty (May 27, 2009)

I do wonder who is going to buy this thing. Anyone likely to want GTX 295 power will know that the next-generation cards are just around the corner (well, the end of the year or something like that), so they might as well wait. Or buy a better-value mid-range card or SLI/Xfire setup until that time comes.

This is surely just NVIDIA and its partners padding things out for the next few months until the new products arrive, making it appear as though the GT200 chips still have life in them and are still being innovated on until GT300 arrives.


----------



## newtekie1 (May 27, 2009)

A lot of people aren't going to wait the 6+ months to upgrade just because the next set of cards is coming out. Plus, some people actually plan to buy a card towards the end of its life cycle, when it is cheapest. Look at the past: the 9800GX2 dipped way down in price right before the GTX 200 cards were released, to the point where Newegg had them for $300. Then the GTX 280 came out, and the 9800GX2 matched it but was cheaper, the new product commanding a price premium. I'm guessing the GTX 295 will have a similar fate, and now that manufacturing is cheaper, that leaves more room for companies to offer lower prices.

The DX11 features of the new cards won't be a factor for most people, simply because they should know that there won't be any DX11 titles until at least a year from now, if not longer.


----------



## a_ump (May 27, 2009)

Well, the GT300 is supposedly delayed until 2010, so that's at least 7 months away if they do a January launch. ATI, however, is expected to release its RV870 this year. I wonder if TSMC's 40nm problems are part of NVIDIA's delay: the GT300 die is likely to be as massive as GT200's, adding poor yields on top of whatever problems TSMC is having, while ATI's die is much smaller, so their yields are better, which lets them launch sooner? Just a thought.


----------



## tkpenalty (May 28, 2009)

we're going to see a few capacitor explosions at this rate...


----------



## cscgo (Jun 8, 2009)

*Clueless*



a_ump said:


> Yeah, it definitely doesn't look that inventive... looks rather poor, actually. I wonder why they didn't go with ATI's method of a single fan blowing air over both chips and out of the case. I suppose this will keep both chips cooler than ATI's way, though. Those heatsinks remind me of CPU sinks, lol. I would think it would be more efficient if one of the heatpipes on each sink were lower than the other; I realize it would obstruct airflow some, but hmm, I don't know, lol.



LOL!  Good thing you guys aren't working for nVidia cuz you don't have a clue.  The old 295 heatsink had a horrible design.  This thing is going to have tons more overclocking margin compared to the old one and hopefully will be quieter too.  I'll repost my "told you so" roundup once reviews hit the web.


----------



## [I.R.A]_FBi (Jun 8, 2009)

cscgo said:


> LOL!  Good thing you guys aren't working for nVidia cuz you don't have a clue.  The old 295 heatsink had a horrible design.  This thing is going to have tons more overclocking margin compared to the old one and hopefully will be quieter too.  I'll repost my "told you so" roundup once reviews hit the web.



orally?


----------



## PP Mguire (Jun 8, 2009)

IMO this cooler is a far better design than the ATI counterpart. You get cold air blown across both chips instead of just one. Sufficient case cooling will negate the problems of hot air being blown into the case. I'm pretty sure that if it was that shitty, NVIDIA wouldn't release it, and Galaxy of course would change up the cooler design. Just my 2 cents though.


----------

