Tuesday, May 26th 2009

Single-PCB GeForce GTX 295 Pictured

Zol.com.cn has managed to take some pictures of the upcoming single-PCB GeForce GTX 295. Expected to arrive within a month, the single-PCB GTX 295 features the same specs as the dual-PCB model: a 2x 448-bit memory interface, 480 processing cores, 1792 MB of GDDR3 memory, and GPU/shader/memory clocks of 576/1242/1998 MHz respectively.
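A quick sanity check on those figures: with each GPU on its own 448-bit interface, the quoted memory clock implies roughly 112 GB/s of peak bandwidth per GPU, about 224 GB/s for the card. A minimal sketch of the arithmetic, assuming (as is conventional for GDDR3 listings) that the 1998 MHz figure is the effective DDR data rate:

```python
# Back-of-envelope peak memory bandwidth from the quoted specs.
# Assumption: 1998 MHz is the effective (DDR) data rate, not the base clock.

def bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    bytes_per_transfer = bus_width_bits / 8          # width of one transfer
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

per_gpu = bandwidth_gbs(448, 1998)   # one of the two 448-bit interfaces
total = 2 * per_gpu                  # both GPUs combined

print(f"{per_gpu:.1f} GB/s per GPU, {total:.1f} GB/s total")
# → 111.9 GB/s per GPU, 223.8 GB/s total
```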

65 Comments on Single-PCB GeForce GTX 295 Pictured

#51
Hayder_Master
At last NVIDIA solves the single-PCB GTX 295 problem, just as ATI prepares to launch the 5870X2.
Posted on Reply
#52
lemonadesoda
That must be one of the most inefficient, poorly designed cooling concepts I've ever seen for a premium product. The "stock cooling" designer should be fired.
Posted on Reply
#53
RadeonX2
lemonadesoda: That must be one of the most inefficient, poorly designed cooling concepts I've ever seen for a premium product. The "stock cooling" designer should be fired.
So true... they should've put dual fans on top of each heatsink, not in the center :shadedshu or redesigned the heatsink for good.
Posted on Reply
#54
CyberDruid
Who would keep stock cooling on that thing anyway? I mean it's a crazy expensive card to start with...so what's another $150 in waterblocks:p
Posted on Reply
#55
a_ump
lemonadesoda: That must be one of the most inefficient, poorly designed cooling concepts I've ever seen for a premium product. The "stock cooling" designer should be fired.
Yeah, definitely doesn't look that inventive... looks rather poor, actually. I wonder why they didn't go with ATI's method of a single fan blowing over both chips and out of the case. I suppose this will keep both chips cooler than ATI's way, though. Those heatsinks remind me of CPU sinks, lol. I'd think it would be more efficient if one of the heat pipes on each sink were lower than the other. I realize it would obstruct airflow some, but hmm, I don't know.
Posted on Reply
#56
erocker
I'll reserve judgment until I see performance/temps/price/etc...
Posted on Reply
#57
ShadowFold
It looks like the fan could push air to each sink, but two fans would've been so much better.
Posted on Reply
#58
buggalugs
Not a good time to buy a big, powerful card. When DX11 comes out it'll be worth nothing. But I guess if you're rich, who cares?
Posted on Reply
#59
gumpty
I do wonder who is going to buy this thing. Anyone who wants GTX 295-level power will know that the next-generation cards are just around the corner (well, end of the year or something like that), so they might as well wait, or buy a better-value mid-range card or SLI/Xfire setup until that time comes.

This is surely just NVIDIA and its partners padding things out for the next few months until the new products arrive: make it appear as though GT200 chips have life in them and are still being innovated on until GT300 arrives.
Posted on Reply
#60
newtekie1
Semi-Retired Folder
A lot of people aren't going to wait the 6+ months to upgrade just because the next set of cards is coming out. Plus, some people actually plan to buy a card towards the end of its life cycle, when it is cheapest. Look at the past: the 9800GX2 dipped way down in price right before the GTX 200 cards were released, to the point where Newegg had them for $300. Then the GTX 280 came out, and the 9800GX2 matched it but was cheaper, since the new product commanded a price premium. I'm guessing the GTX 295 will have a similar fate, and now that manufacturing is cheaper, there's more room for companies to offer lower prices.

The DX11 features of the new cards won't be a factor for most of the people, simply because they should know that there won't be any DX11 titles until at least a year from now, if not longer.
Posted on Reply
#61
a_ump
Well, the GT300 is supposedly delayed until 2010, so that's at least 7 months from now if they do a January launch. ATI, however, is expected to release the RV870 this year. I wonder if TSMC's 40 nm problems are part of NVIDIA's delay: the GT300 die is likely to be as massive as GT200's, adding to poor yields on top of whatever problems TSMC is having, while ATI's much smaller die means better yields, which lets them launch sooner. Just a thought.
Posted on Reply
#62
tkpenalty
we're going to see a few capacitor explosions at this rate...
Posted on Reply
#63
cscgo
Clueless
a_ump: Yeah, definitely doesn't look that inventive... looks rather poor, actually. I wonder why they didn't go with ATI's method of a single fan blowing over both chips and out of the case. I suppose this will keep both chips cooler than ATI's way, though. Those heatsinks remind me of CPU sinks, lol. I'd think it would be more efficient if one of the heat pipes on each sink were lower than the other. I realize it would obstruct airflow some, but hmm, I don't know.
LOL! Good thing you guys aren't working for nVidia cuz you don't have a clue. The old 295 heatsink had a horrible design. This thing is going to have tons more overclocking margin compared to the old one and hopefully will be quieter too. I'll repost my "told you so" roundup once reviews hit the web.
Posted on Reply
#64
[I.R.A]_FBi
cscgo: LOL! Good thing you guys aren't working for nVidia cuz you don't have a clue. The old 295 heatsink had a horrible design. This thing is going to have tons more overclocking margin compared to the old one and hopefully will be quieter too. I'll repost my "told you so" roundup once reviews hit the web.
orally?
Posted on Reply
#65
PP Mguire
Imo this cooler is a far better design than the ATI counterpart. You get cool air blown across both chips instead of just one. Sufficient case cooling will negate the problem of hot air being blown into the case. I'm pretty sure that if it were that shitty, NVIDIA wouldn't release it, and Galaxy of course would change up the cooler design. Just my 2 cents, though.
Posted on Reply