
NVIDIA GeForce GTX 295 Spotted

Oh my god. Are we going to have the same debate all over again?? :banghead:
Where's the debate? Did you declare it over? I hadn't realised you'd been promoted to the discussion police.

This 295 monster isn't an evolutionary feat. It's a Frankenstein monster. From its design it's clear it will suffer from heat problems (heat trapped between the PCBs), limiting aftermarket coolers and OCability.

It's also going to be power hungry.

IMO, two GTX260s (55nm refresh) in SLI are a superior combination. They allow greater flexibility, and the better heat distribution will probably allow for higher OCs and the use of larger, quieter coolers.

Woot! This is the fastest card? It will rock? Meh. This is mutton dressed as lamb :banghead:
 
Tell that to dark2099 :laugh:



And then why is it that Tatty's 260 with a soft-volt mod owns any GTX280 under the sun? The GTX200 series is more or less bogged down by shader cores... it never needed more than, say, 192 or 216.

Tatty's super-high-clocked 260 owned any 280 at stock... and the reason for that is that it had 25% fewer shaders but ran at 30% higher clocks on the same architecture. Once a 280 is overclocked to 700+ core and 1400+ shader, it would beat that 260, no question. 240 shaders is not even enough IMO; they wanted to add more but couldn't because of fab limitations. I like the twin-PCB idea; it worked for the 9800GX2 and it will work for this too...
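The shaders-vs-clocks argument above can be sanity-checked with a rough back-of-envelope model, assuming shader throughput scales with shader count × shader clock. The 1296 MHz stock shader clock for the GTX 280 is the published spec; the overclocked figures (1700 MHz for the modded 260, 1400 MHz for an OC'd 280) are illustrative assumptions from the posts, not official numbers:

```python
# Crude throughput model: relative shader performance ~ shaders * shader clock.
# Overclock figures below are illustrative assumptions, not official specs.

def shader_throughput(shaders, shader_clock_mhz):
    """Relative theoretical shader throughput (arbitrary units)."""
    return shaders * shader_clock_mhz

gtx280_stock = shader_throughput(240, 1296)  # stock GTX 280
gtx260_oc = shader_throughput(192, 1700)     # heavily OC'd GTX 260 (hypothetical)
gtx280_oc = shader_throughput(240, 1400)     # GTX 280 at 1400+ shader, per the post

print(gtx260_oc > gtx280_stock)  # the OC'd 260 edges out a stock 280
print(gtx280_oc > gtx260_oc)     # but an OC'd 280 retakes the lead
```

On this crude model, 25% fewer shaders at ~30% higher clocks does come out slightly ahead, which matches the claim in the post.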
 
This 295 monster isn't an evolutionary feat. It's a Frankenstein monster. From its design it's clear it will suffer from heat problems (heat trapped between the PCBs), limiting aftermarket coolers and OCability.
Woot! This is the fastest card? It will rock? Meh. This is mutton dressed as lamb :banghead:

I agree the design of this card looks dated and will have some issues, aside from being wildly expensive to produce. Nvidia has the capital to invest in better designs; rushing to take back the performance "crown" wouldn't be my product strategy, but this card will be fast and it will have a colorful box.
 
lol... I love how people are like "it looks bad"... who cares if the cooler is stapled onto the cards and the whole assembly uses duct tape to hold it together? Bring on the benchmarks... on the other hand, it is a pretty ugly card... they'll probably put plastic around it to cover it up a bit.
 
lol... I love how people are like "it looks bad"... who cares if the cooler is stapled onto the cards and the whole assembly uses duct tape to hold it together? Bring on the benchmarks... on the other hand, it is a pretty ugly card... they'll probably put plastic around it to cover it up a bit.

Um, I care if it's duct-taped together. This card will cost close to $600-700 when released. I could staple a few 260s together and charge you double if you want. :rolleyes:
 
Um, I care if it's duct-taped together. This card will cost close to $600-700 when released. I could staple a few 260s together and charge you double if you want. :rolleyes:

1. If you could, you would have already... and you would have made tons of money.
2. You should put some artwork in front of your computer.

Point is: you're ultimately paying for performance, not looks... otherwise I have a really nifty-looking X1550 in my other rig...

If the only complaint people can come up with is "it looks bad", then Nvidia has a successful product.
 
1. If you could, you would have already... and you would have made tons of money.
2. You should put some artwork in front of your computer.

Point is: you're ultimately paying for performance, not looks... otherwise I have a really nifty-looking X1550 in my other rig...

If the biggest complaint people can come up with is "it looks bad", then Nvidia has a successful product.

It's not the aesthetics that bother me with the Nvidia GX2 cards. I said it looks dated; I didn't say it was ugly. TBH they look fine (once the fan shroud is assembled). If the card doesn't suffer from heat issues and isn't ridiculously expensive to manufacture, it will be a winner.
 
lol... I love how people are like "it looks bad"... who cares if the cooler is stapled onto the cards and the whole assembly uses duct tape to hold it together? Bring on the benchmarks... on the other hand, it is a pretty ugly card... they'll probably put plastic around it to cover it up a bit.

I would. If it looks like crap, that means it was hastily put together. And things that are hastily put together usually have high failure rates.

Now, it's neat to see Nvidia compete for the top spot again, but if the card's crappily made, it's going to be a damn shame, because the G200 is a pretty good chip -- could even be one of the last of its kind.
 
Well, at least the box art will look nice XD
 
Won't these have weaker G200 cores? I heard they have lower shader counts. I still can't believe they didn't get all of this on one PCB, though.
 
I'm just waiting for the 350; dual-GPU cards just run too up and down in games. Put it this way: some games run SLI well and some run it badly lol. I had a 9800GX2 and it was OK, but I got the 280 and all my games ran better.
 
I'm just waiting for the 350; dual-GPU cards just run too up and down in games. Put it this way: some games run SLI well and some run it badly lol. I had a 9800GX2 and it was OK, but I got the 280 and all my games ran better.

A single GPU will always win for consistency in FPS. It doesn't matter if it's CrossFire or SLI. Until someone manages to create a seamless dual-GPU design that is recognized by the system as a single core, there will always be games that suffer from having an extra card.
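The consistency point can be illustrated with a toy calculation (the frame times below are made up for illustration, not measurements): two traces with identical average FPS, where the alternating dual-GPU-style trace still feels choppier because its worst frames are longer:

```python
# Hypothetical frame-time traces (milliseconds). Average FPS alone can
# hide the uneven frame pacing often blamed on dual-GPU alternate-frame
# rendering.

single_gpu = [20.0] * 10      # steady: every frame takes 20 ms
dual_gpu = [10.0, 30.0] * 5   # alternating short/long frames

def avg_fps(frame_times_ms):
    """Average frames per second over a trace of per-frame times."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(avg_fps(single_gpu))  # 50.0
print(avg_fps(dual_gpu))    # 50.0 -- same average FPS...
print(max(single_gpu))      # 20.0 ms worst frame
print(max(dual_gpu))        # 30.0 ms worst frame -- ...but worse pacing
```

Both traces report 50 FPS on average, yet the second one spends half its frames at an effective 33 FPS, which is the "up and down" feel described above.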
 
On a side note, the new Cat 8.12 is out.
 
OK, I'm going to try to remain calm.
It's a sample; it's not what the final card will look like, so chill about the looks.
The GTX260 Core 216 already beats the 4870 512MB/1GB 1v1, and in SLI vs CrossFire the SLI Core 216s beat the 4870X2, so there is no reason to say the 55nm SLI-in-one-box is not going to beat the 4870X2.
The only way ATI can pull this out again is a major price cut on the 4870s and X2s, which means we all win again.
Yes, we all know there are some games where the 4870X2 beats SLI GTX280s, but it's a small % of games and overall the SLI GTX260s are better, so no reason to beat that dead horse.
 
If you have to make an effort to remain calm you might be taking this all too seriously. :)

The card will most likely look just like that but with a fan shroud covering it just like the 9800GX2. When pics of the 9800GX2 were "leaked" people said it wouldn't look the way it was pictured either... and it did, just with a fan shroud.

This thread really doesn't need to be another ATI vs Nvidia thread. We're discussing the projected release of a product.
 
Interesting that that is the only picture they managed to get.
 
OK, I'm going to try to remain calm.
It's a sample; it's not what the final card will look like, so chill about the looks.
The GTX260 Core 216 already beats the 4870 512MB/1GB 1v1, and in SLI vs CrossFire the SLI Core 216s beat the 4870X2, so there is no reason to say the 55nm SLI-in-one-box is not going to beat the 4870X2.
The only way ATI can pull this out again is a major price cut on the 4870s and X2s, which means we all win again.
Yes, we all know there are some games where the 4870X2 beats SLI GTX280s, but it's a small % of games and overall the SLI GTX260s are better, so no reason to beat that dead horse.

I don't know about all of that... got any stats to back it up? I just saw this review yesterday, and it looks like the GTX260/216 beats the 4870 by only a slight margin overall, and only in some of the benchmarks, and I don't see how it beats it in SLI vs CrossFire, since everything I've seen shows that CrossFire is more efficient than SLI. :confused:
 
- Two PCBs with a 448-bit memory interface (I guess a minimum of 10 layers each)
- Four chips (2x GT200 + display chip + PCIe bridge); with a 55nm GT200 being around 470mm², it's still almost twice the size of an RV770.
- 1792MB RAM
...

I wonder whether nVidia is going to make any money on these things. The manufacturing costs are just really high. ATI's margins are high at the moment, so they can slash their prices anytime.
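The cost worry can be put in rough numbers. The ~470 mm² figure for a 55 nm GT200 comes from the list above; the ~256 mm² RV770 figure is an outside assumption (approximate published die size), not something stated in this thread:

```python
# Back-of-envelope GPU silicon comparison per dual-GPU card.
# gt200 figure is from the post above; rv770 figure is an assumption
# based on approximate published die sizes.

gt200_55nm_mm2 = 470.0
rv770_mm2 = 256.0

gtx295_gpu_silicon = 2 * gt200_55nm_mm2   # two GPUs per GTX 295
hd4870x2_gpu_silicon = 2 * rv770_mm2      # two GPUs per 4870 X2

print(gtx295_gpu_silicon)                         # 940.0 mm^2 of GPU die per card
print(gtx295_gpu_silicon / hd4870x2_gpu_silicon)  # ~1.84x the GPU silicon
```

And since yield per wafer tends to fall faster than linearly as dies get bigger, the real cost gap is likely even larger than the raw area ratio suggests.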
 
- Two PCBs with a 448-bit memory interface (I guess a minimum of 10 layers each)
- Four chips (2x GT200 + display chip + PCIe bridge); with a 55nm GT200 being around 470mm², it's still almost twice the size of an RV770.
- 1792MB RAM
...

I wonder whether nVidia is going to make any money on these things. The manufacturing costs are just really high. ATI's margins are high at the moment, so they can slash their prices anytime.


I think they will. The cost of making it won't be too much; they know it wouldn't be cost-effective otherwise, so I think everything is already in place. It's basically going to be the same design as the GX2; I can tell you that just from looking at that pic and remembering what mine looks like taken apart. The only thing they're going to do is move a couple of capacitors to fit the bigger core.
 
Won't these have weaker G200 cores? I heard they have lower shader counts. I still can't believe they didn't get all of this on one PCB, though.
It has more shader units than a GTX260 216... so theoretically, it's faster than SLI'd GTX260 216s.
 
- Two PCBs with a 448-bit memory interface (I guess a minimum of 10 layers each)
- Four chips (2x GT200 + display chip + PCIe bridge); with a 55nm GT200 being around 470mm², it's still almost twice the size of an RV770.
- 1792MB RAM
...

I wonder whether nVidia is going to make any money on these things. The manufacturing costs are just really high. ATI's margins are high at the moment, so they can slash their prices anytime.

If they can deliver enough performance to clearly take the performance crown (15-20% in every game), then selling massive quantities at a smaller margin could work for them. I don't really see this being a cost-effective solution, though.
 
Look at this picture: is that a 3rd PCB? And I found the quad-SLI connector; they just don't have it as a plug on the PCB yet.

[Attachment: Untitled.png]

Yeah, that's a 3rd PCB, but it's only about 1 inch long lol.
 
since everything I've seen shows that CrossFire is more efficient than SLI. :confused:

That was then and this is now... everything CrossFire before the G200 architecture was more efficient... now it's actually the other way around... and that review is using newer drivers and a GTX 216... he was referring to the GTX 260 192... those were the ones (with old drivers) that would get stomped by a 4870 1GB and then catch right up and beat the 4870s in CF...
 