Wednesday, December 10th 2008

NVIDIA GeForce GTX 295 Spotted

VR-Zone scored the first photo of the upcoming GeForce GTX 295, a card that's reported to make its first appearance at next year's Consumer Electronics Show (CES). Contrary to previous reports, the card features a sandwich design, like most dual-GPU cards released by NVIDIA. Two 55nm GT200 GPUs are incorporated into this card. From the picture we can also see two DVI ports and one DisplayPort. The source also reports the card uses an 8+6 pin connector combo to deliver external power. Pricing is yet to be disclosed, but card makers speculate that NVIDIA will price it competitively against AMD's 4870X2.
Source: VR-Zone

96 Comments on NVIDIA GeForce GTX 295 Spotted

#26
phanbuey
Binge: Tell that to dark2099 :laugh:



And then why is it that Tatty's 260, which had a soft-volt mod, owns any GTX 280 under the sun? The GTX 200 series is more or less bogged down by shader cores... it never needed more than, say, 192 or 216.
Tatty's super-high-clocked 260 owned any 280 at stock... and the reason for that is that it had 25% fewer shaders but ran at 30% greater clocks on the same architecture. Once a 280 is overclocked to 700+ core and 1400+, it would beat that 260, no question. 240 shaders is not even enough IMO; they wanted to add more but couldn't because of fab limitations. I like the twin-PCB idea; it worked for the 9800GX2 and it will work for this too...
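The shaders-versus-clocks tradeoff in that argument can be sketched with some quick arithmetic. This is a rough model only (throughput proportional to shader count times shader clock), and the overclocked shader-clock figure below is an illustrative assumption, not a clock quoted anywhere in this thread:

```python
# Rough theoretical shader throughput: shader count x shader clock (MHz).
# 1296 MHz is the GTX 280's stock shader clock; 1700 MHz is a
# hypothetical heavily overclocked GTX 260 192 shader clock.

def shader_throughput(shaders, shader_clock_mhz):
    """Relative shader throughput in shader-MHz (arbitrary units)."""
    return shaders * shader_clock_mhz

gtx280_stock = shader_throughput(240, 1296)  # stock GTX 280
gtx260_oc = shader_throughput(192, 1700)     # hypothetical OC'd 260

# A 260 with 20% fewer shaders needs roughly 25% higher shader
# clocks to pull even with a stock 280 under this simple model:
ratio = gtx260_oc / gtx280_stock
print(f"OC'd 260 vs stock 280: {ratio:.2f}x")
```

Real-world results depend on memory bandwidth, ROPs, and drivers too, so this only illustrates why a hard-clocked 260 could trade blows with a stock 280.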
Posted on Reply
#27
TRIPTEX_CAN
lemonadesoda: This 295 monster isn't an evolutionary feat. It's a Frankenstein monster. By its design it's clear it will suffer from heat problems (trapped between the PCBs), limiting aftermarket coolers and OCability.
woot! this is the fastest card? it will rock? meh. This is mutton dressed as lamb :banghead:
I agree the design of this card looks dated and it will have some issues, aside from being wildly expensive to produce. Nvidia has the capital to invest in some better designs; rushing to take back the performance "crown" wouldn't be my product strategy, but this card will be fast and it will have a colorful box.
Posted on Reply
#28
phanbuey
lol... I love how people are like "it looks bad"... who cares if the cooler is stapled onto the cards and the whole assembly uses duct tape to hold it together? Bring on the benchmarks... on the other hand it is a pretty ugly card... they'll probably put plastic around it to cover it up a bit.
Posted on Reply
#29
TRIPTEX_CAN
phanbuey: lol... I love how people are like "it looks bad"... who cares if the cooler is stapled onto the cards and the whole assembly uses duct tape to hold it together? Bring on the benchmarks... on the other hand it is a pretty ugly card... they'll probably put plastic around it to cover it up a bit.
Um, I care if it's duct-taped together. This card will cost close to $600-700 when released. I can staple a few 260s together and charge you double if you want. :rolleyes:
Posted on Reply
#30
phanbuey
TRIPTEX_MTL: Um, I care if it's duct-taped together. This card will cost close to $600-700 when released. I can staple a few 260s together and charge you double if you want. :rolleyes:
1. If you could, you would have already... and you would have made tons of money.
2. You should put some artwork in front of your computer.

Point is: you're ultimately paying for performance, not looks - otherwise I have a really nifty-looking X1550 in my other rig...

If the only complaint people can come up with is "it looks bad" then Nvidia has a successful product.
Posted on Reply
#31
TRIPTEX_CAN
phanbuey: 1. If you could, you would have already... and you would have made tons of money.
2. You should put some artwork in front of your computer.

Point is: you're ultimately paying for performance, not looks - otherwise I have a really nifty-looking X1550 in my other rig...

If the biggest complaint people can come up with is "it looks bad" then Nvidia has a successful product.
It's not the aesthetics that bother me with the Nvidia GX2 cards. I said it looks dated; I didn't say it was ugly. Tbh they look fine (once the fan shroud is assembled). If the card doesn't suffer from heat issues and isn't ridiculously expensive to manufacture, it will be a winner.
Posted on Reply
#32
mdm-adph
phanbuey: lol... I love how people are like "it looks bad"... who cares if the cooler is stapled onto the cards and the whole assembly uses duct tape to hold it together? Bring on the benchmarks... on the other hand it is a pretty ugly card... they'll probably put plastic around it to cover it up a bit.
I would. If it looks like crap, that means it was hastily put together. And things that are hastily put together usually have high failure rates.

Now, it's neat to see Nvidia compete for the top spot again, but if the card's crappily made, it's going to be a damn shame, because the G200 is a pretty good chip -- could even be one of the last of its kind.
Posted on Reply
#33
kysg
well at least the boxart will look nice XD
Posted on Reply
#34
ShadowFold
Won't these have weaker G200 cores? I heard they have lower shader counts. I still can't believe they didn't get all of this on one PCB though.
Posted on Reply
#35
LiveOrDie
I'm just waiting for the 350; dual-GPU cards just run too much up and down in games. Put it this way: some games run SLI well and some run it crap lol. I had a 9800GX2 and it was OK, but I got the 280 and all my games ran better.
Posted on Reply
#36
TRIPTEX_CAN
Live OR Die: I'm just waiting for the 350; dual-GPU cards just run too much up and down in games. Put it this way: some games run SLI well and some run it crap lol. I had a 9800GX2 and it was OK, but I got the 280 and all my games ran better.
Single GPU will always win for consistency in FPS. It doesn't matter if it's CrossFire or SLI. Until someone manages to create a seamless dual-GPU design that is recognized by the system as a single core, there will always be games that suffer from having an extra card.
Posted on Reply
#37
springs113
On a side note, the new Catalyst 8.12 is out.
Posted on Reply
#38
TRIPTEX_CAN
springs113: On a side note, the new Catalyst 8.12 is out.
Side note? That came out of left field. :roll:
Posted on Reply
#39
Selene
OK, I'm going to try to remain calm.
It's a sample; it's not what it will look like, so chill about the looks.
The GTX 260 Core 216 already beats the 4870 512/1GB 1v1 and in SLI vs CrossFire, and the SLI Core 216s beat the 4870X2, so there is no reason to be saying the 55nm SLI-in-one-box is not going to beat the 4870X2.
The only way ATI can pull this out again is a major price cut on the 4870s and X2s, which means we all win again.
Yes, we all know there are some games where the 4870X2 beats SLI GTX 280s, but it's a small % of games and overall the SLI GTX 260s are better, so no reason to beat that dead horse.
Posted on Reply
#40
TRIPTEX_CAN
If you have to make an effort to remain calm you might be taking this all too seriously. :)

The card will most likely look just like that but with a fan shroud covering it just like the 9800GX2. When pics of the 9800GX2 were "leaked" people said it wouldn't look the way it was pictured either... and it did, just with a fan shroud.

This thread really doesn't need to be another ATI vs Nvidia thread. We're discussing the projected release of a product.
Posted on Reply
#41
Solaris17
Super Dainty Moderator
Interesting that that is the only picture they managed to get.
Posted on Reply
#42
mdm-adph
Selene: OK, I'm going to try to remain calm.
It's a sample; it's not what it will look like, so chill about the looks.
The GTX 260 Core 216 already beats the 4870 512/1GB 1v1 and in SLI vs CrossFire, and the SLI Core 216s beat the 4870X2, so there is no reason to be saying the 55nm SLI-in-one-box is not going to beat the 4870X2.
The only way ATI can pull this out again is a major price cut on the 4870s and X2s, which means we all win again.
Yes, we all know there are some games where the 4870X2 beats SLI GTX 280s, but it's a small % of games and overall the SLI GTX 260s are better, so no reason to beat that dead horse.
I don't know about all of that... got any stats to back it up? Just saw this review yesterday, and it looks like the GTX 260/216 beats the 4870 by a slight margin overall, and only in some of the benchmarks, and I don't see how it beats it in SLI vs CrossFire, since everything I've seen shows that CrossFire is more efficient than SLI. :confused:
Posted on Reply
#43
MrMilli
- Two PCBs with a 448-bit memory interface (I guess a minimum of 10 layers)
- Four chips (2x GT200 + display chip + PCIe bridge) - with a 55nm GT200 being around 470mm², it's still almost twice the size of an RV770.
- 1792MB of RAM
...

I wonder if nVidia is going to make any money on these things? The manufacturing costs are just really high. ATI's margins are high at the moment, so they can slash their prices anytime.
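That die-area comparison can be sanity-checked with quick arithmetic. Note the RV770 area of roughly 256 mm² is an assumption on my part (the post only gives the GT200 figure):

```python
# Sanity check of the die-area claim: a 55nm GT200 (~470 mm², from the
# post) vs an RV770 (~256 mm², assumed commonly cited figure).

GT200B_MM2 = 470.0
RV770_MM2 = 256.0  # assumption, not from the post

ratio = GT200B_MM2 / RV770_MM2
print(f"GT200b is {ratio:.2f}x the area of RV770")  # "almost twice"

# Total GPU silicon per dual-GPU card:
gtx295_silicon = 2 * GT200B_MM2    # 940 mm^2 of GPU dies
hd4870x2_silicon = 2 * RV770_MM2   # 512 mm^2 of GPU dies
```

Larger dies yield fewer (and proportionally more defective) chips per wafer, which is why nearly double the silicon per card translates into the cost concern raised here.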
Posted on Reply
#44
Solaris17
Super Dainty Moderator
MrMilli: - Two PCBs with a 448-bit memory interface (I guess a minimum of 10 layers)
- Four chips (2x GT200 + display chip + PCIe bridge) - with a 55nm GT200 being around 470mm², it's still almost twice the size of an RV770.
- 1792MB of RAM
...

I wonder if nVidia is going to make any money on these things? The manufacturing costs are just really high. ATI's margins are high at the moment, so they can slash their prices anytime.
I think they will. The cost of making it won't be too much; they know it wouldn't be cost-effective otherwise. No, I think everything is already in place. It's basically going to be the same design as the GX2; I can tell you that just from looking at that pic and remembering what mine looks like taken apart. The only thing they are going to do is move a couple of capacitors to fit the bigger core.
Posted on Reply
#45
Bjorn_Of_Iceland
ShadowFold: Won't these have weaker G200 cores? I heard they have lower shader counts. I still can't believe they didn't get all of this on one PCB though.
It has more shader units than a GTX 260 216... so theoretically, it's faster than SLI'd GTX 260 216s.
Posted on Reply
#46
TRIPTEX_CAN
MrMilli: - Two PCBs with a 448-bit memory interface (I guess a minimum of 10 layers)
- Four chips (2x GT200 + display chip + PCIe bridge) - with a 55nm GT200 being around 470mm², it's still almost twice the size of an RV770.
- 1792MB of RAM
...

I wonder if nVidia is going to make any money on these things? The manufacturing costs are just really high. ATI's margins are high at the moment, so they can slash their prices anytime.
If they can deliver enough performance to clearly take the performance crown (15-20% in every game), then selling massive quantities at a smaller margin will work for them. I don't really see this being a cost-effective solution.
Posted on Reply
#47
AsRock
TPU addict
cdawall: Look at his picture, is that a 3rd PCB? And I found the quad SLI connector, they just don't have it as a plug on the PCB yet.

Yeah, that's a 3rd PCB, but it's only about 1 inch long lol.
Posted on Reply
#49
phanbuey
mdm-adph: since everything I've seen shows that CrossFire is more efficient than SLI. :confused:
That was then and this is now... everything CrossFire before the G200 architecture was more efficient... now it's actually the other way around... and that review is using newer drivers and a GTX 260 216... he was referring to the GTX 260 192... those were the ones (with old drivers) that would get stomped by a 4870 1GB and then catch right up and beat the 4870s in CF...
Posted on Reply
#50
TRIPTEX_CAN
phanbuey: That was then and this is now... everything CrossFire before the G200 architecture was more efficient... now it's actually the other way around... and that review is using newer drivers and a GTX 260 216... he was referring to the GTX 260 192... those were the ones (with old drivers) that would get stomped by a 4870 1GB and then catch right up and beat the 4870s in CF...
Also that was then.... lol

If the claims ATI has made in the release notes for their latest drivers hold true, the lead the GTX 260 216 has just got much smaller, if not disappeared. This is all speculation, but they are claiming up to a 57% increase in FC2 with CrossFire systems, among other things.
Posted on Reply