Saturday, December 13th 2008

More GeForce GTX 295 Details Trickle-in

Slated for a CES '09 launch, the GeForce GTX 295 will spearhead NVIDIA's quest for performance supremacy. The dual-GPU card consists of two G200b graphics processors working in an internal multi-GPU mode. VR-Zone has collected a few more details about this card.

To begin with, each of the two GPUs will offer its full complement of 240 stream processors, unlike what earlier reports suggested. The memory subsystem of this card, on the other hand, is peculiar. The card features a total of 1792 MB of memory (896 MB x 2), indicating that the memory configuration of each core resembles that of the GeForce GTX 260 (a 448-bit bus), while the shader domains resemble those of the GTX 280 (240 SPs). The card is powered by an 8-pin and a 6-pin power connector, with total power draw rated at 289 W. The construction resembles that of the GeForce 9800 GX2 in many respects, with a monolithic cooler sandwiched between two PCBs, each holding a GPU system. The card has a single SLI bridge finger, indicating that it supports Quad SLI the same way the GeForce 9800 GX2 did (a maximum of two cards can be used in tandem).
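That deduction follows from the memory capacity alone. A quick sketch (an illustration, assuming the 64 MB GDDR3 devices on 32-bit channels used across the GT200 line):

```python
# Why 896 MB per GPU points to a GTX 260-style memory subsystem: assuming
# 64 MB (512 Mbit) GDDR3 devices on 32-bit channels, as on other GT200 cards.
CHIP_MB, CHANNEL_BITS = 64, 32

def bus_width(total_mb):
    chips = total_mb // CHIP_MB
    return chips, chips * CHANNEL_BITS

for label, mb in [("GTX 260-style", 896), ("GTX 280-style", 1024)]:
    chips, bits = bus_width(mb)
    print(f"{label}: {mb} MB -> {chips} chips, {bits}-bit bus")
# GTX 260-style: 896 MB -> 14 chips, 448-bit bus
# GTX 280-style: 1024 MB -> 16 chips, 512-bit bus
```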
Source: VR-Zone

51 Comments on More GeForce GTX 295 Details Trickle-in

#26
Solaris17
Super Dainty Moderator
yes i'm sure... they are very much similar; the only reason i could tell the diff is that the 4870X2 has AMD silkscreened above the PCI-E teeth
#27
a_ump
well i suppose it will perform close to, or a little worse than, GTX 260s in SLI, maybe better since it'll have 240 SPs per GPU, though i wonder if it'll really matter. it all is going to depend on the performance of the HD 5870 and whether it sells well; i look for it to do as well as or slightly better than the GTX 280, and if it's priced competitively, say $320-399 at launch, i think it'll make good profits for ATI as the HD 48XX has. though i'm guessing this GTX 295 will retail for $699 or more.
#28
douglatins
This is no joke about nvidia wanting the performance crown; they are desperate, so it seems. But i still don't compare this to the 4870X2. They might as well sell two GTX 270s and call it a GTX 295.
#29
CDdude55
Crazy 4 TPU!!!
I'm going to Tri-SLI three of these 295's.:)(If it was possible:()

They're going to be like $300 tops. lol
#30
a_ump
newtekie1NVidia didn't need to improve their SLi bridge connection used in the 9800GX2, it worked perfectly fine. AMD had to improve their bridge chip because it didn't support PCI-E 2.0. There really isn't anything superior about the way ATi does the internal crossfire connector vs. nVidia's SLi bridge cable.
you say that, but i wonder: is it the xfire chip or the drivers that make the HD 4870X2 perform better than crossfired HD 4870s? the 9800GX2 wasn't better than 8800GTSs in SLI in benchmarks, so their connection interfaces may be equal, but it seems ATI puts more work into their drivers, though nvidia may be different this time around with their drivers for their dual-PCB/chip design.
#31
douglatins
CDdude55I'm going to Tri-SLI three of these 295's.:)(If it was possible:()

They're going to be like $300 tops. lol
Yes, more like $600
#32
newtekie1
Semi-Retired Folder
a_umpwell i suppose it will perform close to, or a little worse than, GTX 260s in SLI, maybe better since it'll have 240 SPs per GPU, though i wonder if it'll really matter. it all is going to depend on the performance of the HD 5870 and whether it sells well; i look for it to do as well as or slightly better than the GTX 280, and if it's priced competitively, say $320-399 at launch, i think it'll make good profits for ATI as the HD 48XX has. though i'm guessing this GTX 295 will retail for $699 or more.
Have we even seen any details on the HD5870? I've only seen a few forum posts of speculation, but nothing even close to being final or even official. Will we even see the HD5870 in the next year? I doubt it, and if we do, it won't be until near the end of the year. So I really don't think we need to be considering the HD5870 in any serious way at this point.
a_umpyou say that, but i wonder: is it the xfire chip or the drivers that make the HD 4870X2 perform better than crossfired HD 4870s? the 9800GX2 wasn't better than 8800GTSs in SLI in benchmarks, so their connection interfaces may be equal, but it seems ATI puts more work into their drivers, though nvidia may be different this time around with their drivers for their dual-PCB/chip design.
Several factors go into the HD4870X2 outperforming the HD4870 in Crossfire, none of which are the chip used in the card to connect the cores. The first reason is that the HD4870X2 was released with 1GB of RAM per core. The regular HD4870s did not have 1GB per core at the time, so all the reviews of the HD4870X2 pitted it against 512MB HD4870s, which obviously perform worse than cards with 1GB, especially at extreme resolutions, which is where dual-GPU setups shine. Later reviews using 1GB HD4870s show that there is very little performance difference between the two. Besides that, a lot of the reviews done with Crossfire HD4870s use Intel boards that only have x8 PCI-E slots, and this also limits the dual-card solution, as each card only gets half the bandwidth while the single HD4870X2 gets the full bandwidth of a PCI-E x16 slot. Tweaktown actually did a test about this: when the HD4870X2 was put in a P45 board, it was about 10% faster than two HD4870s in the same board. However, when both were tested on an X48 board, the setups performed almost identically.
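To put rough numbers on the slot-bandwidth point, a quick sketch (an illustration only, assuming PCI-E 2.0 lane rates, which both the P45 and X48 boards in that test use):

```python
# Rough PCI-E bandwidth per card: PCI-E 2.0 is ~500 MB/s per lane per
# direction after 8b/10b encoding overhead.
GBPS_PER_LANE = 0.5

for setup, lanes in [("two HD4870s on a P45 (x8/x8 split)", 8),
                     ("one HD4870X2 in a full x16 slot", 16)]:
    print(f"{setup}: ~{lanes * GBPS_PER_LANE:.0f} GB/s per card")
# x8  -> ~4 GB/s per card
# x16 -> ~8 GB/s for the single dual-GPU card
```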

Now, as for the 9800GX2 not outperforming the 8800GTSs in SLi, the main reason is that the 9800GX2 is actually clocked lower than the 8800GTSs. It lacks 50MHz on the core clock, but more importantly 125MHz on the shaders, and this makes a huge impact on performance. When the two are clocked equally, they tend to perform equally.
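For scale, a small sketch of that clock gap, assuming what are believed to be the reference clocks (600/1500 MHz for the 9800GX2 against 650/1625 MHz for the 8800GTS 512):

```python
# Clock deficit of the 9800GX2 versus the 8800GTS 512, per GPU
# (reference clocks as assumed above; treat as approximate).
gx2 = {"core": 600, "shader": 1500}   # MHz
gts = {"core": 650, "shader": 1625}   # MHz

for domain in ("core", "shader"):
    deficit = gts[domain] - gx2[domain]
    print(f"{domain}: -{deficit} MHz ({gx2[domain] / gts[domain]:.0%} of the GTS 512 clock)")
# core:   -50 MHz  (92% of the GTS 512 clock)
# shader: -125 MHz (92% of the GTS 512 clock)
```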
#33
a_ump
the fact that they tested on the P45 with x8 per card never crossed my mind. i luv TPU :laugh: i get corrected and learn so much :laugh:
#34
Mark_Hardware
I'm still waiting for nVidia to utilize GDDR5. ati can't keep that to themselves forever. I was originally gonna go with a 4870 X2, but I got bashed on pretty hard, so I'm reconsidering. I don't wanna use dual slot cards, I'm gonna be using most of my slots...
#35
a_ump
best single slot card is HD 4850 :p
#36
Binge
Overclocking Surrealism
Or a water cooled GTX280 :rolleyes:
#37
tkpenalty
BingeOr a water cooled GTX280 :rolleyes:
Water cooled 4870X2 :laugh:

Anyway, the GTX295 is basically a 9800GX2 of the current generation: take a hot but fast GPU and cut it down, follow with a die shrink for less power usage and lower clock speeds, and fabricate two PCBs and an internal SLI bridge.

Anyway 300W power draw is a bit high... how much does the 4870X2 draw again?
#38
Haytch
I buy the cards to stay in the game, CDdude55. It's an expensive hobby, but one i enjoy. I don't complain about the money, or the advance in technology, just the waste of slots when i need the space for something simple like a sound card.

newtekie1, i don't think they even make single-slot gfx cards with remotely enough power for the enthusiast. That being said, i think we all have to get the dual-slot cards. I don't think anyone expected the series 8 to be as superior at the time as it was. I can't explain why i assumed the 7950 would have been better than it was, but the 8800GTX did indeed shit all over it.

If graphics card technology doesn't go back to single slot with equal or better performance, then we will continue to lose functionality. AMD/ATi and Nvidia are constantly shrinking their technology, but the cards are getting bigger.
#39
Hayder_Master
mmmm, we expect that from nvidia. as you see, my friend solaris, this is another 9800GX2 but at more than double the price, i'm sure
#40
theJesus
So who wants to get me one? :laugh:
#41
btarunr
Editor & Senior Moderator
theJesusSo who wants to get me one? :laugh:
Santa. Be a good boy. Expensive :)
#42
newtekie1
Semi-Retired Folder
HaytchI buy the cards to stay in the game, CDdude55. It's an expensive hobby, but one i enjoy. I don't complain about the money, or the advance in technology, just the waste of slots when i need the space for something simple like a sound card.

newtekie1, i don't think they even make single-slot gfx cards with remotely enough power for the enthusiast. That being said, i think we all have to get the dual-slot cards. I don't think anyone expected the series 8 to be as superior at the time as it was. I can't explain why i assumed the 7950 would have been better than it was, but the 8800GTX did indeed shit all over it.

If graphics card technology doesn't go back to single slot with equal or better performance, then we will continue to lose functionality. AMD/ATi and Nvidia are constantly shrinking their technology, but the cards are getting bigger.
The only single-slot card in the current generation that would probably fit the bill is the HD4850. Though most enthusiasts have come to actually want two-slot cards, I know I have. As the GPUs get more powerful, they just keep putting out more heat, and most do not want that heat trapped in their case. Just as an example, when I put dual-slot coolers on my 7900GTs, not only did the GPU temperatures drop 5°C, but my CPU temperatures dropped also.
#43
wolf
Better Than Native
first off, realllly interesting thread to read.

second, im really looking forward to this card, given how much a single GTX260 core 216 rocks.

honestly i'd say if a single GTX260 was released with the full 240 SPs, it really wouldn't need much overclocking at all to reach GTX280 speeds.

i don't think they will need to clock it slower; if they do, it's only from a heat perspective. given the GTX285 is clocked faster to the value of 10% more performance, all whilst chewing 22.5% less power, a pair of 55nm GTX260s should do well for themselves.

let's just weigh up how beasty this card will be (assuming the SAME clocks as a stock 260):

56 ROPs - 32 gigapixel fillrate
480 SPs - 80 gigatexel fillrate
1792MB of memory on an 896-bit bus (naturally halved per GPU)
theoretical 223.8 GB/s memory bandwidth

wowza. me wantie. right meow.

all in all, the 55nm iterations of GT200 + the RV770 revamp should kick some tail until the new cards hit hard late next year, i.e. the GT300 and RV870.
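The pixel fillrate and bandwidth figures above do check out; a quick sanity check in rough numbers (a sketch assuming stock GTX 260 clocks of 576 MHz core and 999 MHz / 1998 MT/s GDDR3, with 28 ROPs and a 448-bit bus per GPU):

```python
# Sanity check of the quoted pixel fillrate and memory bandwidth, assuming
# stock GTX 260 clocks on both GPUs of the GTX 295.
CORE_MHZ, MEM_MT_S = 576, 1998
ROPS_PER_GPU, BUS_BITS_PER_GPU, GPUS = 28, 448, 2

pixel_fill = GPUS * ROPS_PER_GPU * CORE_MHZ / 1000            # gigapixels/s
bandwidth = GPUS * (BUS_BITS_PER_GPU / 8) * MEM_MT_S / 1000   # GB/s

print(f"pixel fillrate:   {pixel_fill:.1f} Gpixel/s")   # ~32.3
print(f"memory bandwidth: {bandwidth:.1f} GB/s")         # ~223.8
```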
#44
Zubasa
newtekie1That makes little to no sense.

1.) nVidia has been doing the dual GPU cards for longer than ATi has. nVidia started it with the 7900GX2 which came out more than a year before ATi's first dual GPU card. And even then, it wasn't actually an ATi card, it was just a Sapphire exclusive designed by Sapphire.

2.) nVidia has taken their original idea, and continued to refine it. The dual PCB design of the 7900GX2 has evolved into GTX295. ATi has done the same.

3.) Yes, the GTX295 is similar in design to the 9800GX2, but how different is the HD4870x2 from the HD3870x2? Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.

4.) ATi has been the one that has needed to create dual GPU cards to take the performance crown. For the past 2 generations, this has been the only way ATi has been able to outperform nVidia's single GPU cards.
For #3) the top card is the HD3870X2 and the bottom is the HD4870X2 :p

For #4, honestly, what's the big deal about the card being dual GPU or not?
As long as it performs well and is priced properly, I don't give a damn :slap:

Not saying the guy you are quoting is right, but the 79X0 GX2 had the worst drivers ever; who cares that it was nVidia that made the first one if it was crap?
(Well, 3DFX did the first multi-GPU crap, but nVidia owns them anyways.)
#45
tkpenalty
newtekie1That makes little to no sense.

1.) nVidia has been doing the dual GPU cards for longer than ATi has. nVidia started it with the 7900GX2 which came out more than a year before ATi's first dual GPU card. And even then, it wasn't actually an ATi card, it was just a Sapphire exclusive designed by Sapphire.

2.) nVidia has taken their original idea, and continued to refine it. The dual PCB design of the 7900GX2 has evolved into GTX295. ATi has done the same.

3.) Yes, the GTX295 is similar in design to the 9800GX2, but how different is the HD4870x2 from the HD3870x2? Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.

4.) ATi has been the one that has needed to create dual GPU cards to take the performance crown. For the past 2 generations, this has been the only way ATi has been able to outperform nVidia's single GPU cards.
1. Radeon Maxx?

2. The dual PCB design of the 7900GX2 is really nothing new; other companies have already used that idea so many times it's not funny.

3. The GTX295's design, from what I can see, is almost identical to the 9800GX2's, probably with beefed-up phases. The 4870X2 and 3870X2 both share a similar PCB too, but with more changes: to the memory bus (please note that GDDR5 and GDDR3 have different layouts), completely beefed-up phases, etc. The bottom is the 4870X2 for sure >_>. Again, is there any problem with recycling designs? "Oh, let's make a whole new PCB design to be original so that consumers complain less" is the logic that you'd operate on. Basically, doing that would jack up the retail price as extra redundant R&D would be required. There's no need, in short.

4. Does it matter? Why do people bitch about how they attain the result? It's not like it's immoral or anything. AMD could very easily just fabricate two RV770 cores in one package, but they won't, for several good reasons.
#46
newtekie1
Semi-Retired Folder
ZubasaFor #3) the top card is the HD3870X2 and the bottom is the HD4870X2 :p

For #4, honestly, what's the big deal about the card being dual GPU or not?
As long as it performs well and is priced properly, I don't give a damn :slap:

Not saying the guy you are quoting is right, but the 79X0 GX2 had the worst drivers ever; who cares that it was nVidia that made the first one if it was crap?
(Well, 3DFX did the first multi-GPU crap, but nVidia owns them anyways.)
It isn't a big deal where the power comes from, as long as it is there. I was just responding to his point that nVidia has needed to create these dual GPU cards to top ATi. While that is true, it isn't in the context that he put it in, as ATi has been the one needing dual GPU cards to top nVidia's single GPU cards; nVidia just responds with a dual-GPU card of their own. That's the cycle when nVidia has the stronger GPU core. When the roles are reversed, and ATi has the stronger GPU core again, nVidia will probably be the first to pump out dual-GPU cards to top ATi's single GPU cards.

Personally, I would prefer single GPU solutions, simply because of all the problems SLi and Crossfire solutions add to the mix. You have games not supporting the technology, with users having to wait for patches from both the graphics card manufacturers and game developers. You have situations like GTA:IV, where SLi isn't supported, so everyone that bought 9800GX2's is stuck with the performance of a single GPU. Crysis, for the longest time, didn't support Crossfire properly, so users of the HD3870x2 and HD4870x2 were stuck with the performance of a single GPU.
tkpenalty1. Radeon Maxx?

2. The dual PCB design of the 7900GX2 is really nothing new; other companies have already used that idea so many times it's not funny.

3. The GTX295's design, from what I can see, is almost identical to the 9800GX2's, probably with beefed-up phases. The 4870X2 and 3870X2 both share a similar PCB too, but with more changes: to the memory bus (please note that GDDR5 and GDDR3 have different layouts), completely beefed-up phases, etc. The bottom is the 4870X2 for sure >_>. Again, is there any problem with recycling designs? "Oh, let's make a whole new PCB design to be original so that consumers complain less" is the logic that you'd operate on. Basically, doing that would jack up the retail price as extra redundant R&D would be required. There's no need, in short.

4. Does it matter? Why do people bitch about how they attain the result? It's not like it's immoral or anything. AMD could very easily just fabricate two RV770 cores in one package, but they won't, for several good reasons.
1.) Yes, we've gone over that. Read the thread. I mean modern implementations using Crossfire and SLi. If you go back far enough, you will find plenty of dual GPU implementations.

2.) Yes, and the single PCB design of ATi's dual GPU cards is nothing new either; it has probably been used just as much. Your point?

3.) I have no problem with recycling designs. I say pick a design and continue to refine it. But what I want to know is how you have jumped to the conclusion that the GTX295 is almost identical to the 9800GX2 from a few off-angle pictures and no real picture of the PCBs. How can you make the claim that there are more changes from the HD3870x2 to the HD4870x2 without any good information on the GTX295's PCBs? The G200 is a completely different beast from the G92; there are likely huge changes to the PCB design. Funny how you see a card from ATi with essentially the same layout/form factor as the previous generation and say there are huge changes, but on the nVidia side, you see the same thing and say there are no changes at all.

4.) See above.
#47
a_ump
well like you said, we have no proof that there have been improvements, so we can only speculate and assume based off nvidia's previous actions, and they aren't big on starting something new; ATI has been the one to take steps forward in the past 3 years. We're just speculating, though when the HD 4870X2 came about there were said improvements, such as the xfire chip that was mentioned. It wasn't just an upgrade to PCI-E 2.0; there was a lot less micro-stutter between the 2 chips than before, and they also have a sideport that allows direct communication between the chips that could eliminate micro-stutter or lessen it greatly. It's not activated yet, but if it does get activated it could improve performance. Will nvidia also improve their micro-stuttering? IDK. Maybe nvidia has done one hell of a job and refined and improved the PCB design greatly, which is a possibility since it's a much larger chip with a different bus size; or it could be damn near the same as the 9800GX2's. Time will tell. Will it matter? idk
#48
dalekdukesboy
newconroerAs much as I'd like to see Nvidia offer up something worthwhile with this... there are two things that bother me.

1. Video RAM/RAMDAC is still shared. Yes, having a total texture pool of near 2GB is helpful, but more so in theory, not in practice. If it was independent, thus being a true 2GB, that would be another story. I'm wondering when dual-GPU cards are going to break that trend.

2. The most recent dual-GPU solution, the 4870 X2 (yes, the 4850 X2 is more recent, sue me...), is nothing to shake a stick at, but I've said it before and will always continue to say it: for the amount of horsepower under its hood, I feel it almost falls completely on its face. It should perform twice as well as it does; but like a vehicle motor, slapping on a supercharger can only take you so far, while the rest of the factory parts drag you down or limit your potential and real-world performance.

I don't think this Nvidia product is going to break either of those trends. It might be fast, in fact I'm fairly certain it will be, but if it doesn't perform at least one and a half times as well as a normal 280, then... bleh.
I never thought I'd actually hear someone slam the 4870X2... since its inception and till current times, all I hear is how wonderful it is and how heavenly it is to own one. Truthfully, for almost 600 dollars it's no less a disappointment than when the GTX280 was that much and you could get a 4870 or 4850 for less than half that, with 80%-plus of the performance in most games. To me, this seems much the same scenario. I love the idea of dual GPUs on one card, and yes, this (GTX295) will absolutely destroy anything out there currently (possibly in this next GPU round as well), but depending on pricing this will be just as uneconomical as the 4870X2... which, particularly at high resolutions, is actually BEATEN by the 4850X2, which is 200 bucks cheaper and uses much less electricity! How ridiculous is that? Anyhow, this card will be much the same, I believe, and obviously it's the single-card champ in all likelihood, but it'll be just as bad a value as the 4870X2, particularly once it reaches end of life... THEN it might be cool to pick one up if the price bombs, which, amazingly, of all the products of both companies, they ALL have dropped, even multiple times, except for the 4870X2, so even that might be a wish and a prayer for quite a while!
#49
newtekie1
Semi-Retired Folder
a_umpwell like you said, we have no proof that there have been improvements, so we can only speculate and assume based off nvidia's previous actions, and they aren't big on starting something new; ATI has been the one to take steps forward in the past 3 years
What previous actions do we really have to go on? The 7900GX2, 7950GX2, and 9800GX2. Not enough to really make a trend, IMO. However, if we do look at it. The 7950GX2 was a huge improvement over the 7900GX2 in terms of PCB design, in fact the whole purpose of the 7950GX2 was to improve the PCB design. Then comes the 9800GX2, which is extremely different from the 7950GX2, obviously.

What steps forward have ATi made in the past 3 years that have been something new?
#50
a_ump
read my entire last post and think about it, newtekie. ATI have been the ones, from what i've read, to get to the smaller fab processes first, with an improved GPU-to-GPU interface plus the possible sideport, sound through HDMI up to 7.1 on the HD 48XX cards, very good performance out of a chip half the size of the GT200 (HD 4870 vs GTX 260), and going to GDDR5 first. those are just a few technologies and steps that ATI has taken first, and some of them NVIDIA still hasn't taken yet.