Saturday, December 13th 2008
More GeForce GTX 295 Details Trickle-in
Slated for CES '09, the GeForce GTX 295 would spearhead NVIDIA's quest for performance supremacy. The dual-GPU card consists of two G200b graphics processors working in an internal multi-GPU mode. VR-Zone collected a few more details about this card.
To begin with, the two GPUs will each offer all 240 of their stream processors, unlike what earlier reports suggested. On the other hand, the memory subsystem of this card is peculiar. The card features a total of 1792 MB of memory (896 MB x 2), indicating that the memory configuration of each core resembles that of the GeForce GTX 260, while the shader domains resemble those of the GTX 280 (240 SPs). The entire card is powered by an 8-pin and a 6-pin power connector. The construction resembles that of the GeForce 9800 GX2 in many aspects, with a monolithic cooler sandwiched between two PCBs, each holding a GPU system. The total power draw of the card is rated at 289 W. The card has a single SLI bridge finger, indicating that it supports Quad-SLI in the same way the GeForce 9800 GX2 did (a maximum of two cards can be used in tandem).
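For a rough sense of where the 896 MB per GPU figure comes from, here is a back-of-the-envelope sketch. It assumes a GTX 260-style memory subsystem of fourteen 64 MB GDDR3 chips on 32-bit channels per GPU, which is an inference from the reported capacity rather than a confirmed spec of the GTX 295:

```python
# Back-of-the-envelope: why 896 MB per GPU points to a GTX 260-style memory subsystem.
# Assumption: fourteen 64 MB GDDR3 chips on 32-bit channels per GPU, as on the GTX 260.
chips_per_gpu = 14            # 448-bit bus / 32-bit per chip
chip_size_mb = 64             # typical GDDR3 density of the era
gpus = 2

per_gpu_mb = chips_per_gpu * chip_size_mb        # 896 MB
total_mb = per_gpu_mb * gpus                     # 1792 MB
bus_width_bits = chips_per_gpu * 32              # 448-bit per GPU

print(per_gpu_mb, total_mb, bus_width_bits)      # 896 1792 448
```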
Source:
VR-Zone
51 Comments on More GeForce GTX 295 Details Trickle-in
They're going to be like $300 tops. lol
Now, as for the 9800GX2 not outperforming the 8800GTS's in SLi, the main reason is that the 9800GX2 is actually clocked lower than the 8800GTS's. It lacks 50MHz on the core clock, but more importantly 125MHz on the shaders, and this makes a huge impact on performance. When the two are clocked equally, they tend to perform equally.
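For context, a minimal sketch of that clock-for-clock gap. The reference clocks below (600/1500 MHz per 9800GX2 GPU versus 650/1625 MHz for the 8800GTS 512) are assumptions consistent with the 50 MHz / 125 MHz deficits mentioned above, and the 3 FLOPs per SP per clock figure is just the usual way peak throughput was quoted for G92:

```python
# Sketch: theoretical peak shader throughput at the clocks behind the 50 MHz / 125 MHz deficits.
# Assumed reference clocks: 9800GX2 at 600 MHz core / 1500 MHz shader per GPU,
# 8800GTS 512 at 650 MHz core / 1625 MHz shader; both are 128-SP G92 parts.
# 3 FLOPs per SP per clock (MADD + MUL) is the usual basis for these peak numbers.
def peak_gflops(sps, shader_mhz, flops_per_clock=3):
    return sps * shader_mhz * flops_per_clock / 1000.0

gx2_per_gpu = peak_gflops(128, 1500)   # ~576 GFLOPS per GPU
gts_512     = peak_gflops(128, 1625)   # ~624 GFLOPS

print(f"8800GTS 512 per-GPU peak advantage: {gts_512 / gx2_per_gpu - 1:.1%}")  # ~8.3%
```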
Anyway, the GTX295 is basically a 9800GX2 of the current generation; take a hot but fast GPU and cut it down, follow with a die shrink and lower power usage, as well as lower clock speeds, and fabricate two PCBs and an internal SLI bridge.
Anyway, 300W power draw is a bit high... how much does the 4870X2 draw again?
newteckie1, I don't think they even make single-slot gfx cards with remotely enough power for the enthusiast. That being said, I think we all have to get dual-slot cards. I don't think anyone expected the 8 series to be as superior at the time as it was. I can't explain why I assumed the 7950 would have been better than it was, but the 8800GTX did indeed shit all over it.
If graphics card technology doesn't go back to single slot with equal or better performance, then we will continue to lose functionality. AMD/ATi and Nvidia are constantly shrinking their technology, but the cards keep getting bigger.
Second, I'm really looking forward to this card, given how much a single GTX260 Core 216 rocks.
Honestly, I'd say if a single GTX260 were released with the full 240 SPs, it really wouldn't need much overclocking at all to reach GTX280 speeds.
I don't think they will need to clock it slower, and if they do it's only from a heat perspective. Given the GTX285 is clocked faster to the tune of 10% more performance, all whilst chewing 22.5% less power, a pair of 55nm GTX260s should do well for themselves.
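Taking the 10% performance and 22.5% power figures above at face value (they are the poster's numbers, not confirmed specs), the implied performance-per-watt gain works out roughly like this:

```python
# Implied perf-per-watt gain from the figures above: 10% faster at 22.5% less power.
perf_ratio = 1.10        # GTX285 vs GTX280 performance, per the post
power_ratio = 1 - 0.225  # 22.5% lower power draw
gain = perf_ratio / power_ratio - 1
print(f"~{gain:.0%} better performance per watt")  # ~42%
```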
Let's just weigh up how beastly this card will be (assuming the SAME clocks as a stock 260):
56 ROPs - 32 gigapixel fill rate
480 SPs - 80 gigatexel fill rate
1792 MB of memory on an 896-bit bus (naturally halved per GPU)
theoretical 223.8 GB/s memory bandwidth
wowza. me wantie. right meow.
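As a quick sanity check of the figures above, here is the arithmetic at stock GTX 260 clocks (576 MHz core, 999 MHz GDDR3, i.e. 1998 MHz effective). Both the clocks and the 28-ROP / 448-bit per-GPU layout are carried over from the GTX 260 as assumptions, not confirmed GTX 295 specs:

```python
# Sanity check of the pixel fill rate and memory bandwidth listed above,
# assuming stock GTX 260 clocks and layout per GPU: 576 MHz core, 999 MHz
# (1998 MHz effective) GDDR3, 28 ROPs, 448-bit bus.
core_mhz = 576
mem_effective_mhz = 1998
rops_per_gpu = 28
bus_bits_per_gpu = 448
gpus = 2

pixel_fill_gpix = rops_per_gpu * gpus * core_mhz / 1000                    # ~32.3 Gpixel/s
bandwidth_gbs = (bus_bits_per_gpu / 8) * mem_effective_mhz * gpus / 1000   # ~223.8 GB/s

print(f"{pixel_fill_gpix:.1f} Gpixel/s, {bandwidth_gbs:.1f} GB/s")
```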
All in all, the 55nm iterations of GT200 plus the RV770 revamp should kick some tail until the new cards hit hard late next year, i.e. the GT300 and RV870.
For #4, honestly, what's the big deal about the card being dual-GPU or not?
As long as it performs well and is priced properly, I don't give a damn :slap:
Not saying the guy you are quoting is right, but the 79X0 GX2 had the worst drivers ever. And who cares that it was nVidia that made the first crap?
(Well, 3DFX did the first multi-GPU crap, but nVidia owns them anyway.)
2. The dual-PCB design of the 7900GX2 is really nothing new; other companies have already used that idea so many times it's not funny.
3. The GTX295's design, from what I can see, is almost identical to the 9800GX2's, probably with beefed-up power phases. The 4870X2 and 3870X2 both share a similar PCB too, but with more changes: to the memory bus (please note that GDDR5 and GDDR3 have different layouts), completely beefed-up phases, etc. The bottom one is the 4870X2 for sure >_>. Again, is there any problem with recycling designs? "Oh, let's make a whole new PCB design to be original so that consumers complain less" is the logic you'd have them operate on. Basically, doing that would jack up the retail price, as extra, redundant R&D would be required. There's no need, in short.
4. Does it matter? Why do people bitch about how they attain the result? It's not like it's immoral or anything. AMD could very easily just fabricate two RV770 cores in one package, but they won't, for several good reasons.
Personally, I would prefer single-GPU solutions, simply because of all the problems SLi and Crossfire solutions add to the mix. You have games not supporting the technology, with users having to wait for patches from both the graphics card manufacturers and the game developers. You have situations like GTA:IV, where SLi isn't supported, so everyone that bought a 9800GX2 is stuck with the performance of a single GPU. Crysis, for the longest time, didn't support Crossfire properly, so users of the HD3870X2 and HD4870X2 were stuck with the performance of a single GPU.
1.) Yes, we've gone over that. Read the thread. I mean modern implementations using Crossfire and SLi. If you go back far enough, you will find plenty of dual-GPU implementations.
2.) Yes, and the single-PCB design of ATi's dual-GPU cards is nothing new either; it has probably been used just as much. Your point?
3.) I have no problem with recycling designs. I say pick a design and continue to refine it. But what I want to know is how you have jumped to the conclusion that the GTX295 is almost identical to the 9800GX2 from a few off-angle pictures and no real picture of the PCBs. How can you claim that there are more changes from the HD3870X2 to the HD4870X2 without any good information on the GTX295's PCBs? The G200 is a completely different beast from the G92; there are likely huge changes to the PCB design. Funny how you see a card from ATi with essentially the same layout/form factor as the previous generation and say there are huge changes, but on the nVidia side you see the same thing and say there are no changes at all.
4.) See above.
What steps forward have ATi made in the past 3 years that have been something new?