Friday, November 19th 2010

NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership

NVIDIA stunned the computing world with a speedy launch of the GeForce GTX 580. The GPU extended NVIDIA's single-GPU performance leadership, and also ironed out some serious issues with the power draw and thermal characteristics of the previous-generation GeForce GTX 480. A dual-GPU implementation of the GF110 graphics processor, on which the GTX 580 is based, now looks inevitable. NVIDIA seems to be ready with a prototype of such a dual-GPU accelerator, which the Chinese media is referring to as the "GTX 595".

The reference design PCB of the dual-GF110 accelerator (which still needs some components fitted) reveals quite a lot about the card taking shape. First, it is a single-PCB card: both GPU systems are located on the same board. Second, there are slots for three DVI output connectors, indicating that the card will be 3D Vision Surround-ready on its own. You just have to get one of these, plug in three displays over standard DVI, and you have a large display head spanning three physical screens.
Third, it could feature a total of 3 GB of video memory (1.5 GB per GPU system). Each GPU system has six memory chips on the obverse side of the PCB. At this point we can't comment on the memory bus width of each GPU. The core configuration of the GPUs is also unknown. Fourth, power is drawn from two 8-pin PCI-E power connectors. The card is 2-way SLI capable with another of its kind.
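Those chip counts invite a back-of-the-envelope check. A minimal sketch, assuming (unconfirmed) 1 Gbit GDDR5 chips with 32-bit interfaces and a second, mirrored set of six chips on the reverse of the PCB:

```python
# Hedged estimate of per-GPU memory from the visible chip count.
# Assumptions (not confirmed by the PCB shots): 1 Gbit GDDR5 chips
# with 32-bit interfaces, and the six chips per GPU on the front
# mirrored by six more on the back of the PCB.
CHIP_DENSITY_GBIT = 1    # Gbit per chip (assumed)
CHIP_BUS_BITS = 32       # bits per chip (standard for GDDR5)
chips_per_gpu = 6 * 2    # six visible per GPU, assumed mirrored

capacity_gb = chips_per_gpu * CHIP_DENSITY_GBIT / 8  # Gbit -> GB
bus_width = chips_per_gpu * CHIP_BUS_BITS
print(f"Per GPU: {capacity_gb:.1f} GB on a {bus_width}-bit bus")
# -> Per GPU: 1.5 GB on a 384-bit bus (consistent with 3 GB total)
```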
Source: enet.com.cn

153 Comments on NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership

#27
Bjorn_Of_Iceland
Just in time for Crysis 2 :D

Board layout pretty much looks like the GTX 295 single-PCB. Would probably run OK on a 750 W+ PSU with at least 48 A on the +12 V. I'm seeing a 600 USD tag there..
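For anyone checking the math, a quick illustrative sketch of what that +12 V rating buys; the 375 W figure is just the in-spec draw of a card with two 8-pin connectors:

```python
# Quick check of the suggested +12 V headroom (illustrative only).
rail_amps = 48              # suggested minimum +12 V amperage
watts_12v = rail_amps * 12  # power available on the rail
print(f"{rail_amps} A on +12 V = {watts_12v} W")
# -> 576 W, comfortably above the ~375 W a two-8-pin card can draw in spec
```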
#28
_JP_
That's one hell of a mess on that board! ASUS did tidy up the components better. Seems like it will be one hell of a performer. Can't wait! :D
And that SLI finger, THAT SLI finger...
Bjorn_Of_Iceland: I'm seeing a 600 USD tag there..
That's waaaaaaaaaaaaaaaaaaaaay too low... around 750 USD would be more adequate...
#29
MikeX
600 watt card...
10 years later, a 2000 watt card?
#30
(FIH) The Don
MikeX: 600 watt card...
10 years later, a 2000 watt card?
so? go plant a tree if it bothers you :roll::roll:
#31
Tatty_Two
Gone Fishing
qamulek: A 580 is currently power-limited due to spiking above the total available power of 150+75+75 = 300 W max. In TPU's article here, the 580 power-limits itself after spiking up, then settles at a maximum of 200 W (question: why 200 and not something more like 250 or 290? Probably due to the limits of the electronics used on board...). One question I didn't see answered was: did performance increase proportionately to the increase in power used after the power limit was taken away?

In any case, the point is a 580 really needs 8-pin + 8-pin power to keep itself from being power-limited by the cables (let alone the electronics used on the board, as well as possible problems with load balancing), so how is a dual-GPU card going to fare better if a single GPU is already power-limited? I guess I will just have to wait till someone who actually knows what they're talking about gives it a go, or just wait till the card comes out and reviews are posted.

Ah! My guess: the GTX 580 is so powerful that it has to be power-limited when used to its fullest; however, most applications will be limited by the weakest link in the GPU before using the full power of the GTX 580 (example: cut the ROPs in half and suddenly the card is limited by ROP count). The dual-GPU card will still need to be power-limited, but it won't matter, as most normal applications will hit the weakest link in the GPU before reaching the power limit.
You also have to remember that since PCI-E 2.0 there is the capability to get 150 W from the slot; in 2.1 and 3.0 this is even "smart" and can give you what you need rather than the whole 150 W. Therefore, at least theoretically, 450 W could be available here.
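Whichever slot figure turns out to be right, here is a minimal sketch of the in-spec budgets being argued over, treating slot power as a parameter; the connector ratings (75 W per 6-pin, 150 W per 8-pin) are the usual PCI-SIG numbers:

```python
# In-spec board power for the two slot figures being debated here:
# 75 W per the common reading of the PCI-SIG spec, 150 W per the
# claim above. Connector ratings: 6-pin = 75 W, 8-pin = 150 W.
def board_budget(slot_w: int, six_pins: int = 0, eight_pins: int = 0) -> int:
    """Maximum in-spec board power for a given connector loadout."""
    return slot_w + six_pins * 75 + eight_pins * 150

for slot_w in (75, 150):
    gtx580 = board_budget(slot_w, six_pins=1, eight_pins=1)  # GTX 580
    dual = board_budget(slot_w, eight_pins=2)                # this card
    print(f"slot {slot_w} W -> GTX 580: {gtx580} W, dual-GF110: {dual} W")
# slot  75 W -> GTX 580: 300 W, dual-GF110: 375 W
# slot 150 W -> GTX 580: 375 W, dual-GF110: 450 W
```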
#33
TheMailMan78
Big Member
SabreWulf69: Pure PWNAGE, w00t, I'm all up for one if they are indeed 2x 580s :-D
It should render all your kiddie porn flawlessly.
#34
alwayssts
Tatty_One: You also have to remember that since PCI-E 2.0 there is the capability to get 150 W from the slot; in 2.1 and 3.0 this is even "smart" and can give you what you need rather than the whole 150 W. Therefore, at least theoretically, 450 W could be available here.
75 W from the slot, and IIRC the power management came about in 2.0?

At any rate, the 'spec' is anything but one: as we can see, GF100/GF110 can clearly draw more juice under load than 300 W at stock, and AMD/NVIDIA implement their TDPs differently.

One has to question this product though, power-consumption Nazi or not. It's outside the PCI-E spec, and (if an NVIDIA-sanctioned card) can almost certainly be seen as a concession that the same configuration using GF114 will not beat the 6990... which is perhaps a given.

For it even to be feasible, it needs to be faster than the same configuration using GF114 GPUs instead, even at a greater clock speed. Since we know GF104 GPUs clock into the 800+ MHz range, and would assume the GTX 560 will be clocked in the 750-800 MHz range, this would need to be clocked at least around ~600 MHz. I wouldn't think there is a lot of wiggle room between that and what they could get away with under a <375 W spec, even using tricks like lower-clocked/lower-voltage (1.35 V, 3.6/4.0 Gbps) 2 Gb (denser, not GB) GDDR5.
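For what it's worth, a rough sketch of the break-even argument above; the 384-core GF114 (a full GF104 die) and all clocks are assumptions from the post, and the model ignores memory bandwidth, ROPs, and SLI scaling:

```python
# Rough shader-throughput comparison behind the break-even argument:
# throughput ~ CUDA cores x core clock, ignoring memory bandwidth,
# ROPs, and SLI scaling. The 384-core GF114 figure (a full GF104 die)
# and all clocks are assumptions taken from the post above.
def rel_throughput(cores: int, mhz: int) -> float:
    return cores * mhz / 1e6  # arbitrary units

dual_gf110 = 2 * rel_throughput(512, 600)  # hypothetical dual GF110
dual_gf114 = 2 * rel_throughput(384, 775)  # hypothetical dual GF114
print(f"dual GF110 @ 600 MHz: {dual_gf110:.2f}")  # ~0.61
print(f"dual GF114 @ 775 MHz: {dual_gf114:.2f}")  # ~0.60
# Roughly a wash, which is why ~600 MHz marks the break-even clock.
```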
#35
Hayder_Master
ATI 6990, LOL. ATI, from this moment, show your white flag.
#36
Tatty_Two
Gone Fishing
alwayssts: 75 W from the slot, and IIRC the power management came about in 2.0?

At any rate, the 'spec' is anything but one: as we can see, GF100/GF110 can clearly draw more juice under load than 300 W at stock, and AMD/NVIDIA implement their TDPs differently.

One has to question this product though, power-consumption Nazi or not. It's outside the PCI-E spec, and (if an NVIDIA-sanctioned card) can almost certainly be seen as a concession that the same configuration using GF114 will not beat the 6990... which is perhaps a given.

For it even to be feasible, it needs to be faster than the same configuration using GF114 GPUs instead, even at a greater clock speed. Since we know GF104 GPUs clock into the 800+ MHz range, and would assume the GTX 560 will be clocked in the 750-800 MHz range, this would need to be clocked at least around ~600 MHz. I wouldn't think there is a lot of wiggle room between that and what they could get away with under a <375 W spec, even using tricks like lower-clocked/lower-voltage (1.35 V, 3.6/4.0 Gbps) 2 Gb (denser, not GB) GDDR5.
Not quite sure what you are saying in the first part of your post. To make it clear: the PCI-E 2.0 and 2.1 specification (and 3.0) allows 150 W from the slot. The thing I am not sure of is whether all motherboard manufacturers actually conform to the specification, i.e. whether their boards really deliver more than 75 W. Potentially, with a PCI-E 2.0-onwards board, 450 W could be drawn with two 8-pin connections; whether in reality that is the case I really don't know. My point being simply: don't assume that a dual-GPU card would be starved of powerrrrzzzz. I doubt very much that NVIDIA would bring out a dual-GPU card if they didn't think it would be competitive.
#37
overclocking101
hells to the muthafuckin yeah! finally nvidia! hello $800 gpu!
#38
bear jesus
It would be so awesome if NVIDIA got a dual-chip card into the 5xx range; it was a little disappointing that there was no top-end dual-chip card in the 4xx range to go up against the 5970.

*thinks about the crazy frame rates this card in SLI could get* I wonder how well a pair of these would do in 3DMark 11.
#40
N3M3515
It's not going to be dual 580.
Maybe dual 570, or dual 560 (GTX 460 @ 750 MHz), sounds more realistic.
#41
alexsubri
hayder.master: ATI 6990, LOL. ATI, from this moment, show your white flag.

The 6990 didn't even come out yet, so it's too premature to speculate
#42
bear jesus
N3M3515: It's not going to be dual 580.
Maybe dual 570, or dual 560 (GTX 460 @ 750 MHz), sounds more realistic.
I would expect them to try something like what AMD/ATI did with the 5970: maybe use the 580 core with 570 clocks, if they could get the power usage low enough by doing so.
alexsubri: [image: www.seoboy.com/wp-content/uploads/2009/12/implied-facepalm.jpg]

The 6990 didn't even come out yet, so it's too premature to speculate
I agree; even the 6970 is not out, so the 6990's speed can't even be guessed from 6970 CrossFire numbers yet.
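A hedged sketch of why the "580 core at 570 clocks" idea could work, using the rule of thumb that dynamic power scales roughly with frequency times voltage squared; the clock ratio and undervolt below are assumptions for illustration:

```python
# Rule-of-thumb dynamic power scaling: P ~ frequency x voltage^2.
# The GTX 580's 244 W TDP is NVIDIA's official figure; the GTX 570-class
# clock and the undervolt are assumptions for illustration only.
def scaled_power(base_w: float, f_ratio: float, v_ratio: float) -> float:
    return base_w * f_ratio * v_ratio ** 2

per_gpu = scaled_power(244, f_ratio=732 / 772, v_ratio=0.95)
print(f"per GPU ~{per_gpu:.0f} W, dual ~{2 * per_gpu:.0f} W")
# -> per GPU ~209 W, dual ~418 W: still over a 375 W two-8-pin budget,
#    so deeper cuts to clocks, voltage, or shaders would be needed.
```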
#43
KainXS
N3M3515: It's not going to be dual 580.
Maybe dual 570, or dual 560 (GTX 460 @ 750 MHz), sounds more realistic.
That's a good catch.

It actually does have the 1038A1 marking, like on the sample W1zzard used in his review, but the other marks are different, and the GF110 identifier at the bottom isn't there, so this might not even be a full 512-SP part (it could have disabled shaders) while still being based on the GF110.
#44
wolf
Better Than Native
Batou1986: Good news, I've almost finished converting my computer to nuclear power :shadedshu
Har har har, you so witty.

This will draw little more than a GTX 295 while whoppin' it twice over.

Good on them I say, get it out fast like the GTX 580.
#45
alexsubri
wolf: Har har har, you so witty.

This will draw little more than a GTX 295 while whoppin' it twice over.

Good on them I say, get it out fast like the GTX 580.
I wouldn't be surprised if this goes neck and neck with the 5990
#46
Over_Lord
News Editor
Wait, so I've finally got a room heater replacement???? I thought Fermi was a bit too cool; 350 W, LOL, I was laughing at that. I guess 600 W or 700 W would do well in the winters.
#47
Lionheart
(FIH) The Don: so? go plant a tree if it bothers you :roll::roll:
It bothers me :wtf: "plants 5,000 trees" aahh, I feel better. Now I can play Crysis 3 with my 2000 W GTX 795 :toast::laugh:
#50
1nf3rn0x
Wow, that seems great. This will be a war of the worlds: 5990 vs. GTX 595.