Friday, November 19th 2010

NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership

NVIDIA stunned the computing world with the speedy launch of the GeForce GTX 580. The GPU extended NVIDIA's single-GPU performance leadership, and also ironed out some serious issues with the power draw and thermal characteristics of the previous-generation GeForce GTX 480. A dual-GPU implementation of the GF110 graphics processor, on which the GTX 580 is based, now looks inevitable. NVIDIA seems to be ready with a prototype of such a dual-GPU accelerator, which the Chinese media is referring to as the "GTX 595".

The reference-design PCB of the dual-GF110 accelerator (which still needs some components fitted) reveals quite a lot about the card taking shape. First, it's a single-PCB card: both GPU systems are located on the same PCB. Second, there are slots for three DVI output connectors, indicating that the card will be 3D Vision Surround ready on its own. You just have to get one of these, plug in three displays over standard DVI, and you have a large display head spanning three physical displays.
Third, it could feature a total of 3 GB of video memory (1.5 GB per GPU system). Each GPU system has six memory chips on the obverse side of the PCB. At this point we can't comment on the memory bus width of each GPU; the core configuration of the GPUs is also unknown. Fourth, power is drawn from two 8-pin PCI-E power connectors. The card is 2-way SLI capable with another of its kind.
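As a rough sanity check on the 1.5 GB figure, here is the back-of-the-envelope math (a sketch only: the 32-bit per-chip GDDR5 interface is standard, but the chip density and a mirrored set of six chips on the reverse side are assumptions, which is exactly why the bus width remains an open question):

```python
# Back-of-the-envelope memory math for one GPU system on the rumored card.
# Assumptions (not confirmed by the source): 12 chips total per GPU
# (6 visible on the obverse, 6 assumed on the reverse), 1 Gbit (128 MB)
# per chip, and the standard 32-bit GDDR5 interface per chip.

CHIPS_PER_GPU = 12           # assumed: 6 front + 6 back
CHIP_DENSITY_MB = 128        # assumed: 1 Gbit GDDR5 chips
CHIP_BUS_WIDTH_BITS = 32     # standard GDDR5 per-chip interface

capacity_mb = CHIPS_PER_GPU * CHIP_DENSITY_MB         # 1536 MB = 1.5 GB
bus_width_bits = CHIPS_PER_GPU * CHIP_BUS_WIDTH_BITS  # 384-bit, GTX 580-like

print(f"Per GPU: {capacity_mb} MB on a {bus_width_bits}-bit bus")
print(f"Card total: {2 * capacity_mb} MB")            # 3072 MB = 3 GB
```

If instead only the six visible chips are populated per GPU, the same math gives a 192-bit bus with 2 Gbit chips; until the back of the PCB or the chip markings are known, both readings fit the rumored 1.5 GB per GPU.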
Source: enet.com.cn

153 Comments on NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership

#101
mdsx1950
We all have to wait and see what the 6950/6970/6990 and GTX 595 will be before making huge assumptions.
nvidiaintelftw: and there's also a high chance that the 6970 will fail. Remember the 2900 when ATI tried a bit too hard?
nVidia fanboy much? :shadedshu

EDIT - Your username pretty much explains everything lol.
Posted on Reply
#102
Judas
2 of those in SLI would need your own nuclear power reactor in your shed to run them :)
Posted on Reply
#103
Yellow&Nerdy?
Only two 8-pin power connectors? That's pretty surprising. I was expecting to see two 8-pin and a 6-pin. But the power circuitry looks pretty robust, indicating huge power consumption and heat output. TDP is probably going to end up being around 400W, and the stock cooler will be either a 3-slot monster or a waterblock. So it's most likely two GF110s with some shader cores locked and downclocked. Personally I would have liked to see a dual GF104/GF114 card, like two fully enabled GF104s.
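For what it's worth, here's the spec math behind that guess (a quick sketch using the standard PCI-E power-delivery figures; the ~400W TDP itself is pure guesswork):

```python
# PCI-E power delivery ceiling for a card with two 8-pin connectors.
# Figures are the standard PCI-E CEM limits: 75 W from the x16 slot,
# 150 W per 8-pin connector (a 6-pin would add another 75 W).

SLOT_W = 75
EIGHT_PIN_W = 150

max_board_power = SLOT_W + 2 * EIGHT_PIN_W  # 375 W in-spec ceiling
print(f"In-spec ceiling with 2x 8-pin: {max_board_power} W")

# A ~400 W TDP (guesswork) would exceed that ceiling, which is why
# downclocked or partially disabled GF110s seem more plausible than
# two fully enabled GTX 580s.
```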
Posted on Reply
#104
claylomax
Judas: 2 of those in SLI would need your own nuclear power reactor in your shed to run them :)
*sigh*
Posted on Reply
#106
Unregistered
SabreWulf69: Let down or not, the fact is at this very moment the GTX 580 is still the fastest single-GPU card out there, like it or lump it, prove me otherwise. As I'm pretty sure I have also mentioned, SLI has always scaled better than ATI/AMD's Crossfire, too. This is all I care about. I want links of reviews to prove me otherwise, please no more speculation, the title is "NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership".
You guys keep talking about single GPU, dual-GPU... this is nothing. The ONLY important thing is that AMD now has the most powerful card on the market. It can have 1000 GPUs; that's irrelevant. The RELEVANT thing is that there is no single card that can match AMD's card.
You are too young and innocent to remember that back in the day, the best 3D card was the 3dfx Voodoo with 2 GPUs, then the Voodoo 2 with 3 GPUs, then the prototype Voodoo 5 with 4 GPUs, etc. Everybody was talking about the card, not about how many GPUs it had.
Same thing with this card. If they sell this, NVIDIA will probably become the producer of the fastest graphics card on the market. That is all that matters. ;)
#107
SabreWulf69
Those 3dfx cards were the bomb: add-ons with daughter boards, and yeah, the prototypes as you mentioned. If only they were still around... Stats-wise: most powerful overall card - 5970, most powerful dual-GPU card - 5970, most powerful single-GPU card - 580. Here's to hoping both companies pull out in front from time to time; healthy competition is always good and keeps prices on both sides lower.
Posted on Reply
#108
mdsx1950
TAViX: You guys keep talking about single GPU, dual-GPU... this is nothing. The ONLY important thing is that AMD now has the most powerful card on the market. It can have 1000 GPUs; that's irrelevant. The RELEVANT thing is that there is no single card that can match AMD's card.
You are too young and innocent to remember that back in the day, the best 3D card was the 3dfx Voodoo with 2 GPUs, then the Voodoo 2 with 3 GPUs, then the prototype Voodoo 5 with 4 GPUs, etc. Everybody was talking about the card, not about how many GPUs it had.
Same thing with this card. If they sell this, NVIDIA will probably become the producer of the fastest graphics card on the market. That is all that matters. ;)
Well said mate! :respect::rockout:
Posted on Reply
#109
KainXS
damn man we went from
CDdude55: It's usually the other way around in NVIDIA threads; I'm actually surprised we got to over three pages without the consistent trolling that usually manifests itself in these threads.
to this :shadedshu
Posted on Reply
#110
Fourstaff
I shall now propose that we have a special Godwin's law for TPU: Reductio ad Fanboyum and Reductio ad 3dfx.
Posted on Reply
#111
mlee49
I want one! Hell, two!!
SabreWulf69: Need 4 of them in SLI on an SR2 mobo :-D
Too bad it's only 2-way SLI capable. Only one SLI connector on the top of the card.
Posted on Reply
#112
a_ump
mlee49: I want one! Hell, two!!
Too bad it's only 2-way SLI capable. Only one SLI connector on the top of the card.
Well, they only need one... there are 2 GPUs per card, so 4 GPUs in SLI. They haven't moved past 4 GPUs for consumers.
Posted on Reply
#113
SabreWulf69
Awww :-( lol. Also, would removing the power-limiting chips and using them with PCI-E 3.0 be of any benefit?
Posted on Reply
#114
CDdude55
Crazy 4 TPU!!!
SabreWulf69: Awww :-( lol. Also, would removing the power-limiting chips and using them with PCI-E 3.0 be of any benefit?
The power limiter only affects certain programs (OCCT and Furmark, it looks like); removing it won't give you some type of hidden performance in real-world situations.

PCI-E bandwidth doesn't have anything to do with the limiter, and keep in mind, PCI-E 3.0 isn't out yet and we've still barely saturated 1.0/1.1 anyway.
Posted on Reply
#115
Benetanegia
the54thvoid: Well done Ben. But you entirely miss the point here.

The argument wasn't about who generally scales better. What SabreWulf explicitly said was:

SLI has always scaled better than ATI/AMD's Crossfire, too

So I gave the link (pics) as requested:

I want links of reviews to prove me otherwise

My link proves otherwise. Yes, of course NVIDIA cards scale better on the whole, but the post was short-sighted enough to say NV ALWAYS scales better, which is no longer true as the 6 series is making headway.

You don't always have to defend NV from me. I'm 90% of the way to buying a new card and it's more than likely going to be NV (unless the HD 6970 is surprisingly good). But defending a post that is in fact wrong by using irrelevant info doesn't help.

Your post didn't disprove me at all.
I was not proving nor disproving anything, much less defending any POV in this thread; I just posted some of W1zzard's charts in order to show a wider range of cards. I said nothing, nor was it my intention to say anything, just to present a wider amount of data than what you posted, because yours was limited to mid-range compared to high-end, when it's obvious to everyone that mid-range will always scale better...

But just look at how true the "SLI scales better" argument generally is: without any comment, just by posting some charts, some empirical data collected by W1zz, so many people in this thread suddenly thought I was implying SLI is better. Sorry guys, that's only your own subconscious betraying you and indirectly making you agree with something you would never admit...
entropy13: LOL, you use the data from the max resolution only against the data that represents all resolutions :roll::roll::roll:

Over at Guru3D, DiRT 2, Far Cry 2 and Crysis Warhead at lower resolutions show 2-way SLI of the 580 slightly ahead of 3-way, but if I were to follow your reasoning, Benetanegia, that's inconsequential, because apparently only max resolution matters and not the data for ALL resolutions... :roll::roll::roll:
First of all, read above. :roll:

Second, of course only max resolution matters in this particular SLI/Crossfire debate (GF110 vs Cayman). I refuse to judge $800-1000 graphics setups based on low resolutions; that's stupid. I can go even further: anyone who does $500-1000 SLI/Crossfire in order to play at anything below 1920x1200 8xAA or 2560x1600 4xAA is just stupid, let alone at the lowest 3 out of 5 of the resolutions that W1zz uses in his reviews. If you want to game at a max res of 1680x1050, buy a single GTX 460 or HD 6850 and that's it.
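To put numbers on what "scaling" means here, a quick sketch (the FPS figures are made up purely for illustration, not from any review):

```python
# Multi-GPU scaling as it's usually quoted: the gain from adding a
# second GPU relative to a single card, computed per resolution.

def scaling_pct(single_fps: float, dual_fps: float) -> float:
    """Percent performance gained by the second GPU."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical numbers, purely for illustration -- not real benchmarks.
results = {
    "1680x1050": (95.0, 150.0),  # CPU-limited: second GPU adds less
    "2560x1600": (42.0, 80.0),   # GPU-limited: closer to ideal scaling
}

for res, (single, dual) in results.items():
    print(f"{res}: {scaling_pct(single, dual):.0f}% scaling")
```

With numbers like these the same pair of cards shows ~58% scaling at the low resolution and ~90% at the high one, which is exactly why high-end multi-GPU setups get judged at the resolutions that actually stress the GPUs.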
Posted on Reply
#117
SabreWulf69
CDdude55: The power limiter only affects certain programs (OCCT and Furmark, it looks like); removing it won't give you some type of hidden performance in real-world situations.

PCI-E bandwidth doesn't have anything to do with the limiter, and keep in mind, PCI-E 3.0 isn't out yet and we've still barely saturated 1.0/1.1 anyway.
Shoulda looked that one up myself: "Q15: Does PCIe 3.0 enable greater power delivery to cards? A15: The PCIe Card Electromechanical (CEM) 3.0 specification consolidates all previous form factor power delivery specifications, including the 150W and the 300W specifications." I'm wondering, if this GTX 595 won't be using the new specification, then what cards will be?
Posted on Reply
#118
Unregistered
Benetanegia: ... of course only max resolution matters in this particular SLI/Crossfire debate (GF110 vs Cayman). I refuse to judge $800-1000 graphics setups based on low resolutions; that's stupid. I can go even further: anyone who does $500-1000 SLI/Crossfire in order to play at anything below 1920x1200 8xAA or 2560x1600 4xAA is just stupid, let alone at the lowest 3 out of 5 of the resolutions that W1zz uses in his reviews. If you want to game at a max res of 1680x1050, buy a single GTX 460 or HD 6850 and that's it.
I second that.
Posted on Reply
#119
the54thvoid
Super Intoxicated Moderator
Benetanegia: Sorry guys, that's only your own subconscious betraying you and indirectly making you agree with something you would never admit...
Well, not sure who that was for. When it comes to technology and comments I have no betrayals, and I have no problem admitting this or that. My subconscious knows its place, and it's at home with a pint and a copy of New Scientist.

As for the position of a dual GF110 for total performance leadership, it becomes very relevant for high-end scaling, as it will be two near-top GPUs from both camps slugging it out. Fudzilla suggests it will be out soon; others say it will wait for the 6990.
I think these cards will be crippled by their own siblings. Does putting two GPUs onto one card (and surely downclocking or underpowering them) not give less performance than two SLI'd or CrossFired cards?
I.e., let's say the green dualie is two GF104s (I can't believe it will be two fully operational 580s). Two 580s in SLI would therefore be better, so cost will have to be carefully considered.
Likewise for AMD: two fully operational 6970s seems too much (though we don't know how they are yet!).

Like I've said before somewhere in this thread - not for me. I'd rather have the one powerful GPU from now on - and I'm leaning green right now. Maybe Dec 13th will change my mind?
Posted on Reply
#120
Selene
It's just like the GTX 295: it was 2x GTX 275s with the clocks turned down, but it still outdid the GTX 285 by a good amount and still holds its own today in DX9/10 apps.
I have 2 GTX 260s that are for the most part on par with the 295, which is why I have yet to upgrade.
The GTX 595 should be a great card; I will more than likely pick one up due to it having NV Surround on one card. I am running 2D Surround on my 260s now and for the most part it's great, though I do run out of VRAM in some games.
Posted on Reply
#121
Unregistered
I'm dead curious whether this card will be more expensive than the 8800 Ultra was back in the day... ;)
Posted on Reply
#122
wolf
Better Than Native
TAViX: You guys keep talking about single GPU, dual-GPU... this is nothing. The ONLY important thing is that AMD now has the most powerful card on the market. It can have 1000 GPUs; that's irrelevant. The RELEVANT thing is that there is no single card that can match AMD's card.
You are too young and innocent to remember that back in the day, the best 3D card was the 3dfx Voodoo with 2 GPUs, then the Voodoo 2 with 3 GPUs, then the prototype Voodoo 5 with 4 GPUs, etc. Everybody was talking about the card, not about how many GPUs it had.
Same thing with this card. If they sell this, NVIDIA will probably become the producer of the fastest graphics card on the market. That is all that matters. ;)
This is my answer to that:
newtekie1: There are a couple of problems with that position.

The main one, and the one that caused me to stop using dual-card solutions, is that when a new game comes out, Crossfire and SLI both have to be optimized for it before they really work. So while BC2 might be one of the better examples of a game that gives very good performance scaling with SLI and Crossfire, other games do not yield that great performance scaling, especially when they are first released.

In fact, sometimes when a game is released, SLI or Crossfire doesn't work at all, so you are stuck with the performance of a single card, and there have even been cases where a driver that enabled Crossfire or SLI took 2+ weeks to be released.

With a single card, you know right away that if you buy a game on launch day, you are going to get the kick-ass performance you paid for and not mid-range performance when you put out high-end money.
The GTX 580 is what, a few percent behind the 5970? I'd take that gap any day for the above reasons.
Posted on Reply
#123
Unregistered
I think that was true a few years ago; nowadays 90% of the good games released can take advantage of Crossfire/SLI (the other 10% are crappy console ports anyway that don't even use quad-core processors). I agree that every month there are more and more optimizations for dual/tri-card setups, but not more than 5-15%.
Posted on Reply
#124
TheMailMan78
Big Member
TAViX: I think that was true a few years ago; nowadays 90% of the good games released can take advantage of Crossfire/SLI (the other 10% are crappy console ports anyway that don't even use quad-core processors). I agree that every month there are more and more optimizations for dual/tri-card setups, but not more than 5-15%.
Ever play BC2 or Metro? Try running one of those without a quad.
Posted on Reply