# NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership



## btarunr (Nov 19, 2010)

NVIDIA stunned the computing world with the speedy launch of the GeForce GTX 580. The GPU extended NVIDIA's single-GPU performance leadership and also ironed out some serious issues with the power draw and thermal characteristics of the previous-generation GeForce GTX 480. A dual-GPU implementation of the GF110 graphics processor, on which the GTX 580 is based, now looks inevitable. NVIDIA seems to be ready with a prototype of such a dual-GPU accelerator, which the Chinese media is referring to as the "GTX 595". 

The reference-design PCB of the dual-GF110 accelerator (which still needs some components fitted) reveals quite a lot about the card taking shape. First, it's a single-PCB card: both GPU systems sit on the same board. Second, there are slots for three DVI output connectors, indicating that the card will be 3D Vision Surround-ready on its own. You just have to get one of these, plug in three displays over standard DVI, and you have a large display head spanning three physical displays. 






Third, it could feature a total of 3 GB of video memory (1.5 GB per GPU system). Each GPU system has six memory chips on the obverse side of the PCB. At this point we can't comment on the memory bus width of each GPU, and the core configuration of the GPUs is also unknown. Fourth, power is drawn in from two 8-pin PCI-E power connectors. The card is 2-way SLI capable with another of its kind.
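The memory math above can be sketched quickly. This assumes, as speculation only, that each GPU's six visible chips are mirrored by six more on the reverse of the PCB (12 per GPU), and that these are 1 Gbit GDDR5 devices with the standard 32-bit per-chip interface; none of this is confirmed by the photos.

```python
# Hypothetical memory configuration implied by the PCB shots (assumptions,
# not confirmed specs): 12 GDDR5 chips per GPU, 1 Gbit per chip, 32-bit
# interface per chip.
CHIPS_PER_GPU = 12          # assumed: 6 on each side of the PCB
MBIT_PER_CHIP = 1024        # assumed 1 Gbit GDDR5 density
BITS_PER_CHIP = 32          # standard GDDR5 device interface width

def per_gpu_memory_mb(chips=CHIPS_PER_GPU, mbit=MBIT_PER_CHIP):
    return chips * mbit // 8   # megabits -> megabytes

def per_gpu_bus_width(chips=CHIPS_PER_GPU, bits=BITS_PER_CHIP):
    return chips * bits

print(per_gpu_memory_mb())     # 1536 MB per GPU -> 3 GB total
print(per_gpu_bus_width())     # 384-bit, matching the GTX 580
```

Under these assumptions the card would match the GTX 580's 1.5 GB / 384-bit configuration per GPU; fewer chips or lower-density devices would change both numbers.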

*View at TechPowerUp Main Site*


----------



## Fourstaff (Nov 19, 2010)

That PCB looks a bit crowded to me. Any chance it's going to be GF104 rather than GF114/GF110?


----------



## newtekie1 (Nov 19, 2010)

Wow, looks beastly.

Waits for someone to ask if it comes with its own nuclear power plant to power it.


----------



## Batou1986 (Nov 19, 2010)

Good news, I've almost finished converting my computer to nuclear power :shadedshu

Happy now newtekie


----------



## mechtech (Nov 19, 2010)

Looks really expensive lol


----------



## yogurt_21 (Nov 19, 2010)

Looks sexy, but it's a low-res image so I can't tell if any editing has been done. 

If they are truly launching a dual 580, it will be epic, expensive, and power-hungry.

Edit: also, there must be 12 more RAM chips on the back if this is truly supposed to be a dual 580.


----------



## (FIH) The Don (Nov 19, 2010)

and now people are gonna whine about power usage

WHY!!!!!!!!!!!!!!!!!!

YOU DO NOT BUY HIGHEND CARDS TO SAVE POWER YOU FREAKIN IDIOT !!! :shadedshu


----------



## yogurt_21 (Nov 19, 2010)

(FIH) The Don said:


> and now people are gonna whine about power usage
> 
> WHY!!!!!!!!!!!!!!!!!!
> 
> YOU DO NOT BUY HIGHEND CARDS TO SAVE POWER YOU FREAKIN IDIOT !!! :shadedshu



Well, if a 1 kW PSU doesn't run it, I'm going to whine. lol


----------



## KainXS (Nov 19, 2010)

So these HAVE to have the power limiter chips, and I wonder what's gonna happen if you turn it off

can't wait for it.

now when they make a new power guzzler nobody will be able to point the finger at the HD4870X2 lol


----------



## JrRacinFan (Nov 19, 2010)

Time for a 1.1Kw power supply and factory watercooling block.


----------



## the54thvoid (Nov 19, 2010)

For the memory - it must have 3 GB. 1 GB per GPU (2 GB total) would suffer the same problems at 2560 that the AMD cards have with high AA and quality settings - too much texture data for 1 GB per GPU to handle. So, surely, following the 1.5 GB of the GTX 580 (and 480), it would need 3 GB; otherwise I wouldn't touch it with a bargepole.

Yeah, the power issue is irrelevant. The 6990 will be just as bad.

I don't think either card will be very good thermally or acoustically. But you never know...


----------



## KainXS (Nov 19, 2010)

the54thvoid said:


> For the memory - it must have 3 GB. 1 GB per GPU (2 GB total) would suffer the same problems at 2560 that the AMD cards have with high AA and quality settings - too much texture data for 1 GB per GPU to handle. So, surely, following the 1.5 GB of the GTX 580 (and 480), it would need 3 GB; otherwise I wouldn't touch it with a bargepole.
> 
> Yeah, the power issue is irrelevant. The 6990 will be just as bad.
> 
> I don't think either card will be very good thermally or acoustically. But you never know...



When the 6970 comes out we will have an idea of how bad the 6990 will be, but right now nobody knows, so you can't say the 6990 will be just as bad.

I think what they can do is set up the power circuitry in a loop, so when one GPU downclocks the other upclocks; that might help.

I'm guessing 370 watts with the limiter, at least 400 without.


----------



## newtekie1 (Nov 19, 2010)

Batou1986 said:


> Good news, I've almost finished converting my computer to nuclear power :shadedshu
> 
> Happy now newtekie



Yes I am.


----------



## v12dock (Nov 19, 2010)

595? vs 5990?


----------



## KainXS (Nov 19, 2010)

595 vs 5990

?


----------



## Lionheart (Nov 19, 2010)

I hope it is two GTX 580s on one board, jizz time, but I'm still gonna wait for AMD's shiny red beasts


----------



## Sasqui (Nov 19, 2010)

That is SICK!  I'm surprised (and I've been saying this all along) that they didn't go with a sandwich design, but it appears they managed to get two of those beastly pieces of silicon onto a single PCB.

The cooling is going to have to be very creative; so is the price.


----------



## ToTTenTranz (Nov 19, 2010)

I kinda doubt a dual GF110 card will ever see the light of day, at least not outside some kind of very limited edition.

We're talking about a ~550W card here.  Even if they figure out how to fit it in regular cases, it'll need a huge cooler (3-slot?).

But a dual GF104 card, with all 384 ALUs enabled in each GPU, would be much more believable.


----------



## hv43082 (Nov 19, 2010)

So let's see which camp will have their top dog out first.


----------



## SabreWulf69 (Nov 19, 2010)

Pure PWNAGE, w00t, I'm all up for one if they are indeed 2x 580's :-D


----------



## Mistral (Nov 19, 2010)

GF110x2? I'm baffled... Is the third 8-pin power connector on the other side of the PCB?


----------



## Tatty_One (Nov 19, 2010)

Mistral said:


> GF110x2? I'm baffled... Is the third 8-pin power connector on the other side of the PCB?



Don't be baffled, it will be downclocked therefore less voltage/draw and 375W should be enuff (hopefully).


----------



## Kreij (Nov 19, 2010)

KainXS said:


> now when they make a new power guzzler nobody will be able to point the finger at the HD4870X2 lol



If this thing uses more power than my 4870x2 I will be really saddened.
Having the largest carbon footprint is the only claim to fame my poor card has left. 

Anyway ... I hope this thing rocks. The GPU war never gets old in my book.


----------



## qamulek (Nov 19, 2010)

*power limit?*

A 580 is currently power limited because it can spike above the total available power of 150+75+75 = 300 W. In TPU's article here, the 580 power-limits itself after spiking up, then settles to a maximum of around 200 W (why 200 and not something more like 250 or 290? Probably due to the limits of the electronics on the board...). One question I didn't see answered: did performance increase proportionately with the extra power used once the power limit was taken away?

In any case, the point is that a 580 really needs 8-pin + 8-pin power to keep from being power limited by the cables (let alone by the electronics on the board, as well as possible load-balancing problems), so how is a dual-GPU card going to fare better if a single GPU is already power limited? I guess I will just have to wait until someone who actually knows what they're talking about gives it a go, or until the card comes out and reviews are posted.

Ah! My guess: the GTX 580 is so powerful that it has to be power limited when used to its fullest, but most applications will hit the weakest link in the GPU before using the 580's full power (example: cut the ROPs in half and suddenly the card is limited by ROP count). The dual-GPU card will still need a power limiter, but it won't matter, as most normal applications will hit the weakest link in the GPU before reaching the power limit.
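For reference, the connector arithmetic in the post above works out like this. These are the nominal PCI-E spec limits per power source; as the post notes, real cards can spike past them:

```python
# Nominal per-source power limits from the PCI-E spec.
SLOT = 75        # watts from the x16 slot
SIX_PIN = 75     # watts per 6-pin auxiliary connector
EIGHT_PIN = 150  # watts per 8-pin auxiliary connector

def board_power_limit(six_pins=0, eight_pins=0):
    """Nominal board power available from the slot plus aux connectors."""
    return SLOT + six_pins * SIX_PIN + eight_pins * EIGHT_PIN

print(board_power_limit(six_pins=1, eight_pins=1))  # GTX 580 (6+8 pin): 300 W
print(board_power_limit(eight_pins=2))              # dual 8-pin card: 375 W
```

So a dual 8-pin layout buys only 75 W of nominal headroom over the single GTX 580's connector budget, which is the crux of the post's concern.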


----------



## yogurt_21 (Nov 19, 2010)

Tatty_One said:


> Don't be baffled, it will be downclocked therefore less voltage/draw and 375W should be enuff (hopefully).



Shoot, at 375 W my PSU could handle two of 'em with a decent i7... wouldn't be able to clock it at all though.

Even downclocked, a dual GF110 would be amazing. Now I'm starting to wonder whether the 6990 will dominate the way the 5970 has; the 5970 didn't have any dual-card competition.


----------



## LAN_deRf_HA (Nov 19, 2010)

Wonder if nvidia asked asus for advice after seeing their dual 480 board. Looks very similar. Even the same off center mounting holes. Actually are we sure this isn't a new mars card? I mean has nvidia said anything about it?


----------



## yogurt_21 (Nov 19, 2010)

LAN_deRf_HA said:


> Wonder if nvidia asked asus for advice after seeing their dual 480 board. Looks very similar. Even the same off center mounting holes. Actually are we sure this isn't a new mars card? I mean has nvidia said anything about it?
> 
> http://www.techpowerup.com/img/10-11-19/147a.jpg
> http://www.techpowerup.com/img/10-07-16/mars_ii_1.jpg



idk, it seems like the top one is far more elegant, which to me says it's more likely NVIDIA than ASUS.


----------



## Bjorn_Of_Iceland (Nov 19, 2010)

Just in time for Crysis 2 

Board layout pretty much looks like the single-PCB GTX 295's. Would probably run OK on a 750 W+ PSU with at least 48 A on the +12 V rail. I'm seeing a 600 USD tag there..


----------



## _JP_ (Nov 19, 2010)

That's one hell of a mess on that board! ASUS did tidy up the components better. Seems like it will be one hell of a performer. Can't wait! 
And that SLI finger, THAT SLI finger...


Bjorn_Of_Iceland said:


> I'm seeing a 600 USD tag there..


That's waaaaaaaaaaaaaaaaaaaaay too low...around 750 USD should be more adequate...


----------



## MikeX (Nov 19, 2010)

600 watt card...
10 years later 2000watt card?


----------



## (FIH) The Don (Nov 19, 2010)

MikeX said:


> 600 watt card...
> 10 years later 2000watt card?



so? go plant a tree if it bothers you


----------



## Tatty_One (Nov 19, 2010)

qamulek said:


> A 580 is currently power limited because it can spike above the total available power of 150+75+75 = 300 W. In TPU's article here, the 580 power-limits itself after spiking up, then settles to a maximum of around 200 W (why 200 and not something more like 250 or 290? Probably due to the limits of the electronics on the board...). One question I didn't see answered: did performance increase proportionately with the extra power used once the power limit was taken away?
> 
> In any case, the point is that a 580 really needs 8-pin + 8-pin power to keep from being power limited by the cables (let alone by the electronics on the board, as well as possible load-balancing problems), so how is a dual-GPU card going to fare better if a single GPU is already power limited? I guess I will just have to wait until someone who actually knows what they're talking about gives it a go, or until the card comes out and reviews are posted.
> 
> Ah! My guess: the GTX 580 is so powerful that it has to be power limited when used to its fullest, but most applications will hit the weakest link in the GPU before using the 580's full power (example: cut the ROPs in half and suddenly the card is limited by ROP count). The dual-GPU card will still need a power limiter, but it won't matter, as most normal applications will hit the weakest link in the GPU before reaching the power limit.



You also have to remember that since PCI-E 2.0 there is the capability to get 150W from the slot, in 2.1 and 3.0 this is even "smart" and can give you what you need rather than the whole 150W, therefore at least theoretically, 450W could be available here.


----------



## Selene (Nov 19, 2010)

I need!


----------



## TheMailMan78 (Nov 19, 2010)

SabreWulf69 said:


> Pure PWNAGE, w00t, I'm all up for one if they are indeed 2x 580's :-D



It should render all your kiddie porn flawlessly.


----------



## alwayssts (Nov 19, 2010)

Tatty_One said:


> You also have to remember that since PCI-E 2.0 there is the capability to get 150W from the slot, in 2.1 and 3.0 this is even "smart" and can give you what you need rather than the whole 150W, therefore at least theoretically, 450W could be available here.



75W from the slot, and iirc the power management came about in 2.0?

At any rate, the 'spec' is anything but one, as we can see GF100/GF110 clearly can draw more juice under load than 300W at stock, and AMD/nVIDIA implement their TDPs differently.

One has to question this product though, power-consumption Nazi or not. It's outside the PCI-E spec, and (if an nVIDIA-sanctioned card) it can almost certainly be seen as a concession that the same configuration using GF114 will not beat the 6990... which is perhaps a given.

For it even to be feasible, it needs to be faster than the same configuration using GF114 GPUs at a greater clock speed. Since we know GF104 GPUs clock into the 800+ MHz range, and assuming the GTX 560 will be clocked in the 750-800 MHz range, this card would need to be clocked at least around ~600 MHz. I wouldn't think there is a lot of wiggle room between that and what they could get away with under a <375 W spec, even using tricks like lower-clocked, lower-voltage (1.35 V, 3.6/4.0 Gbps) 2 Gb (denser chips, not GB) GDDR5.
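The clock-speed argument above can be checked with rough arithmetic. The core counts below assume fully enabled chips (512 ALUs for GF110, 384 for GF104/GF114) and the clocks are the speculative figures from the post; raw ALU throughput is only a crude proxy for real performance:

```python
# Back-of-envelope shader throughput (ALUs x clock) for two hypothetical
# dual-GPU cards: GF110 pair downclocked to ~600 MHz vs GF114 pair at ~775 MHz.
# All figures are the thread's speculation, not confirmed specs.
def dual_gpu_throughput(alus, mhz, gpus=2):
    return alus * mhz * gpus  # arbitrary units, useful only for ratios

gf110_pair = dual_gpu_throughput(512, 600)   # dual GF110 @ 600 MHz
gf114_pair = dual_gpu_throughput(384, 775)   # dual GF114 @ 775 MHz

print(gf110_pair / gf114_pair)  # ~1.03: roughly even at these clocks
```

Which illustrates the point: at ~600 MHz a dual GF110 only barely out-muscles a higher-clocked dual GF114 on raw ALU throughput, so it would need clocks at least that high to justify itself.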


----------



## Hayder_Master (Nov 19, 2010)

ATI 6990, LOL ATI from this moment show your white flag


----------



## Tatty_One (Nov 19, 2010)

alwayssts said:


> 75W from the slot, and iirc the power management came about in 2.0?
> 
> At any rate, the 'spec' is anything but one, as we can see GF100/GF110 clearly can draw more juice under load than 300W at stock, and AMD/nVIDIA implement their TDPs differently.
> 
> ...



Not quite sure what you are saying in the first part of your post. To make it clear: the PCI-E 2.0 and 2.1 specification (and 3.0) allows for 150W from the slot; the thing I am not sure of is whether all motherboard manufacturers actually conform to the specification... i.e. whether their boards actually do supply more than 75W. Potentially, with a PCI-E 2.0-onwards board, 450W could be drawn with two 8-pin connections; whether in reality that is the case I really don't know. My point is simply: don't assume that a dual-GPU card would be starved of powerrrrzzzz. I doubt very much that NVidia would bring out a dual-GPU card if they didn't think it would be competitive.


----------



## overclocking101 (Nov 20, 2010)

hells to the muthafuckin yeah! finally nvidia! hello $800 gpu!


----------



## bear jesus (Nov 20, 2010)

It would be so awesome if NVIDIA got a dual-chip card into the 5xx range; it was a little disappointing that there was no top-end dual-chip card in the 4xx range to go up against the 5970.

*thinks about the crazy frame rate this card in sli could get* I wonder how well a pair of these would do in 3dmark11.


----------



## ariff_tech (Nov 20, 2010)

*h o l e e  c r a p...*


----------



## N3M3515 (Nov 20, 2010)

it's not going to be dual 580
maybe dual 570, or dual 560(gtx 460 @ 750Mhz) sounds more realistic.


----------



## alexsubri (Nov 20, 2010)

hayder.master said:


> ATI 6990, LOL ATI from this moment show your white flag









The 6990 didn't even come out yet, so it's too premature to speculate


----------



## bear jesus (Nov 20, 2010)

N3M3515 said:


> it's not going to be dual 580
> maybe dual 570, or dual 560(gtx 460 @ 750Mhz) sounds more realistic.



I would expect them to try something like what AMD/ATI did with the 5970: maybe use the 580 core at 570 clocks, if they could get the power usage low enough by doing so.



alexsubri said:


> http://www.seoboy.com/wp-content/uploads/2009/12/implied-facepalm.jpg
> 
> 
> The 6990 didn't even come out yet, so it's too premature to speculate



I agree; even the 6970 is not out yet, so the 6990's speed can't even be guessed from 6970 CrossFire numbers.


----------



## KainXS (Nov 20, 2010)

N3M3515 said:


> it's not going to be dual 580
> maybe dual 570, or dual 560(gtx 460 @ 750Mhz) sounds more realistic.



That's a good catch.

It actually does have the 1038A1 marking like the sample W1zzard used in his review, but the other marks are different and the GF110 identifier at the bottom isn't there, so this could actually not even be a full 512-SP part (it may have disabled shaders) while still being based on the GF110.


----------



## wolf (Nov 20, 2010)

Batou1986 said:


> Good news, I've almost finished converting my computer to nuclear power :shadedshu



har har har you so witty.

this will draw little more than a GTX295, while whoppin' it twice over.

good on them I say, get it out fast like the GTX580.


----------



## alexsubri (Nov 20, 2010)

wolf said:


> har har har you so witty.
> 
> this will draw little more than a GTX295, while whoppin' it twice over.
> 
> good on them I say, get it out fast like the GTX580.



I wouldn't be surprised if this goes neck and neck with the 5990


----------



## Over_Lord (Nov 20, 2010)

Wait, so I've finally got a room heater replacement???? I thought Fermi was a bit too cool; 350 W, LOL, I was laughing at that. I guess 600 W or 700 W would do well in the winter


----------



## Lionheart (Nov 20, 2010)

(FIH) The Don said:


> so? go plant a tree if it bothers you



It bothers me. "plants 5,000 trees" Aahh, I feel better. Now I can play Crysis 3 with my 2000W GTX 795


----------



## CDdude55 (Nov 20, 2010)

Awesome.


----------



## Animalpak (Nov 20, 2010)

5970 is KIA


----------



## 1nf3rn0x (Nov 20, 2010)

Wow, that seems great. This will be war of the worlds. 5990 vs. GTX 595


----------



## motasim (Nov 20, 2010)

I would really love to see a dual-GF110 card, but given the power requirements and thermal footprint, unfortunately it won't be possible. What is more realistic, and most probably the case here, is a card with dual fully-enabled GF104 chips (aka GF114). As usual, time will tell


----------



## Jiraiya (Nov 20, 2010)

http://www.arabhardware.net/forum/showpost.php?p=1643093&postcount=1217

http://www.arabhardware.net/forum/showpost.php?p=1643079&postcount=1216


----------



## LAN_deRf_HA (Nov 20, 2010)

Whats the amorphous gray blob on the back there?


----------



## Jiraiya (Nov 20, 2010)

LAN_deRf_HA said:


> Whats the amorphous gray blob on the back there?



This is the original image.

The first image is after cleaning.


----------



## the54thvoid (Nov 20, 2010)

LAN_deRf_HA said:


> Whats the amorphous gray blob on the back there?



Oh, that's proof of what it is. Fuzzy pictures ftl.

I love all the fanaticism coming out of the woodwork again. People saying the 6990 is dead and Nvidia have won. Jeez.

Won't people ever learn?

Given the GTX 580 is a vapor-chamber-cooled part that can still reach 90+ degrees and, without power throttling, guzzle 300+ watts (links below), do you really think they can get two full GF110 cores on one card? Just like the GTX 295 wasn't 2x 285s (it was 2x 275s), and just like the 5970 is 2x 5870s downclocked to 5850 speeds, this would very likely not be two fully operational 580 cores.
Likewise, the 6970 rumours suggest a high power draw, as it is a huge die (which is why it should come close to the 580). Given the leaks that point to a Cayman XT Antilles card, the 6990 will probably be a monster also.
Both cards will surely be made in very limited numbers, as they'll both be ultra-high-end parts.

Despite how good they might be, I wouldn't buy one simply because it's dual-GPU; having had a GTX 295 and 2x 5850s, I want a single card now. A GTX 580 Super Overclock from Gigabyte would suffice 

GTX 580 power draw:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/25.html
GTX 580 temp:
http://www.hexus.net/content/item.php?item=27307&page=15


----------



## Ferrum Master (Nov 20, 2010)

LAN_deRf_HA said:


> Wonder if nvidia asked asus for advice after seeing their dual 480 board. Looks very similar. Even the same off center mounting holes. Actually are we sure this isn't a new mars card? I mean has nvidia said anything about it?
> 
> http://www.techpowerup.com/img/10-11-19/147a.jpg
> http://www.techpowerup.com/img/10-07-16/mars_ii_1.jpg



Actually they look completely different. Power supply solution, mounting holes, power connector (power draw) etc... it is a completely different device.


----------



## TAViX (Nov 20, 2010)

I don't want to sound like an a$$, but if a GTX 580 consumes 10% more than a 5970, imagine how much this beast will eat for breakfast!!!


----------



## SabreWulf69 (Nov 20, 2010)

I hope it consumes 5000W/s. Power to the people. lol


----------



## LAN_deRf_HA (Nov 20, 2010)

Ferrum Master said:


> Actually they look completely different. Power supply solution, mounting holes, power connector (power draw) etc... it is a completely different device.



It doesn't look different enough given that the asus pic was a one off proof of concept prototype with different cores. I wouldn't rule out it being a new mars card as it may simply be a refinement and evolution of that concept.


----------



## TAViX (Nov 20, 2010)

The power problem would be solved, theoretically, if they reduced the manufacturing process down to 22 nm or 18 nm, or even lower. With current technology it is not physically possible to increase the performance of a GPU without increasing its power consumption. The smaller the transistor, the less power it eats, but the hotter it gets... I know, it's a vicious circle...


----------



## [I.R.A]_FBi (Nov 20, 2010)

SabreWulf69 said:


> I hope it consumes 5000W/s. Power to the people. lol



actually ... power from teh ppl


----------



## micropage7 (Nov 20, 2010)

It reminds me of processor development: it started from a single core, then they pushed the clock, and when the max clock was reached they switched to multi-core. It looks like it's going to be like that here too: they started with single-GPU cards, then dual, then maybe multi-GPU on one package.
Let's see..


----------



## TAViX (Nov 20, 2010)

1nf3rn0x said:


> Wow, that seems great. This will be war of the worlds. _5_990 vs. GTX 595



You mean *6*990 vs GTX595...


----------



## Ferrum Master (Nov 20, 2010)

LAN_deRf_HA said:


> It doesn't look different enough given that the asus pic was a one off proof of concept prototype with different cores. I wouldn't rule out it being a new mars card as it may simply be a refinement and evolution of that concept.



Then we could match the 9800GX2 with the GTX 295... the power circuit uses different ICs with a lower phase count. Also, NVIDIA's board has only DVI port connections... It isn't an evolution... it is just a cheaper design... using different components. It could be that the MARS card has more layers too... 
The layout rules are the same for every card concerning wire length for the memory bus, power lines, etc., so the components are placed in more or less the same places.

The power consumption cannot be more than we can attach to it - 2x 150W + 75W from PCIe.


----------



## Over_Lord (Nov 20, 2010)

CHAOS_KILLA said:


> It bothers me. "plants 5,000 trees" Aahh, I feel better. Now I can play Crysis 3 with my 2000W GTX 795



Thank you. I thought everyone here (barring some) was an insolent and insensitive fool


----------



## SabreWulf69 (Nov 20, 2010)

Meh, meh, and meh. This ain't about carbon-neutral foolery, it's about an awesome graphics card. We pay our power bill, we WILL use it the way we see fit, and what a hell of a way to use a load of it. What would be awesome is a dual-GPU card with both GPUs on the same die, like CPUs (pardon me if they exist already).


----------



## H82LUZ73 (Nov 20, 2010)

So in theory this would be about 5 fps faster than a GTX 580 SLI setup, right?


----------



## a_ump (Nov 20, 2010)

I'll believe this when it's released, though I'm definitely getting the feeling it's going to have a weaker core config than the GTX 580, perhaps two 570 chips, which would be cool because it'd show us the 570's hardware specs. I feel the 570 is gonna be a hell of a great deal, kinda like the 5850 was. 

As for Nvidia taking back the single- and dual-GPU crowns, I highly doubt it. I won't be surprised if AMD purposely helped along the leaked rumors saying their 6970 has 1500-some shaders, sort of giving them the benefit of the doubt, or at least showing they learned from Nvidia's mistake of dilly-dallying. 

Watch it release with 1920 shaders or something at a 900 MHz core clock. That'd be tight, and of course ROPs and whatnot would get an increase. It's been said that the 6970/50 chips are a little off course from AMD's usual small-but-extremely-efficient-per-mm2 approach, so even with the bumps in specs they spoke of, this chip would be smaller than the 580.


----------



## mdsx1950 (Nov 20, 2010)

Finally a nVidia card worth being in my rig!


----------



## Eva01Master (Nov 20, 2010)

I hope that if this monster is ever produced they call it the GTX 580 X2 / GTX 570 X2 / GTX 560 X2, so it's very clear to the average gamer with deep pockets which kind of GPU core is powering it.


----------



## TAViX (Nov 20, 2010)

H82LUZ73 said:


> So in theory this would be about 5 fps faster than a GTX 580 SLI setup, right?


NOT A CHANCE!!!!! :shadedshu
Maybe 10% slower, my guess....
And if you ask me, there will be two 580 GPUs at 460 (560??) frequencies.




Eva01Master said:


> I hope that if this monster is ever produced they call it the GTX 580 X2 / GTX 570 X2 / GTX 560 X2, so it's very clear to the average gamer with deep pockets which kind of GPU core is powering it.



*X2* is (was...) an ATI trademark, bro!!! :shadedshu


----------



## cadaveca (Nov 20, 2010)

LAN_deRf_HA said:


> http://www.techpowerup.com/img/10-11-19/147a.jpg
> http://www.techpowerup.com/img/10-07-16/mars_ii_1.jpg



To me these are the same card, but with different power section layouts.


----------



## xtremesv (Nov 20, 2010)

The sleeping lion woke up

AMD run!!!


----------



## lism (Nov 20, 2010)

micropage7 said:


> It reminds me of processor development: it started from a single core, then they pushed the clock, and when the max clock was reached they switched to multi-core. It looks like it's going to be like that here too: they started with single-GPU cards, then dual, then maybe multi-GPU on one package.
> Let's see..



But that's not true with GPUs.

AMD/ATI has chosen to develop really tight GPUs with lower power consumption and lower development costs. Nvidia makes GPUs which are larger and tougher to make, and has had problems with TSMC producing good wafers of chips lol.

I'd prefer a chip with less power consumption that is as efficient as possible, instead of a chip that generates more than 400 watts of heat in games.

Anyway, this is more of an answer to AMD's upcoming 6990... Nvidia knows it's going to get its ass kicked.


----------



## TheMailMan78 (Nov 20, 2010)

I love reading all the comments in a thread like this....

"Oh this is going to destroy the 5970!" and "AMD better run" 

I sure as hell hope a dual-GPU offering that's a YEAR newer can beat the 5970. Also, if the 580 is any indicator of where Nvidia is going, then you better get ready to have your bubble burst.

Personally, I am waiting this gen out. Unless the 6970 is faster than two 5870s, I'll call it a fail. The 580 was enough of a letdown.


----------



## SabreWulf69 (Nov 20, 2010)

Let down or not, the fact is at this very moment the GTX 580 is still the fastest single-GPU card out there, like it or lump it, prove me otherwise. As I'm pretty sure I have also mentioned, SLI has always scaled better too than ATI/AMD's Crossfire. This is all I care about. I want links of reviews to prove me otherwise, please no more speculation, the title is "NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership".


----------



## CDdude55 (Nov 20, 2010)

TheMailMan78 said:


> I love reading all the comments in a thread like this....
> 
> "Oh this is going to destroy the 5970!" and "AMD better run"
> 
> ...



It's usually the other way around in Nvidia threads; I'm actually surprised we got over three pages without the consistent trolling that usually manifests itself in these threads.

I think the 6970 will beat the 580 by a fair amount because, as I said, the 580 is still building off the same structure, while AMD is working from the ground up. That doesn't mean it will definitely be better, but it gives them a very high chance of toppling Nvidia on that front.


----------



## KainXS (Nov 20, 2010)

SabreWulf69 said:


> Let down or not, the fact is at this very moment the GTX 580 is still the fastest single-GPU card out there, like it or lump it, prove me otherwise. As I'm pretty sure I have also mentioned, SLI has always scaled better too than ATI/AMD's Crossfire. This is all I care about. I want links of reviews to prove me otherwise, please STFU n00bs.



Yes, Nvidia's highest-end card, which paper-launched two weeks ago, is currently the fastest single-GPU card; it would be a fuckin' shame if it wasn't. And SLI has always scaled better than CrossFire, . . . . o . . . k . . . 

Hold back on calling people noobs.

I'm also thinking the 6970 will have performance similar to the 580's, because we know it's in that area based on everything we have seen, and by releasing their card first Nvidia gave AMD the chance to see whether they needed to boost clocks to compete.

Going back in ATI vs. Nvidia history, it's usually been Nvidia dominating ATI since the X1900 XTX, but now Nvidia has some real competition, and competition = lower prices for us.


----------



## N3M3515 (Nov 20, 2010)

SabreWulf69 said:


> Let down or not, the fact is at this very moment the GTX 580 is still the fastest single-GPU card out there, like it or lump it, prove me otherwise. As I'm pretty sure I have also mentioned, SLI has always scaled better too than ATI/AMD's Crossfire. This is all I care about. I want links of reviews to prove me otherwise, please STFU n00bs, no more speculation, the title is "NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership".



According to this, CrossFire is on the same level as or better than SLI in terms of scaling.


----------



## CDdude55 (Nov 20, 2010)

It depends on what cards are being used and what software/games are being run (and drivers). Generally it's been said that SLI scales better.


----------



## the54thvoid (Nov 20, 2010)

SabreWulf69 said:


> ...As I'm pretty sure I have also mentioned, *SLI has always scaled better too than ATI/AMD's Crossfire*. This is all I care about. I want links of reviews to prove me otherwise, please STFU n00bs...



This is too easy...

Can I quote you? *"I want links of reviews to prove me otherwise"*
One GTX 580 is 77% of the performance of an SLI setup.

One HD 6870 is 73% of the performance of a CrossFire setup.

One HD 6850 is 69% of the performance of a CrossFire setup.

So the 6 series scales better than the GTX 580.

As I'm an adult, I won't resort to saying "STFU n00b"; instead I'll say: read some reviews on what you're talking about before you troll.
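For anyone who wants to redo the arithmetic, the quoted single-card percentages convert to multi-GPU speedups like this (a sketch using only the numbers in the post above):

```python
# If one card delivers X% of the dual-card setup's performance, the
# second card adds (100/X - 1) on top.
def scaling_gain(single_pct_of_dual):
    """Return the fractional speedup from adding the second card."""
    return 100.0 / single_pct_of_dual - 1.0

print(round(scaling_gain(77) * 100))  # GTX 580 SLI: ~30% gain
print(round(scaling_gain(73) * 100))  # HD 6870 CF:  ~37% gain
print(round(scaling_gain(69) * 100))  # HD 6850 CF:  ~45% gain
```

Note this is game-selection-dependent, as the next posts point out, so the gains only describe the particular reviews being quoted.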


----------



## Fourstaff (Nov 20, 2010)

High-end scaling generally encounters problems with bottlenecks elsewhere. 

http://www.techpowerup.com/reviews/ATI/Radeon_HD_6870_CrossFire/23.html
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580_SLI/24.html

If you look at the 2560x1600 segment (where graphics is the main suspect for the bottleneck, VRAM or not), the picture becomes less rosy.

Edit: look at mid-range graphics card SLI:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_460_SLI/27.html

Can't see ATi winning now, can we?


----------



## HossHuge (Nov 21, 2010)

N3M3515 said:


> According to this, Crossfire is on the same level as or better than SLI in terms of scaling.



I don't think anyone else read this.


----------



## Benetanegia (Nov 21, 2010)

the54thvoid said:


> This is too easy...
> 
> Can I quote you? *"I want links of reviews to prove me otherwise"*
> One GTX 580 is 77% performance of an sli set up.
> ...


----------



## LAN_deRf_HA (Nov 21, 2010)

I think this is a good argument for why modular performance charts are so great. It'd be very useful if I could just tick off all the games I play on a list, at the resolution I play at, and then see relative performance for that selection: what's most relevant to that person. The whole SLI/Crossfire scaling thing differs from website to website because naturally they're not all using the same game selection. Hell, I remember seeing a game where Crossfire scaled 101%. It's really not a definitive, one-size-fits-all type of thing.


----------



## Selene (Nov 21, 2010)

The GTX 580 just came out, so I would say a lot more scaling will come from drivers.


----------



## mdsx1950 (Nov 21, 2010)

SabreWulf69 said:


> Let down or not, the fact is at this very moment the GTX 580 is still the fastest single-GPU card out there, like it or lump it, prove me otherwise. As I'm pretty sure I have also mentioned, SLI has always scaled better too than ATI/AMD's Crossfire. This is all I care about. I want links of reviews to prove me otherwise, please STFU n00bs, no more speculation, the title is "NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership".



The GTX 580 is the fastest single-GPU card till Dec 22nd, and it can't even beat a card (HD 5970) that's almost a year older. And no, SLI hasn't always scaled better (look at the54thvoid's post).

This card is going to be powerful, I agree. But I'm not sure it will beat the HD 6990.


----------



## CDdude55 (Nov 21, 2010)

mdsx1950 said:


> The GTX 580 is the fastest single-GPU card till Dec 22nd and it cant even beat a card (HD5970) that's almost an year older. And no SLI hasn't always scaled better. (look at the54thvoid's post)
> 
> This card is going to be powerful, i agree. But I'm not sure it will beat the HD6990.



The GTX 580 mingles with the 5970, and really that's all you can ask for when using the same architecture.

And lower-end cards tend to scale better; I'm not surprised something like the 6800 series would scale better. Generally it's been SLI that has been known for better scaling, but of course, as I said before, there are a lot of factors that determine that: GPU architecture, drivers, the software/games being run, resolution, etc.


----------



## entropy13 (Nov 21, 2010)

Benetanegia said:


> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_580_SLI/images/perfrel_2560.gif
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_SLI/images/perfrel_2560.gif
> http://tpucdn.com/reviews/ATI/Radeon_HD_6850_CrossFire/images/perfrel_2560.gif
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_460_SLI/images/perfrel_2560.gif



LOL, you use the data from the max resolution only against data that represents all resolutions. 

Over at Guru3D, DiRT 2, Far Cry 2 and Crysis Warhead at lower resolutions show 2-way SLI of the 580 slightly ahead of 3-way, but if I were to follow your reasoning, Benetanegia, that's inconsequential, because apparently only max resolution matters and not the data for ALL resolutions...


----------



## SabreWulf69 (Nov 21, 2010)

*Sigh* Go figure; it takes a dual-GPU card to compete with NVIDIA's single-GPU ones at this very moment, so the dual NVIDIA is gonna be a win.






"Release timing for the 6970 isn't certain and may be in flux. One leak has claimed that chip yield problems at AMD's manufacturing partner, TSMC, have prevented the new Radeon from shipping for an intended end of November release. With less than 10 percent of 6970 chips at an acceptable quality, AMD can't make enough stock, TechEye said. The slip may push the faster 6000-series card into early 2011 and leave just the Radeon HD 6870 as the most recent graphics component."

Sorry for the hypocritical speculation but we are still waiting.


----------



## entropy13 (Nov 21, 2010)

SabreWulf69 said:


> *Sigh* Go figure, takes a dual-GPU card to compete with NVIDIA's single-GPU ones at this very moment, the dual NVIDIA is gonna be win.
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_580/images/perfrel.gif
> 
> ...



You entirely missed the point. It wasn't about performance but scaling. Let me lol. 

There. Done. Let me "*sigh*" as well.


----------



## SabreWulf69 (Nov 21, 2010)

LOL at me indeed, shouldn't have quoted, should have replied... There fixed, my bad, sorry.


----------



## fochkoph (Nov 21, 2010)

Either way, nVidia recovering from the GTX 400 series reminds me of McLaren's comeback during the 2009 F1 season. Pretty impressive to say the least.


----------



## Eva01Master (Nov 21, 2010)

TAViX said:


> *X2* is (was...) ATI trademark bro!!!:shadedshu



My bad, you're right, let me correct that: they should be called the GTX 580 GX2, as nVidia did with the 7950 GX2 and the 9800 GX2.


----------



## MxPhenom 216 (Nov 21, 2010)

CDdude55 said:


> It's usually the other way around in Nvidia threads, i'm actually surprised we got to over three pages without consistent trolling that usually manifests itself in these threads.
> 
> I think the 6970 will beat the 580 by a fair amount, because as i said, the 580 is still building off the same structure, while AMD is working from the ground up. And that doesn't mean it will definitely be better, but it gives them a very high chance of toppling Nvidia on that front.



And there's also a high chance that the 6970 will fail. Remember the 2900, when ATI tried a bit too hard?


----------



## wolf (Nov 21, 2010)

the54thvoid said:


> This is too easy...
> So the 6 series scale better than the GTX 580.
> 
> As i'm an adult i wont resort to saying "STFU n00b", instead i'll say, read some reviews about what you're talking about before you troll.





Benetanegia said:


> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_580_SLI/images/perfrel_2560.gif
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_SLI/images/perfrel_2560.gif
> http://tpucdn.com/reviews/ATI/Radeon_HD_6850_CrossFire/images/perfrel_2560.gif
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_460_SLI/images/perfrel_2560.gif



lol

Given the right circumstances both camps have proven they can scale between 80-100%, so this is a moot argument.

One site's review is not an all-indicative factor of scaling performance, especially given the massive variety of games tested, some being AMD-friendly, some NV, and some scaling badly for both.

Generally, across the population of popular (mostly MP) games, Nvidia tends to have better scaling, given that you need an insanely powerful base system to feed GPUs like 2x GF110 or 4x Cypress.


----------



## CDdude55 (Nov 21, 2010)

nvidiaintelftw said:


> And there's also a high chance that the 6970 will fail. Remember the 2900, when ATI tried a bit too hard?



How does the 2900 relate to today's manufacturing? Inevitably each company will hit or miss with a series eventually; sometimes you don't get what you hoped for. Look at Fermi: it was a failure efficiency-wise, but you rework it for the greater good.

The 6900s have a higher chance of being better than the 580 due to the new design; they have an early look at Nvidia's cards already, and if they focus and design the card to really be monstrous, then I think the 6900s will inevitably come out on top.

I'm probably one of the few people who don't actually think the 580s are supposed to compete against the 6900s. I think if they were, we would be seeing a much different card; the 580 was a refresh righting the wrongs of the 480 and adding a good 10%-12% or so performance increase, that is all.


----------



## bear jesus (Nov 21, 2010)

CDdude55 said:


> The 6900's have a higher chances of being better then the 580 due to the new design, they have an early look at Nvidias cards already and if they focus and design the card to really be monstrous, then i think inevitably the 6900's will come out on top.
> 
> I'm probably one of the few people that don't actually think the 580's are supposed to compete against the 6900's. I think if it was we would be seeing a much different card, the 580 was a refresh righting the wrongs of the 480 and adding a good 10%-12% or so increase performance wise, that is all.



I agree, but I think the performance increase from the higher clocks and all 512 CUDA cores enabled is really where the 480 would have been if it could have been made right from the start. This whole thing was not really Nvidia's fault; it was down to the 40nm process issues.

But it makes me wonder: if the 480 had been what it should have been (the 580), what would Nvidia be doing now? If its single-chip card had beaten ATI's dual-chip card when it should have been released (around the 58xx launch), would Nvidia be releasing the 5xx cards this year? Would they have even released anything other than a dual-chip card before the transition to 28nm?

I really hope the 6970 beats the 580, since if everything had worked out, AMD would be trying to catch up to the year-old 480 with the power of the 580. However it turns out, I will probably have to buy a 6970 to get back to using a single card/chip, and then just keep hoping that Nvidia will support triple-monitor setups on a single card/chip with the 680 so it can be an option for me.


----------



## the54thvoid (Nov 21, 2010)

Benetanegia said:


> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_580_SLI/images/perfrel_2560.gif
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_SLI/images/perfrel_2560.gif
> http://tpucdn.com/reviews/ATI/Radeon_HD_6850_CrossFire/images/perfrel_2560.gif
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_460_SLI/images/perfrel_2560.gif



Well done, Ben. But you entirely miss the point here. 

The argument wasn't about who generally scales better. What SabreWulf explicitly said was:

SLI *has always* scaled better too than ATI/AMD's Crossfire

So I gave the link (pics) as requested:

I want links of reviews to *prove me otherwise*

My link proves otherwise. Yes, of course Nvidia cards scale better on the whole, but the post was short-sighted enough to say NV ALWAYS scales better, which is no longer true, as the 6 series is making headway.

You don't always have to defend NV from me. I'm 90% of the way to buying a new card and it's more than likely going to be NV (unless the HD 6970 is surprisingly good). But defending a post that is in fact wrong by using irrelevant info doesn't help.

Your post didn't disprove me at all.


----------



## SabreWulf69 (Nov 21, 2010)

Yup, taken with a grain of salt.  Not been good with my wording today; my brain is feeling melty lol. Here's to hoping this card at least makes it to commercial availability.


----------



## mdsx1950 (Nov 21, 2010)

We all have to wait and see what the 6950/6970/6990 and GTX595 will be before making huge assumptions.



nvidiaintelftw said:


> And there's also a high chance that the 6970 will fail. Remember the 2900, when ATI tried a bit too hard?



nVidia fanboy much? :shadedshu

EDIT - Your username pretty much explains everything lol.


----------



## Judas (Nov 21, 2010)

Two of those in SLI would need your own nuclear reactor in your shed to run them.


----------



## Yellow&Nerdy? (Nov 21, 2010)

Only two 8-pin power connectors? That's pretty surprising; I was expecting two 8-pin and a 6-pin. But the power circuitry looks pretty robust, indicating huge power consumption and heat output. The TDP is probably going to end up around 400W, and the stock cooler will be either a 3-slot monster or a waterblock. So it's most likely two GF110s with some shader cores locked and downclocked. Personally I would have liked to see a dual GF104/GF114 card, like two fully enabled GF104s.


----------



## claylomax (Nov 21, 2010)

Judas said:


> Two of those in SLI would need your own nuclear reactor in your shed to run them.



*sigh*


----------



## SabreWulf69 (Nov 21, 2010)

Need 4 of them in SLI on a SR2 mobo :-D


----------



## TAViX (Nov 21, 2010)

SabreWulf69 said:


> Let down or not, the fact is at this very moment the GTX 580 is still the fastest single-GPU card out there, like it or lump it, prove me otherwise. As I'm pretty sure I have also mentioned, SLI has always scaled better too than ATI/AMD's Crossfire. This is all I care about. I want links of reviews to prove me otherwise, please no more speculation, the title is "NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership".



You guys keep talking about single-GPU, dual-GPU... This is nothing. The ONLY important thing is that AMD now has the most powerful card on the market. It could have 1000 GPUs; that is irrelevant. The RELEVANT thing is that there is no single card that can match AMD's video card.
You are too young and innocent to remember that back in the day, the best 3D card was the 3dfx Voodoo with 2 GPUs, then the Voodoo 2 with *3* GPUs, then the prototype Voodoo 5 with *4* GPUs, etc. Everybody was talking about the card, not about how many GPUs it had. 
The same goes for this card. If they sell this, nvidia will probably become the producer with the fastest graphics card on the market. That is all that matters.


----------



## SabreWulf69 (Nov 21, 2010)

Those 3dfx cards were the bomb, with their add-ons and daughter boards, and yeah, the prototypes as you mentioned. If only they were still around... I guess stats-wise: most powerful overall card - 5970, most powerful dual-GPU card - 5970, most powerful single-GPU card - 580. And yeah, here's to hoping both companies pull out in front from time to time; healthy competition is always good and keeps prices on both sides lower.


----------



## mdsx1950 (Nov 21, 2010)

TAViX said:


> You guys are keep talking about single GPU, dual-GPU.....This is nothing. The ONLY important thing, is that AMD has now the most powerful card on the market. It can have 1000 GPUs, this is irrelevant. The RELEVANT thing is there is no single card that can match AMD video card.
> You are to young and innocent to remember that back on the time, the best 3D card was 3dfx VooDoo with 2 GPUs, than Voodoo 2 with *3* GPUs, than the prototype Voodoo 5 with *4* GPUs, etc. Everybody was talking about the card, not about how many GPUs had.
> The same thing with this card. If they will sale this, probably nvidia will became the producer with the fastest graphics card in the market. That is all that matters.



Well said mate!


----------



## KainXS (Nov 21, 2010)

damn man we went from



CDdude55 said:


> It's usually the other way around in Nvidia threads, i'm actually surprised we got to over three pages without consistent trolling that usually manifests itself in these threads.



to this:shadedshu


----------



## Fourstaff (Nov 21, 2010)

I shall now propose that we have a special Godwin's law for TPU: Reductio ad Fanboyum and Reductio ad 3dfx


----------



## mlee49 (Nov 21, 2010)

I want one! Hell two!!



SabreWulf69 said:


> Need 4 of them in SLI on a SR2 mobo :-D



Too bad it's only 2 way SLI capable. Only one sli connector on the top of the card.


----------



## a_ump (Nov 21, 2010)

mlee49 said:


> I want one! Hell two!!
> 
> 
> 
> Too bad it's only 2 way SLI capable. Only one sli connector on the top of the card.



Well, they only need one... there are 2 GPUs per card, so 4 GPUs in SLI. They haven't moved past 4 GPUs for consumers.


----------



## SabreWulf69 (Nov 21, 2010)

Awww :-( lol, also would removing the power limiting chips and using them with PCI-E 3.0 be of any benefit?


----------



## CDdude55 (Nov 21, 2010)

SabreWulf69 said:


> Awww :-( lol, also would removing the power limiting chips and using them with PCI-E 3.0 be of any benefit?



The power limiter only affects certain programs (OCCT and Furmark it looks like), removing it won't give you some type of hidden performance in real world situations.

PCI-E bandwidth doesn't have anything to do with the limiter, and keep in mind, PCI-E 3.0 isn't out and we still have barely saturated 1.0/1.1 anyways.


----------



## Benetanegia (Nov 21, 2010)

the54thvoid said:


> Well done Ben.  But you entirely miss the point here.
> 
> The arguement wasn't about who generally scales better.  What Sabrewulf explicitly said was:
> 
> ...



I was not proving nor disproving anything, much less defending any POV from this thread. I just posted some of W1zzard's charts in order to show a wider range of cards. I said nothing, nor was it my intention to say anything, just to present a wider amount of data than what you posted, because yours was limited to mid-range compared to high-end, when it's obvious to everyone that mid-range will always scale better...

But just look how true the "SLI scales better" argument is, generally, that without any comment, by only posting some charts, some empirical data collected by W1zz, so many people in this thread suddenly thought I was implying SLI is better. Sorry guys, that's only your own subconscious betraying you and indirectly making you agree with something you would never admit...



entropy13 said:


> LOL you use the data from the max. resolution only against the data that represents all resolutions
> 
> Over in guru3d DiRT2, Far Cry 2 and Crysis Warhead at lower resolutions shows the 2-way SLI of the 580 slightly ahead of 3-way, but if I were to follow your reasoning Benetanegia that's inconsequential because apparently only max resolution matters and not the data for ALL resolutions...



First of all, read above. 

Second, of course only max resolution matters in this particular SLI/Crossfire debate (GF110 vs Cayman). I refuse to judge $800-1000 graphics setups based on low resolutions; that's stupid. I can go even further: anyone who does $500-1000 SLI/Crossfire in order to play at anything below 1920x1200 8xAA or 2560x1600 4xAA is just stupid, let alone at the lowest 3 of the 5 resolutions that W1zz uses in his reviews. If you want to game at a max res of 1680x1050, buy a single GTX 460 or HD 6850 and that's it.


----------



## kid41212003 (Nov 21, 2010)

Ben... I admire your patience.


----------



## SabreWulf69 (Nov 21, 2010)

CDdude55 said:


> The power limiter only affects certain programs (OCCT and Furmark it looks like), removing it won't give you some type of hidden performance in real world situations.
> 
> PCI-E bandwidth doesn't have anything to do with the limiter, and keep in mind, PCI-E 3.0 isn't out and we still have barely saturated 1.0/1.1 anyways.



Shoulda looked that one up myself: "Q15: Does PCIe 3.0 enable greater power delivery to cards?
A15: The PCIe Card Electromechanical (CEM) 3.0 specification consolidates all previous form factor power delivery specifications, including the 150W and the 300W specifications." I'm wondering, if this GTX 595 won't be using the new specification, then what cards will be.


----------



## TAViX (Nov 21, 2010)

Benetanegia said:


> ... of course only max resolution matters on this particular SLI/Crossfire debate (GF110 vs Cayman). I refuse to judge $800-1000 graphics setups based on low resolutions, that's stupid. I can go even farther, anyone who does $500-1000 SLI/Crossfire in order to play on anything below 1920x1200 8xAA or 2560x1600 4xAA is just stupid, let alone at the lowest 3 out of 5 of the resolutions that W1zz uses in his reviews. If you want to game at a max res of 1680x1050, buy a single GTX460 or HD6850 and that's it.



I second that.


----------



## the54thvoid (Nov 21, 2010)

Benetanegia said:


> Sorry guys that's only your own subconscious betraying yourselves and indirectly making you agree with something you would never admit...



Well, not sure who that was for. When it comes to technology and comments I have no betrayals, and I have no problem admitting this or that. My subconscious knows its place, and it's at home with a pint and a copy of New Scientist.

As for the position of a dual GF110 for total performance leadership, it becomes very relevant for high-end scaling, as it will be two near-top GPUs from both camps slugging it out. Fudzilla suggests it will be out soon; others say it will wait for the 6990.  
I think these cards will be crippled by their own siblings. Does putting two GPUs onto one card (and surely downclocking or underpowering them) not give less performance than two SLI'd or CrossFired cards?
I.e., let's say the green dualie is two GF104s (I can't believe it will be two fully operational 580s). In that case two 580s in SLI would be better, so cost will have to be carefully considered.
Likewise for AMD: two fully operational 6970s seems too much (though we don't know how they perform yet!).

Like I've said before in this thread somewhere: not for me. I'd rather have one powerful GPU from now on, and I'm leaning green right now. Maybe Dec 13th will change my mind?


----------



## Selene (Nov 21, 2010)

It's just like the GTX 295, which was 2x GTX 275s with the clocks turned down, but it still outdid the GTX 285 by a good amount and still holds its own today in DX9/10 apps.
I have 2 GTX 260s that for the most part are on par with the 295, which is why I have yet to upgrade.
The GTX 595 should be a great card; I will more than likely pick one up due to it having NV Surround on one card. I am running 2D Surround on my 260s now, and for the most part it's great, though I do run out of VRAM in some games.


----------



## TAViX (Nov 22, 2010)

I'm dead curious whether this card will be more expensive than the 8800 Ultra was back in the day...


----------



## wolf (Nov 22, 2010)

TAViX said:


> You guys are keep talking about single GPU, dual-GPU.....This is nothing. The ONLY important thing, is that AMD has now the most powerful card on the market. It can have 1000 GPUs, this is irrelevant. The RELEVANT thing is there is no single card that can match AMD video card.
> You are to young and innocent to remember that back on the time, the best 3D card was 3dfx VooDoo with 2 GPUs, than Voodoo 2 with *3* GPUs, than the prototype Voodoo 5 with *4* GPUs, etc. Everybody was talking about the card, not about how many GPUs had.
> The same thing with this card. If they will sale this, probably nvidia will became the producer with the fastest graphics card in the market. That is all that matters.



this is my answer to that.



newtekie1 said:


> There are a couple problems with that possition.
> 
> The main one, and the one that caused me to stop using dual-card solutions, is that when a new game comes out Crossfire and SLi both have to be optimized for it before they really work.  So while BC2 might be one of the better examples of a game that gives very good performance scaling with SLi and Crossfire, other games do not yeild that great of performance scaling especially when they are first released.
> 
> ...



The GTX 580 is what, a few % behind a 5970? I'd take that gap any day for the above reasons.


----------



## TAViX (Nov 22, 2010)

I think that was valid a few years ago; nowadays 90% of the good games released can take advantage of Crossfire/SLI (the other 10% are crappy console ports that don't even use quad-core CPUs). I agree that every month there are more and more optimizations for dual/tri-card setups, but not more than 5-15%.


----------



## TheMailMan78 (Nov 22, 2010)

TAViX said:


> *I think this that was valuable a few years ago, nowadays 90% (10% percent are crappy console ports anyways, that don't even use Quad Core Procs)* of the good games released can take Crossfire/SLI. I agree that every month there are more and more optimizations for dual/tri cards but not more than 5-15%.



Ever play BC2 or Metro? Try running one of those without a quad.


----------



## SabreWulf69 (Nov 22, 2010)

Quite Easily


----------



## TheMailMan78 (Nov 22, 2010)

SabreWulf69 said:


> Quite Easily



You must have very low standards.


----------



## SabreWulf69 (Nov 22, 2010)

I'll take that with a pinch of salt before having to report you again for flaming. Alas, I don't know why it takes so much for you to run it, but as you can see from these shots, which I just took, I have no trouble whatsoever playing at max graphics at 1680x1050 in either. (These are also action shots of me trying to get the frame rate to plummet.)

BC2:






Metro 2033:





Then there's all of these located here -->  My Personal Generalized Benchmark Screenshots

Not really sure what low standards relevant to this you're talking about.


----------



## D4S4 (Nov 22, 2010)

I really can't wait to see the cooling solution for this. And I'll be disappointed if they simply slap on a waterblock.


----------



## SabreWulf69 (Nov 22, 2010)

Indeed, I hope it is as innovative as the cooler was for the GTX 580 and the VaporX ones for those ATI cards. I can't wait either, should be interesting


----------



## TheMailMan78 (Nov 22, 2010)

SabreWulf69 said:


> I'll take that with a pinch of salt before having to report you again for flaming. Alas I don't know why it takes you so much to run it, but as you can see by these which I just took, I have no trouble what so ever playing at max graphics at 1680x1050 on either. (These are also action shots of me trying to get the frame rate to plummet).
> 
> BC2:
> http://img.techpowerup.org/101122/BFBC2Game 2010-11-23 04-52-00-46.jpg
> ...



No one is flaming. You are just sensitive. But if you feel like reporting me go right ahead. Just make sure you send the complaint to Black Panther. I haven't got an infraction from her yet.

Anyway 39 fps with nothing going on (try multiplayer) kinda sucks man.


----------



## SabreWulf69 (Nov 22, 2010)

You got any screenshots? It works fine for me; as mentioned, those are action shots in the middle of me shooting people, with snow going on, me shooting at them and them shooting at me.


----------



## TheMailMan78 (Nov 22, 2010)

SabreWulf69 said:


> You got any screenshots? It works fine for me



I'll post some tonight.


----------



## TAViX (Nov 22, 2010)

I find games playable from 30 fps just fine. And from 40 fps I cannot tell the difference...


----------



## KashunatoR (Nov 23, 2010)

You need at least a constant 60 fps for a game to run perfectly. The funny thing is no dual-GPU card can deliver that so far due to microstuttering (it's really that bad with both the HD 5970 and the GTX 295; trust me, I've tested them a lot). That's why I always stick to the best single GPU, preferably Nvidia, since I hate software problems, and I call myself an enthusiast, not giving a rat's ass about bang/buck. Hence the GTX 580 replacing my current GTX 480 the day after tomorrow.  The sad thing, though, is that I will hardly notice any gameplay improvement, since the GTX 480 is already overkill for the games I play.


----------



## SabreWulf69 (Nov 23, 2010)

I'm gonna love seeing this new card walk all over these framerates with ease. It'll just be so damned impressive. I hope these shots are a bit better, then. Now I've shown you mine; will you show me yours?

BC2 With nothing going on:





BC2 With lots going on:





Metro 2033 loaded screen:





(All full graphics @ 1680x1050)


----------



## Synci (Nov 23, 2010)

MikeX said:


> 600 watt card...
> 10 years later 2000watt card?



A 2000-watt card... so when used, its cooler should be placed outside your room just like an air conditioner's... that's funny.


----------



## TAViX (Nov 23, 2010)

KashunatoR said:


> you need at least constant 60 fps for a game to run perfectly. funny thing is no dual gpu can deliver that so far due to microstuttering(it's really that bad with both hd 5970 and gtx 295, trust me, i've tested them a lot). that's why I always stick to the best single gpu, preferably nvidia since i hate software problems and i call myself an enthusiast not giving a rat's ass about bang/buck. hence the gtx 580 replacing the current gtx 480 the day after tomorrow  the sad thing though is that i will hardly notice any gameplay improvement since gtx 480 is already overkill for the games i play



Do you want to bet that you cannot feel the difference between 40 fps and 60 fps in... let's say GRID, or any shooter that doesn't require ultra-precise sniping and stuff???


----------



## KashunatoR (Nov 23, 2010)

TAViX said:


> Do you want to bet that you cannot feel the difference between 40 fps and 60 fps in... let's say GRID, or any shooter that doesn't require ultra-precise sniping and stuff???



You may be right about racing games, but try playing BFBC2 online at 40 fps and you'll be obliterated. You'll never get steady fps in online gaming; that's why you need as many fps as possible, so that your minimum framerate is 60.


----------



## bear jesus (Nov 23, 2010)

KashunatoR said:


> You may be right about racing games, but try playing BFBC2 online at 40 fps and you'll be obliterated. You'll never get steady fps in online gaming; that's why you need as many fps as possible, so that your minimum framerate is 60.



I think the point here is really about minimum frame rates. If the average is 40 to 60, then it's very easy to get lows below 30, and in most of the games I play, if there are drops from 60 (I always have vsync on) to 30 or below, the difference is *very* noticeable... Luckily that only happened on my 4870; my 6870s make everything silky smooth no matter what.


----------



## MikeX (Nov 23, 2010)

Synci said:


> 2000watt card... so when used its cooler should be placed outside your room just like the air conditioner ... that's funny.



It could occupy 6 PCI slots for cooling.


----------



## SabreWulf69 (Nov 23, 2010)

How can an average be 40-60? An average, at least mean-wise, is a single value...


----------



## bear jesus (Nov 23, 2010)

SabreWulf69 said:


> how can an average be 40-60? an average at least mean wise is a set value...



I meant somewhere in that range, not literally 40 through to 60: as in 40, 45, 50, 55, 60 or anything in between.


----------



## SabreWulf69 (Nov 23, 2010)

The number of values is the divisor: add all the values together, then divide by how many there are to get the mean average. As in: two values, value 1 = 40, value 2 = 60, so 40 + 60 = 100, then divide that by 2 and you get 50. 50 is then the average framerate.
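That arithmetic is a one-liner in Python, if anyone wants to check their own numbers (the `mean_fps` name is just for illustration):

```python
def mean_fps(samples):
    """Mean average: sum of the frame-rate samples divided by how many there are."""
    return sum(samples) / len(samples)

print(mean_fps([40, 60]))  # prints 50.0
```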


----------



## bear jesus (Nov 23, 2010)

SabreWulf69 said:


> Amount of values = the divider value, then use this number as the divisor of all the values added together to get the mean average. As in 2 values, Value 1=40, Value 2=60, amount of values = 2, so 40+60=100, then divide that by 2 and you get 50. 50 Is then the average framerate



Yes, I know how to work out an average. 

I just meant an average anywhere within the range of 40 to 60; any value in that range would apply to what I was saying, I just did not want to be too specific and pick one.


----------



## TAViX (Nov 24, 2010)

KashunatoR said:


> you may be right about racing games but try playing bfbc2 online at 40 fps and you'll be obliterated . you'll nevere get steady fps in online gaming, that's why you need as many fps as possible so that you're minimum framerate is 60.



Do you have any other examples except BFBC2, BFBC2, BFBC2?!? Why in the world do you think that everybody is playing BFBC2 or similar online games?!? Some are playing Starcraft 2, CS, Unreal/Quake, etc., and that's it for online gaming... and the rest are barely playing online at all. Personally I enjoy single-player games like racing games, Mass Effect 2, Dragon Age, GTA, FreeSpace 2, HAWX, Lock On, Starcraft, C&C, etc. Online games are just a bonus...


----------



## mdsx1950 (Nov 24, 2010)

TAViX said:


> Do you have any other examples besides BFBC2, BFBC2, BFBC2?! Why in the world do you think everybody is playing BFBC2 or similar online games? Some are playing StarCraft 2, CS, Unreal/Quake, etc. for online gaming... and the rest barely play online at all. Personally I enjoy single-player games like racing games, Mass Effect 2, Dragon Age, GTA, FreeSpace 2, HAWX, Lock On, StarCraft, C&C, etc. Online games are just a bonus....



Yeah, Bad Company 2 is old school now. I don't know why some people only talk about that game!


----------



## SabreWulf69 (Nov 24, 2010)

Neither do I; I didn't bring it up. God knows. *shrugs*

Still waiting for those shots....


----------



## pantherx12 (Nov 24, 2010)

SabreWulf69 said:


> Neither do I; I didn't bring it up. God knows. *shrugs*
> 
> Still waiting for those shots....





What other settings were you using in those screenshots?

(Not "dissin'", but the edges are really blurry, and the action shot has a LOT less going on than what I used to have in that section (unless you took the shot early : ]) (I had dust clouds and explosions and debris coming out the wazoo). 

Also, I've never played Metro, but is this really the game that brings systems to their knees now? The floor looks like ass  It's just a texture over a really, really low-poly mesh (there is no depth to the textures, so no tessellation or that other 3D texture thing I forget the name of  )


----------



## SabreWulf69 (Nov 24, 2010)

Everything is at max. Yeah, I had a shot with all the debris, but I thought that one was better action-wise for testing the CPU. My card isn't DX11 (it's a GTX 285), so no DX11 effects, and the resolution is 1680x1050. I can even Fraps a video and upload it to YouTube if need be, to show anyone. Yeah, I never saw what the big deal about Metro 2033's graphics was; cool sort of game, but not that impressive graphically. The point is I can run them fine with a dual-core.


----------



## TheMailMan78 (Nov 24, 2010)

SabreWulf69 said:


> Neither do I; I didn't bring it up. God knows. *shrugs*
> 
> Still waiting for those shots....



Been working... A LOT. Plus I forgot. I'll try to post them tonight.


----------



## TAViX (Nov 25, 2010)

Does this card come out before Xmas??


----------



## CDdude55 (Nov 25, 2010)

TAViX said:


> Does this card come out before Xmas??



Doubt it'll be out anytime soon.


----------



## erocker (Dec 30, 2010)

Some posts have been removed. Please keep your posts on the topic of this news article.


----------

