# GF100 512 Core Graphics Card Tested Against GeForce GTX 480



## btarunr (Aug 9, 2010)

NVIDIA seems to have overcome its initial hiccups with the GF100 graphics processor, and could release a new graphics card that makes use of all 512 CUDA cores and 64 TMUs on the GPU. The GeForce GTX 480 was initially released as the top SKU based on the GF100, with 480 of its 512 CUDA cores enabled. What NVIDIA will call the new SKU is subject to some speculation. GPU-Z screenshots show that the 512-core model has the same device ID (and hence the same name, GeForce GTX 480), leading us to believe this is a specifications update to the same SKU, à la GeForce GTX 260 (216 SP), but it seems possible that release-grade models could carry a different device ID and name.

Expreview carried out a couple of tests on the 512-core "GTX 480" graphics card and compared it to the 480-core model that's out in the market, using NVIDIA's GeForce 258.96 drivers. In 3DMark Vantage's Extreme preset, the 512-core card got a GPU Score of 10,072 points against 9,521 points for the 480-core card. The additional TMUs had an evident impact on texture fillrate: 41.55 GTexel/s for the 512-core card against 38.82 GTexel/s for the 480-core card.



 

 




In the second test, Crysis Warhead with the Enthusiast preset at 1920 x 1080 px and 8x AA, the 512-core card churned out 34.72 fps, while the 480-core card trailed at 32.96 fps. In this short bench, the 512-core GF100 card is 5~6% faster than the GeForce GTX 480. If NVIDIA manages to release the new SKU at the same price-point as the GTX 480, as it did with the GTX 260-216, it will further increase NVIDIA's competitiveness against AMD's ATI Radeon HD 5970, which is still the fastest graphics SKU on the market. Below are screenshots comparing the scores of both cards.
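As a rough check, the quoted scores work out to the gains below (a quick sketch of the arithmetic, not part of Expreview's methodology):

```python
# Relative gains of the 512-core card over the 480-core GTX 480,
# using the scores quoted above (Expreview, GeForce 258.96 drivers).
results = {
    "3DMark Vantage Extreme (GPU Score)": (10072, 9521),
    "Texture fillrate (GTexel/s)": (41.55, 38.82),
    "Crysis Warhead 1080p 8xAA (fps)": (34.72, 32.96),
}

for test, (sp512, sp480) in results.items():
    gain = (sp512 / sp480 - 1) * 100
    print(f"{test}: +{gain:.1f}%")
```

The three tests land at roughly +5.8%, +7.0%, and +5.3% respectively, which is where the 5~6% figure comes from.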



 





----------



## LAN_deRf_HA (Aug 9, 2010)

I thought in the last news post about this it was pointed out that this is not intended for a mainstream release. It was just some premium card from some manufacturer, along the lines of the custom cards Asus does.


----------



## HillBeast (Aug 9, 2010)

Um, what was it I was going to say? Oh right, "BWAHAHAHAHAHA to the fools who bought the GTX480 with the assumption it would be the most powerful card from this generation. Sucks to be you. The 512SP version is better."

Of course, by no means is this a good card (as it'll use even MORE power than the GF100's already-high TDP), but this is going to be hands down the most powerful (in two ways) single-core GPU ever.


----------



## DanishDevil (Aug 9, 2010)

I don't know why they're wasting their time with GF100 anymore (unless they have a lot more silicon to go through still). Where's the GTX 495 (2x GTX 460s)?


----------



## theonedub (Aug 9, 2010)

I agree, GF104 for now and when a die shrink is possible we can revisit a new and improved GF100.


----------



## a_ump (Aug 9, 2010)

Hear, hear ^

GF104 has already proved itself better: the GTX 460 (GF104) is better than the GTX 465 (GF100)


----------



## phanbuey (Aug 9, 2010)

a_ump said:


> Hear, hear ^
> 
> GF104 has already proved itself better: the GTX 460 (GF104) is better than the GTX 465 (GF100)



The 465 is not really GF100, it's totally crippled. The 460 is good, though; they should have released it with 384 cores.


----------



## LAN_deRf_HA (Aug 9, 2010)

DanishDevil said:


> I don't know why they're wasting their time with GF100 anymore (unless they have a lot more silicon to go through still). Where's the GTX 495 (2x GTX 460s)?



Well, as I said, I'm pretty sure NVIDIA isn't wasting their time on this; someone else is. And I believe Asus is working on a 480 X2, so there'll be one even if NVIDIA doesn't do it.


----------



## DanishDevil (Aug 9, 2010)

Highest TDP card ever made?


----------



## btarunr (Aug 9, 2010)

DanishDevil said:


> Where's the GTX 495 (2x GTX 460s)?



GTX 49x is 2x GTX 475, FYI. Those are lined up for winter.


----------



## RONX GT (Aug 9, 2010)

When will this MONSTER show up??

I guess the GTX 470 is the best of the GF100 pack. A dual solution could be interesting from this (470)... hmmm... not thinking of the TDP though..


----------



## DanishDevil (Aug 9, 2010)

Oh dear. Why in the world are they even going to attempt mass-producing a dual GF100 card? Well maybe we'll see a GTX 485 that's dual 460s.


----------



## btarunr (Aug 9, 2010)

DanishDevil said:


> Oh dear. Why in the world are they even going to attempt mass-producing a dual GF100 card? Well maybe we'll see a GTX 485 that's dual 460s.



There is no dual GF100 NVIDIA SKU. GTX 475 isn't based on GF100.


----------



## VulkanBros (Aug 9, 2010)

BIOS version 70.00.21.00.00 vs 70.00.21.00.02 ..... hmmm

Where can I get this BIOS ?? 

... Maybe my very, very expensive 480 CUDA core card can turn into a 512 CUDA core card....  

AND... get me a CoolIT OMNI... my card is an egg-boiler


----------



## NdMk2o1o (Aug 9, 2010)

phanbuey said:


> The 465 is not really GF100, it's totally crippled. The 460 is good, though; they should have released it with 384 cores.



It is still based on the GF100 core, the same as the 470 and 480. 

And this is slightly disappointing: 5-6% faster, what's the point?



btarunr said:


> There is no dual GF100 NVIDIA SKU. GTX 475 isn't based on GF100.



You meant 465, right?


----------



## DanishDevil (Aug 9, 2010)

btarunr said:


> There is no dual GF100 NVIDIA SKU. GTX 475 isn't based on GF100.



I fully retract my previous statements


----------



## btarunr (Aug 9, 2010)

NdMk2o1o said:


> You meant 465, right?



Wrong, GTX 475. 

There's already an SKU called GTX 465, and that's based on GF100.


----------



## NdMk2o1o (Aug 9, 2010)

DanishDevil said:


> Oh dear. Why in the world are they even going to attempt mass-producing a dual GF100 card? Well maybe we'll see a GTX 485 that's dual 460s.





btarunr said:


> Wrong, GTX 475.
> 
> There's already an SKU called GTX 465, and that's based on GF100.



Ah OK, my bad, I had to reread your first comment about the GTX 475 being lined up for winter. 

Hmm, a GTX 475 that isn't based off GF100, that should be interesting, and hopefully means they will have kept its performance between a 470 and 480 while bringing down the TDP. GF104?


----------



## Easo (Aug 9, 2010)

VulkanBros said:


> Where can I get this BIOS ??



Are you sure that reflashing would enable all shaders?


----------



## VulkanBros (Aug 9, 2010)

Easo said:


> Are you sure that reflashing would enable all shaders?



No.... I don't know if that would work... pure speculation... but it would be nice if it did


----------



## D4S4 (Aug 9, 2010)

Cores are most probably locked at the hardware level, therefore... I have a Radeon X1800 GTO in my rig that has 4 pipes locked at the hardware level, so there was no soup for me either.


----------



## vagxtr (Aug 9, 2010)

I don't think this will be a real product. Rather some Limited Edition vaporware like ATi's XT800PE (if anyone remembers that today), and highly overpriced on top of the already insanely priced GTX480 BSE.



DanishDevil said:


> I don't know why they're wasting their time with GF100 anymore (unless they have a lot more silicon to go through still).



They're counting on the high probability of always-supportive fans that are out there and ready to buy YACP. And btw, I disagree that the GTX480-512 should be a higher-TDP product than the original GTX480. There's a pretty huge gap between the real GTX470's 210W TDP and the GTX480's 260W+ TDP, and these products are top-binned silicon they collect from GTX480/470 production.



> Where's the GTX 495 (2x GTX 460s)?



It should be there when ATi releases their HD 6000 series, don't you worry. 
Btw, the GTX495 should be based on the same GF104 chip, but on two GTX475s with all 8 cores fully functional (8x48 SP) and 64 TMUs (like the original GF100 flavour). So there should be plenty of new products from envydia this fall.



LAN_deRf_HA said:


> Well, as I said, I'm pretty sure NVIDIA isn't wasting their time on this; someone else is. And I believe Asus is working on a 480 X2, so there'll be one even if NVIDIA doesn't do it.



Would that be a GTX465 X2? Or how do you put two chips that already waste 250W+ onto the same card? Even if the PCBs are sandwiched together, it should be at least a triple-slot edition with a 450W TDP. So I think Asus is selling c-r-a-p yet again and collecting some freebie advertising as their fans brag about that _*never to be released card*_ all around the net


----------



## erasure (Aug 9, 2010)

*key, kVidia*

key, kVidia!
we still wait Dual GPU Fermi


----------



## Bjorn_Of_Iceland (Aug 9, 2010)

erasure said:


> key, kVidia!
> we still wait Dual GPU Fermi



wut?


----------



## the54thvoid (Aug 9, 2010)

Why?

Why oh why?

Nvidia already has the fastest single-core GPU (and the hottest, loudest, etc.). Why would they use the GF100 for this? For a 6-7% increase....

Unless it's a partner doing it and not NV?

Although if Southern Islands makes it out this year, perhaps this is NV's attempt to dull down ATI's 5xxx series revision.

Two words: performance / watt. That's all that counts. No point having a 6-7% faster card if it's technologically backwards in respect of power draw.


----------



## ebolamonkey3 (Aug 9, 2010)

Really can't wait for the GTX 475 and GTX 49x this winter!!!

Perhaps Nvidia is releasing them during the winter so they won't run so hot? 

I kid.... I hope.


----------



## LAN_deRf_HA (Aug 9, 2010)

vagxtr said:


> Would that be a GTX465 X2? Or how do you put two chips that already waste 250W+ onto the same card? Even if the PCBs are sandwiched together, it should be at least a triple-slot edition with a 450W TDP. So I think Asus is selling c-r-a-p yet again and collecting some freebie advertising as their fans brag about that _*never to be released card*_ all around the net



Yes cause surely they haven't done something like that before, and have many raging fans


----------



## KainXS (Aug 9, 2010)

I'm waiting for a die shot before I believe anything. Why would Nvidia release the 512sp version when we know that the tweaked architecture on the GF104 is better than the GF100's?

That, or they got a rare 512sp sample.

But Expreview did have Wei Hang's GTX480 and said it was 512sp when the one Wei Hang had actually had 480sp, so I'm not trusting them.


----------



## crazyeyesreaper (Aug 9, 2010)

Same, not gonna trust this yet. And oooh, a 2 fps gain in Crysis, so impressed. With a peak TDP of 320W on a GTX 480, I'm willing to guess the 512sp will hit a 340-350W TDP on a single-GPU card, reaching nearly 4870 X2 power consumption. lol, fail / sarcasm


----------



## HillBeast (Aug 9, 2010)

crazyeyesreaper said:


> Same, not gonna trust this yet. And oooh, a 2 fps gain in Crysis, so impressed. With a peak TDP of 320W on a GTX 480, I'm willing to guess the 512sp will hit a 340-350W TDP on a single-GPU card, reaching nearly 4870 X2 power consumption. lol, fail / sarcasm



They really are doing this just to have the undisputed champion of performance. And heat. And power. And noise. And unreliability due to the above.


----------



## crazyeyesreaper (Aug 9, 2010)

Well, more power to them, but from what reviews I can find around the net, Crysis is the wrong game to hype performance gains with, as at 8xAA the 5870 is faster than even this 512sp part, let alone the 5970, etc. lol, they should have used a different game to give more hype, lmao. And if they're not ahead, they're still equal, so yeah, moot point: a whopping 2 fps that you can get from overclocking the CPU alone, let alone the GPU. And overclocking this thing, I'd bet we'd see people hit a 400W peak draw.


----------



## $ReaPeR$ (Aug 9, 2010)

everything depends on the price....


----------



## HillBeast (Aug 9, 2010)

crazyeyesreaper said:


> Crysis is the wrong game to hype performance gains with, as at 8xAA



Crysis isn't actually a hardcore, epic-looking game like people think; it's just badly written. It runs just as badly on my HD5870 as it did on my GTX285, as it did on my 9800GX2, as it did on my 8800 Ultra. Fail. To be honest, it seems to get slower every time I play it.


----------



## CDdude55 (Aug 9, 2010)

I'm already running 2x 470s, so I really don't care what they have lined up next within the Fermi architecture. I can deal with the other cards having a few more stream processors, etc.


----------



## LAN_deRf_HA (Aug 9, 2010)

HillBeast said:


> Crysis isn't actually a hardcore, epic-looking game like people think; it's just badly written. It runs just as badly on my HD5870 as it did on my GTX285, as it did on my 9800GX2, as it did on my 8800 Ultra. Fail. To be honest, it seems to get slower every time I play it.



I thought everyone got over that assumption already? Crysis isn't poorly coded; the engine does more than other game engines do. It's just not as apparent to the average person. If you pay any sort of attention to it, you'll see a huge gulf in quality between it and an Unreal-powered game. Even trying very hard to optimize Warhead, they ultimately had to reduce quality to get a few extra frames. It's not a skill issue; it's just what you get from a no-corners-cut engine. Even with all the time they'll have put into Crysis 2, it will only be mildly more efficient.


----------



## dalekdukesboy (Aug 9, 2010)

I already have this card...2 of them...512 graphics cards


----------



## dalekdukesboy (Aug 9, 2010)

Oh... I'm sorry, I have the 8800 GTS 512... not a GTX 480 512. Guess that's not quite the same, is it? LOL. At least I don't have to apply for a new power grid around my house to run these cards... though I admit this is an interesting idea. But GF100 needs to die; it was an experiment, and GF104 was the winning result. I agree that till they do a die shrink or modify the GF100 architecture, it's just way too hot and way too inefficient, as well as too expensive, for me to recommend it to someone.


----------



## Bjorn_Of_Iceland (Aug 9, 2010)

dalekdukesboy said:


> though I admit this is an interesting idea. But GF100 needs to die; it was an experiment, and GF104 was the winning result. I agree that till they do a die shrink or modify the GF100 architecture, it's just way too hot and way too inefficient, as well as too expensive, for me to recommend it to someone.


Multiple GPCs (GF100) vs. fewer GPCs but more shader processors crammed per SM (GF104)... The former was thought to have better tessellation performance that way. So far, the only thing reacting well to it was the Heaven benchmark, and there aren't a lot of tessellated games anyway. I say they should stick to cramming SPs per SM for now..


----------



## EastCoasthandle (Aug 9, 2010)

crazyeyesreaper said:


> Same, not gonna trust this yet. And oooh, a 2 fps gain in Crysis, so impressed. With a peak TDP of 320W on a GTX 480, I'm willing to guess the 512sp will hit a 340-350W TDP on a single-GPU card, reaching nearly 4870 X2 power consumption. lol, fail / sarcasm


It looks to me like they may be trying to push a 512-core variant because they know AMD is releasing the 6000 series. It was said a long time ago that the difference between 480c and 512c would be about 5%, so seeing that come true wasn't a shocker. As for the power consumption, it would go up by more than the actual performance increase over the 480c. In a word, it's simply underwhelming when you consider heat and power draw. I seriously doubt it will cost the same as the 480c, but we will see.


----------



## HillBeast (Aug 9, 2010)

LAN_deRf_HA said:


> I thought everyone got over that assumption already? Crysis isn't poorly coded; the engine does more than other game engines do. It's just not as apparent to the average person. If you pay any sort of attention to it, you'll see a huge gulf in quality between it and an Unreal-powered game. Even trying very hard to optimize Warhead, they ultimately had to reduce quality to get a few extra frames. It's not a skill issue; it's just what you get from a no-corners-cut engine. Even with all the time they'll have put into Crysis 2, it will only be mildly more efficient.



I never mentioned Unreal. Unreal looks nice, and I do admit CryEngine looks better, but it's not the best-looking game engine IMHO. Unigine, from what I've seen of the demos, looks better to be honest, and that's doing DX11, which is slower, and it still goes faster than Crysis.

CryEngine may be doing a lot, but it's still not a great engine.


----------



## dalekdukesboy (Aug 9, 2010)

Bjorn_Of_Iceland said:


> Multiple GPCs (GF100) vs. fewer GPCs but more shader processors crammed per SM (GF104)... The former was thought to have better tessellation performance that way. So far, the only thing reacting well to it was the Heaven benchmark, and there aren't a lot of tessellated games anyway. I say they should stick to cramming SPs per SM for now..



And that takes a lot for someone to say when I see he has a GTX480 of all things in his sig rig!! lol. Admittedly performance is great; it's just that I have a budget and hot summers, and these two G92s in SLI warm this room up considerably, and even as a pair they're nowhere near what a reference GTX480 does for heat/power consumption! I'd be tempted to get the Zotac AMP version, which at least is well cooled, but you still have all that heat created, which vents out of the case into your room, and the wall socket would still be on fire from it regardless of a good cooler, lol.


----------



## EastCoasthandle (Aug 9, 2010)

Hmm, CB says that a 480 offers 24 FPS in Crysis Warhead at that resolution and AA. Even CB's 258.96 review didn't show any significant improvements for that game. In any case, we will see if this 512c is true or not.


----------



## dalekdukesboy (Aug 9, 2010)

You know what we/Nvidia REALLY need is to release the GF104 with ALL its cores unlocked... I believe that's the GTX 475 that's rumored, right? It may be interesting to see what it can do.


----------



## OnBoard (Aug 9, 2010)

Hmm, the core revision is blurred, so will this be A4 or B1?)

Anyhow, it looks like the NVIDIA lineup will be GTX 460 (GF104), GTX 475 (GF104), GTX 485 (GF100 rev A/B), GTX 49x (2x full GF104).

The GTX 465 is EOL, the GTX 470 will be EOL as soon as they sell out, and the GTX 480 will be EOL at the end of the year, but will most likely be around for a while at nice discounts, like the GTX 280 was after the GTX 285 release.

What I don't know is what they'll do with all the GF100 512-shader cards that don't cut it, as there won't be a SKU for them.


----------



## FreedomEclipse (Aug 9, 2010)

vagxtr said:


> I don't think this will be a real product. Rather some Limited Edition vaporware like ATi's XT800PE (if anyone remembers that today)



Wasn't that supposed to be the X850 XT PE?? I've not even heard of an XT800PE, but obviously they did exist, as Google has thrown up some references, mainly people having issues with games on their card. Umm, I know the X850 XT PE was like vapourware, but I still have one... I had the 2nd-fastest overclocked X8x0 card on Guru3D back in the day; the guy who beat me had a volt-modded X800XT. Ahh, such fond memories....


----------



## CDdude55 (Aug 9, 2010)

20 bucks says this will be called ''GTX 480 Core 512''.


----------



## a_ump (Aug 9, 2010)

Hmmm, you know, at stock we know who wins. But the people that'll be buying this are enthusiasts, who probably like to overclock, and I can't see this GTX 480 512 out-clocking the 480-core GTX. Ergo, I think that when overclocking's taken into account, they'll probably be even.


----------



## DarthCyclonis (Aug 9, 2010)

theonedub said:


> I agree, GF104 for now and when a die shrink is possible we can revisit a new and improved GF100.




Absolutely. I would not want this with the current size of the GF100 die. A die shrink and a revision have done a lot, and the GF104 is proof. However, by the time they release a revised GTX480, the HD 6xxx series will most likely be out.


----------



## phanbuey (Aug 9, 2010)

the54thvoid said:


> Why?
> 
> Why oh why?
> 
> ...




Everyone is making the assumption that this will be hotter and louder. Clearly this is a different stepping, one that has less leakage and therefore less draw than the original GF100. Steppings can make a huge difference in power draw, especially when manufacturing defects are the primary cause of this draw.

If anything, the power draw may even stay the same or be lower than on the 480-core version of Fermi.


----------



## Whitey (Aug 9, 2010)

Do you think the cooler will have heatpipes coming out the side still ?

Or something like the Galaxy vapour chamber cooling ?


----------



## Benetanegia (Aug 9, 2010)

OnBoard said:


> What I don't know is what they'll do with all the GF100 512-shader cards that don't cut it, as there won't be a SKU for them.



Quadro and Tesla cards, which are both based on 448 SPs AFAIK. They are actually selling quite a few of them.



EastCoasthandle said:


> As for the power consumption, it would go up by more than the actual performance increase over the 480c.



According to empirical data around the net on how Fermi behaves, it is quite the opposite. Power draw depends heavily on the clocks and very little on enabled/disabled parts.

For example, the GTX465 consumes almost as much as the GTX470 (~20W difference, which is 10%) despite having 25% of the core disabled, but the GTX470 on the other hand consumes almost 100W less (a 50% difference) than the GTX480, although it only has 7% of the shader cores and 20% of the ROPs disabled. That discrepancy comes from the clock difference, and anyone who has ever OCed a GTX470 knows that. Conclusion: the power draw difference on the 512 part would be negligible.
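That argument can be put in numbers. The figures below are the ones quoted in the post; the linear unit-to-power scaling is an illustrative assumption, not a measurement:

```python
# Benetanegia's argument in numbers: scale the GTX465 -> GTX470 ratio of
# power delta to enabled-silicon delta up to the 480 SP -> 512 SP step.
units_delta_465_to_470 = 0.25   # GTX470 has ~25% more of the core enabled
power_delta_465_to_470 = 0.10   # ...but draws only ~10% more power

# Extra power per unit of extra enabled silicon, at equal clocks
power_per_unit = power_delta_465_to_470 / units_delta_465_to_470   # 0.4

units_delta_480_to_512sp = 0.035   # 512 SP card: ~3-4% more silicon enabled
extra_power = power_per_unit * units_delta_480_to_512sp
print(f"Estimated extra power draw: {extra_power * 100:.1f}%")   # ~1.4%
```

Under that scaling, the fully enabled part would draw only on the order of 1-2% more at the same clocks.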


----------



## EastCoasthandle (Aug 10, 2010)

Benetanegia said:


> According to empirical data around the net on how Fermi behaves, it is quite the opposite. Power draw depends heavily on the clocks and very little on enabled/disabled parts.
> 
> For example, the GTX465 consumes almost as much as the GTX470 (~20W difference, which is 10%) despite having 25% of the core disabled, but the GTX470 on the other hand consumes almost 100W less (a 50% difference) than the GTX480, although it only has 7% of the shader cores and 20% of the ROPs disabled. That discrepancy comes from the clock difference, and anyone who has ever OCed a GTX470 knows that. Conclusion: the power draw difference on the 512 part would be negligible.


That's a long way of saying it will consume more. In any case, we will find out all the details if/when such a video card comes out and is reviewed properly.


----------



## Benetanegia (Aug 10, 2010)

EastCoasthandle said:


> That's a long way of saying it will consume more. In any case, we will find out all the details if/when such a video card comes out and is reviewed properly.



No, that's a long way of saying it will *not* consume more, unless it's clocked higher. I was responding to the claim that the power consumption increase will be higher than the performance increase, which is not true.

And if it's based on a revised part, as has been suggested (revision number blurred), anything could happen. For example, the power draw of such a part could have the same perf/watt difference compared to the current GTX480 as the GTX460 has with the GTX465, which would make it consume less than the GTX470. At this point anything is possible.


----------



## EastCoasthandle (Aug 10, 2010)

Benetanegia said:


> No, that's a long way of saying it will *not* consume more, unless it's clocked higher. I was responding to the claim that the power consumption increase will be higher than the performance increase, which is not true.
> 
> And if it's based on a revised part, as has been suggested (revision number blurred), anything could happen. For example, the power draw of such a part could have the same perf/watt difference compared to the current GTX480 as the GTX460 has with the GTX465, which would make it consume less than the GTX470. At this point anything is possible.


If the information about it so far is true, it will consume more (final clocks, etc.). We will see (that's if it does actually come out). No need to get upset about it.


----------



## Benetanegia (Aug 10, 2010)

EastCoasthandle said:


> If the information about it so far is true, it will consume more (final clocks, etc.).



And that again is a bold statement that contradicts every bit of information we have about GF100. All the info says it will have exactly the same clocks as the 480 SP model. Like I said, there's little difference in power draw between the GTX465 and 470, and there's a 25% difference in enabled silicon; to be precise, there's a 10% power difference. *The 512 SP version will only have 3-4% more silicon enabled*, so do the math: the power difference would be 1-2%. And that's assuming everything in the card itself is the same; a revised PWM, revised PCB, or revised cooler would all have a far bigger impact on power consumption than the chip itself.

So no, you just cannot affirm that it will consume more based on the fact that it will have 4% more silicon enabled, because the slightest change to the card's design would make a much greater difference. And that's assuming these pics are not related to a new revision that includes all the optimizations made on GF104.


----------



## DaedalusHelios (Aug 10, 2010)

the54thvoid said:


> Why?
> 
> Why oh why?
> 
> ...



Performance is performance..... power draw is a different concern. Electricity is not expensive where I live. If "GPU A" draws even 200 watts more than "GPU B", it isn't going to cost me more than maybe $5 a month (half the cost of a nice sandwich) when gaming with it 4 hours a day. And I don't think it is a 200-watt difference, so it isn't a big deal. Lower power draw would have been nice, though. I'm also assuming you don't have a poorly ventilated case or a low-wattage PSU to worry about. I don't have any GTX 4xx cards, and I have many ATi 5xxx series cards. It's not like I think Nvidia is doing a bad job. It is competition, which is good for the consumer. We don't have to take sides like they are sports teams.
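The arithmetic behind that figure can be sanity-checked. The 200 W delta and 4 hours/day are from the post; the $0.12/kWh electricity rate below is an assumed illustrative value:

```python
# Sanity check of the "maybe $5 more a month" electricity estimate.
extra_watts = 200     # assumed extra draw of "GPU A" over "GPU B"
hours_per_day = 4     # daily gaming time from the post
days_per_month = 30
rate_per_kwh = 0.12   # assumed rate; actual prices vary widely by region

extra_kwh = extra_watts / 1000 * hours_per_day * days_per_month   # 24 kWh
extra_cost = extra_kwh * rate_per_kwh
print(f"Extra cost: ${extra_cost:.2f}/month")
```

At this assumed rate the extra cost is under $3 a month, and even at $0.20/kWh it stays below the $5 ceiling the post mentions.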


----------



## EastCoasthandle (Aug 10, 2010)

Benetanegia said:


> And that again is a bold statement that contradicts every bit of information we have about GF100. All the info says it will have exactly the same clocks as the 480 SP model. Like I said, there's little difference in power draw between the GTX465 and 470, and there's a 25% difference in enabled silicon; to be precise, there's a 10% power difference. *The 512 SP version will only have 3-4% more silicon enabled*, so do the math: the power difference would be 1-2%. And that's assuming everything in the card itself is the same; a revised PWM, revised PCB, or revised cooler would all have a far bigger impact on power consumption than the chip itself.
> 
> So no, you just cannot affirm that it will consume more based on the fact that it will have 4% more silicon enabled, because the slightest change to the card's design would make a much greater difference. And that's assuming these pics are not related to a new revision that includes all the optimizations made on GF104.



Why are you arguing?  I have no need to argue with you.  But I can show you results.
source.  
Power Consumption Idle
512c.....158w
480c.....141w

Power Consumption Load
512c.....644w
480c.....440w

Now, it would have been silly of me to have gotten into an argument with you when no information was available at the time. However, with this first peek, I will await more reviews for confirmation.


----------



## MadMan007 (Aug 10, 2010)

All the gamers here have lost sight of the fact that the full Fermi architecture does play a huge role in NV's strategic product outlook; it's just in HPC, not gaming. Perhaps going forward NV will be smart and separate the two from the get-go, even though that may be worse from a production standpoint.


----------



## D4S4 (Aug 10, 2010)

Now that power draw is plain epic. 

But I don't believe it came from only unlocking the rest of the chip.


----------



## HillBeast (Aug 10, 2010)

EastCoasthandle said:


> Why are you arguing?  I have no need to argue with you.  But I can show you results.
> source.
> Power Consumption Idle
> 512c.....158w
> ...



Based on my calculations, this card breaks the PCI-e specification. If we assume they are measuring full-system power consumption and that a normal Fermi uses about 320W (I think that's right for a GF100 GTX480), then taking the load consumption of the SP480 system (440W) minus the Fermi load (320W) gives 120W for the rest of the system. Taking that system consumption away from the SP512 figure (644W - 120W), THAT'S 524W!

HOLY CRAP! The card has two 8-pin power connectors totalling 300W (don't go on at me about power supplies being able to supply more, because that's irrelevant) plus 75W from the PCI-e slot, and that's still 149W over! How the heck can they justify such a huge leap in power consumption over the SP480 with such a small improvement in performance?

Fail. Epic fail. Uber epic ULTIMATE FAIL! GF100 is FAIL!!!
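Spelling that estimate out as a sketch (it just restates the post's own assumptions: whole-system measurements and a ~320 W card-only draw for a stock GTX 480):

```python
# HillBeast's back-of-the-envelope estimate, step by step.
SYSTEM_LOAD_480SP = 440   # W, whole system, 480-core card (quoted figures)
SYSTEM_LOAD_512SP = 644   # W, whole system, 512-core card (quoted figures)
STOCK_GTX480_DRAW = 320   # W, assumed card-only draw of a stock GTX 480

rest_of_system = SYSTEM_LOAD_480SP - STOCK_GTX480_DRAW   # 120 W
card_512sp = SYSTEM_LOAD_512SP - rest_of_system          # 524 W

# Spec budget: two 8-pin connectors (150 W each) plus the PCIe slot (75 W)
spec_budget = 2 * 150 + 75                               # 375 W
print(f"Estimated 512SP card draw: {card_512sp} W, "
      f"{card_512sp - spec_budget} W over the connector budget")
```

The whole estimate hinges on the 320 W assumption and on the readings being wall-socket, whole-system numbers; PSU efficiency losses alone would shrink the card-only figure.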


----------



## pr0n Inspector (Aug 10, 2010)

HillBeast said:


> Based on my calculations, this card breaks the PCI-e specification. If we assume they are measuring full-system power consumption and that a normal Fermi uses about 320W (I think that's right for a GF100 GTX480), then taking the load consumption of the SP480 system (440W) minus the Fermi load (320W) gives 120W for the rest of the system. Taking that system consumption away from the SP512 figure (644W - 120W), THAT'S 524W!
> 
> HOLY CRAP! The card has two 8-pin power connectors totalling 300W (don't go on at me about power supplies being able to supply more, because that's irrelevant) plus 75W from the PCI-e slot, and that's still 149W over! How the heck can they justify such a huge leap in power consumption over the SP480 with such a small improvement in performance?
> 
> Fail. Epic fail. Uber epic ULTIMATE FAIL! GF100 is FAIL!!!



Because some random Chinese website on the Internet is so reliable, right?

Also, the existence of 6+2 pin connectors means PSU manufacturers are already ignoring *ATX* specifications.


----------



## Bjorn_Of_Iceland (Aug 10, 2010)

dalekdukesboy said:


> and that takes a lot for someone to say when I see in his sig rig he has a gtx480 of all things!!


Myeah... well, after all, this is a tech forum where we discuss anything tech-related intelligently. It's just that sometimes people get overzealously sentimental / rabidly hostile toward some brand or belief, which IMO is pretty pointless.


----------



## newfellow (Aug 10, 2010)

Hmm, weird damn results there... the GPU score for a single OC'd HD5850 is 18,500 points, and the feature scores are a lot higher. Those cannot by any means be true readings in the screenshots. So why is there like 9K/10K in the screenshots?


----------



## slyfox2151 (Aug 10, 2010)

newfellow said:


> Hmm, weird damn results there... the GPU score for a single OC'd HD5850 is 18,500 points, and the feature scores are a lot higher. Those cannot by any means be true readings in the screenshots. So why is there like 9K/10K in the screenshots?



Er... because it's running in EXTREME mode??

I'd like to see a 5850 get anywhere near 9,000 points in Extreme.


----------



## btarunr (Aug 10, 2010)

Whitey said:


> Do you think the cooler will have heatpipes coming out the side still ?
> 
> Or something like the Galaxy vapour chamber cooling ?



I expect it to have the same cooling solution as the GTX 480, except with a more hair-trigger fan profile.


----------



## HillBeast (Aug 10, 2010)

pr0n Inspector said:


> Because some random Chinese website on the Internet is so reliable, right?
> 
> Also, the existence of 6+2 pin connectors means PSU manufacturers are already ignoring *ATX* specifications.



Dude, I don't know what Chinese class you took; I haven't even taken any form of Chinese lesson, but I can read that just fine. No idea what you're on about with the Chinese stuff.

Also, I was talking about the PCI-e specification, not ATX. If you read my post right the first time, I wouldn't have to explain myself.


----------



## claylomax (Aug 10, 2010)

EastCoasthandle said:


> Why are you arguing?  I have no need to argue with you.  But I can show you results.
> source.
> Power Consumption Idle
> 512c.....158w
> ...



No wonder. Just look at the load voltage.


----------



## HillBeast (Aug 10, 2010)

claylomax said:


> No wonder. Just look at the load voltage.



It's not that much of a difference.


----------



## phanbuey (Aug 10, 2010)

MadMan007 said:


> All the gamers here have lost sight of the fact that the full Fermi architecture does play a huge role in NVs strategic product outlook, it's just in HPC not gaming. Perhaps going forward NV will be smart and separate the two from the getgo even though that may be worse from a production standpoint.



The whole point is to make the technologies converge. HPC requires computational power... gaming (essentially 3D rendering) requires computational power. Converging them makes much more sense than trying to design two different products that do what is potentially the same thing.


----------



## crazyeyesreaper (Aug 10, 2010)

As for consumption, uh, did everyone already forget that cards are available that use three 6-pins or three 6+2 pins? With the 6+2s run as 8-pins, that comes to 150+150+150+75 = 525 W, so there ya go, problem solved
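For what it's worth, a quick sanity check of that connector math, assuming the usual PCIe spec limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin / fully wired 6+2):

```python
# PCIe power budget sanity check, using the commonly cited spec limits.
SLOT_W = 75        # power available through the PCIe slot itself
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin connector, i.e. a fully populated 6+2

# Three 6+2 connectors run as 8-pins, plus the slot:
budget = 3 * EIGHT_PIN_W + SLOT_W
print(budget)  # 525
```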


----------



## HillBeast (Aug 11, 2010)

crazyeyesreaper said:


> As for consumption, uh, did everyone already forget that cards are available that use three 6-pins or three 6+2 pins? With the 6+2s run as 8-pins, that comes to 150+150+150+75 = 525 W, so there ya go, problem solved



Yeah, but in the review the sample had two 8-pins, AKA two 6+2 pin connectors.


----------



## Meizuman (Aug 11, 2010)

HillBeast said:


> Um, what was it I was going to say? Oh right, "BWAHAHAHAHAHA to the fools who bought the GTX480 with the assumption it would be the most powerful card from this generation. Sucks to be you. The 512SP version is better."



OR

"BWAHAHAHAHAHA to the fools who didn't buy the GTX480 because of the assumption it would be the most power-hungry card from this generation. Sucks to be you. The 512SP version is hotter!"


----------



## CDdude55 (Aug 11, 2010)

Meizuman said:


> OR
> 
> "BWAHAHAHAHAHA to the fools who didn't buy the GTX480 because of the assumption it would be the most power-hungry card from this generation. Sucks to be you. The 512SP version is hotter!"




Both those statements are wrong lol.(yours and HillBeast's)


----------



## erocker (Aug 11, 2010)

EastCoasthandle said:


> Why are you arguing?  I have no need to argue with you.  But I can show you results.
> source.
> Power Consumption Idle
> 512c.....158w
> ...



204 extra watts for 32 extra "Cuda cores"?! Something isn't right. I wonder if this website took a standard GTX480 and got their hands on a 512 core bios.


----------



## Meizuman (Aug 11, 2010)

Lol Wut? 

But it takes 200 W more (vs. the GTX 480) at full load...

http://en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html/6

Edit: I knew someone would post in between.


----------



## CDdude55 (Aug 11, 2010)

Meizuman said:


> Lol Wut?
> 
> But it takes 200 W more (vs. the GTX 480) at full load...
> 
> ...



Again, as erocker said, I don't understand how 32 extra SPs can give a card an extra 200 W+. It's still the same architecture, so a slight bump in CUDA cores shouldn't raise the wattage that high.
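A rough sketch of that reasoning (the 250 W board power and linear scaling with shader count are illustrative assumptions, not measured figures):

```python
# Naive upper-bound estimate of the power delta from enabling 32 extra
# shaders, assuming ALL board power scaled linearly with shader count.
TDP_480SP = 250.0          # illustrative GTX 480 board power, in watts
CORES_OLD, CORES_NEW = 480, 512

delta_w = TDP_480SP * (CORES_NEW - CORES_OLD) / CORES_OLD
print(round(delta_w, 1))   # prints 16.7 -- an order of magnitude under 200 W
```

Even under that worst-case assumption the delta is under 20 W, which is why a 200 W gap points to something else (voltage bump, different PCB, or a flawed measurement) rather than the extra cores themselves.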


----------



## Hayder_Master (Aug 11, 2010)

I just bought a new Zotac GTX 480 AMP, and I want to say "I want to put my 480 in your ass, nvidia"


----------



## Meizuman (Aug 11, 2010)

Of course it's only a minor thing, but it does use a different PCB.. And of course that could just be some leaked ES card with an AC cooler slapped onto it... who knows. But I believe the GF100 has been doomed from the beginning.


----------



## Bjorn_Of_Iceland (Aug 11, 2010)

Had to post this lol.

I'd just rather overclock the 480SP GTX 480, still have 5%+ better perf, and not the humongous consumption


----------



## HillBeast (Aug 11, 2010)

CDdude55 said:


> Both those statements are wrong lol.(yours and HillBeast's)



How was I wrong? The GTX480 512SP by far has the highest performance out of any single-core card. How is that wrong?



erocker said:


> 204 extra watts for 32 extra "Cuda cores"?! Something isn't right. I wonder if this website took a standard GTX480 and got their hands on a 512 core bios.



No, it is definitely a different PCB. It has two 8-pin power connectors and different capacitors and such.


----------



## pantherx12 (Aug 11, 2010)

CDdude55 said:


> Again, as erocker said, I don't understand how 32 extra SPs can give a card an extra 200 W+. It's still the same architecture, so a slight bump in CUDA cores shouldn't raise the wattage that high.




Could be damaged; the 955 I bought off (I think coldstorm) was broken mid-Atlantic.

But I didn't know at the time, so I popped it in and tried to get it to work, plugged everything in, went upstairs for a bit, and came down to find my 120 EX BURNING hot to the touch; the heatsink must have been 70°C or so.


----------



## CDdude55 (Aug 11, 2010)

HillBeast said:


> How was I wrong? The GTX480 512SP is by far has the highest performance out of any single core card. How is that wrong?



Your statement: ''Um, what was it I was going to say? Oh right, "BWAHAHAHAHAHA to the fools who bought the GTX480 with the assumption it would be the most powerful card from this generation. Sucks to be you. The 512SP version is better."''

Ya, it'll be better by about two frames. How is someone a fool for buying a GTX 480 just because they don't have an extra 32 CUDA cores on their card?


----------



## HillBeast (Aug 11, 2010)

CDdude55 said:


> Your statement: ''Um, what was it I was going to say? Oh right, "BWAHAHAHAHAHA to the fools who bought the GTX480 with the assumption it would be the most powerful card from this generation. Sucks to be you. The 512SP version is better."''
> 
> Ya, it'll be better by about two frames. How is someone a fool for buying a GTX 480 just because they don't have an extra 32 CUDA cores on their card?



Because my statement is still correct. I never said it was miles better, just that it's better. 2 FPS is better.


----------



## a_ump (Aug 11, 2010)

Bjorn_Of_Iceland said:


> Had to post this lol.
> 
> I'd just rather overclock the 480SP GTX 480, still have 5%+ better perf, and not the humongous consumption



yea lol, no way in hell I believe that graph


----------



## CDdude55 (Aug 11, 2010)

HillBeast said:


> Because my statement is still correct. I never said it was miles better, just that it's better. 2 FPS is better.



By that logic, yes, but in reality it's not much better in terms of an actual difference from a regular 480. So when you say something like ''sucks for people who bought a 480, such fools'', it's really you who ends up looking like the fool. (No offense.)


----------



## HillBeast (Aug 11, 2010)

CDdude55 said:


> By that logic, yes, but in reality it's not much better in terms of an actual difference from a regular 480. So when you say something like ''sucks for people who bought a 480, such fools'', it's really you who ends up looking like the fool. (No offense.)



The way I look at it is: I hate the GF100, and I was trying to find something to praise about it. Anyone who buys one of these would have to have acknowledged that it's really hot and powerful, and they will most likely be running an all-out system with no concerns for power consumption. In that sense, this is the better choice.

If, however, you want performance AND low power consumption, nothing can beat the Radeon 5850.


----------



## CDdude55 (Aug 11, 2010)

HillBeast said:


> The way I look at it is: I hate the GF100, and I was trying to find something to praise about it. Anyone who buys one of these would have to have acknowledged that it's really hot and powerful, and they will most likely be running an all-out system with no concerns for power consumption. In that sense, this is the better choice.
> 
> If, however, you want performance AND low power consumption, nothing can beat the Radeon 5850.


I'm running two 470's in SLI... honestly, it's not as hot as it's hyped up to be. And as I have said before, anyone who has the funds for a $350-$500 video card should have enough to afford a decent PSU to power it, so that should be a non-issue too; it's a high-end card, it should be expected. And of course Fermi is overall very powerful.

I agree that if you want good performance and can't afford or don't want a beefier PSU, then the 5850 is definitely a great choice.


----------



## newfellow (Aug 12, 2010)

slyfox2151 said:


> er... because it's running in EXTREME mode??
> 
> I'd like to see a 5850 get anywhere near 9000 points in Extreme



Oh, my mistake..


----------



## LAN_deRf_HA (Aug 12, 2010)

HillBeast said:


> The GTX480 512SP by far has the highest performance out of any single-core card.





HillBeast said:


> I never said it was miles better, just that it's better. 2 FPS is better.



While technically you didn't say miles, you did phrase it as if referring to a great distance.


----------



## Tatty_One (Aug 12, 2010)

"Better" is subjective and measured in a number of ways.... power consumption, heat output, general performance, etc. In this context, "faster" may be a better term to use.

The funny thing is, most Fermi owners love their cards; few moan about power consumption or heat output, and many have "upgraded" from a 5850 or 5870, so I cannot really understand many of the arguments. It's all very well being a "purist" and saying its consumption and heat are too great for you to buy it; however, if most of those people were offered card A at $300 for a level of performance with lower consumption and heat, or for the same price card B with 10% more performance and 30% more consumption and heat.... still many would take card B.

As for this, I cannot believe these power consumption figures. Do the math: even with a new PCB, 32 more shaders, and even a stock speed of 800 MHz, I cannot for the life of me see how it could ever hit that consumption..... the GTX 260 216SP did similar things compared to this "upgrade"; I kept the old 192SP version, as the cards with fewer SPs often overclock better anyway.


----------



## HillBeast (Aug 12, 2010)

LAN_deRf_HA said:


> While technically you didn't say miles, you did phrase it as if referring to a great distance.



Why is everyone still caught up on my post? Okay, I dun goof'd saying it's far better, but you have to consider that when I wrote it the review wasn't up, so how was I to know the 32 extra cores were worth nothing in gaming?

But that's the thing: they are only worth 2 FPS in games, but if you were to do something that utilised the card more, like running those hair demos that NVIDIA do, or something using heaps of CUDA, then the 512-core will come into a league of its own. It's being compared on a workload this card is most likely not designed for.

At this stage there is no game which can truly stress cards properly anymore. 99% of games are console ports that can work on a 7900GTX or a Radeon HD 2900. They aren't very powerful cards, and now we're getting these 40nm monoliths. Of course game benchmarks aren't going to show off some SPs being unlocked. The only way to test this now is in a CUDA/OpenCL benchmark. That is where this card will shine.

Now stop quoting me. I'm sick of explaining myself. These cards are made for the uber Folders and SETI and such, not n00bs who play Spider Solitaire. If you are buying one of these cards, you most likely have an unlimited budget, and power consumption is the least of your worries. That is what this card is for, and that is why it is better than the GTX480: it offers unrestricted performance at a high cost to those who NEED it.

If you don't like this card (because I certainly don't) then get a Radeon HD 5850. I hate the GF100, and I am defending it because you guys are not seeing my point. If I have to come back here and explain myself again, I might lose my mind.


----------

