# NVIDIA GF100 Graphics Card Chugs Along at CES



## btarunr (Jan 7, 2010)

NVIDIA's next-generation graphics card based on the Fermi architecture, whose consumer variant is internally referred to as GF100, got down to business at CES in Las Vegas, USA, performing a live demonstration of its capabilities. The demo PC houses one such accelerator, which resembles the card from past sightings, so it is safe to assume this is what the reference NVIDIA design of the GF100 will look like. The accelerator draws power from 6-pin and 8-pin power connectors. It has no noticeable back-plate, a black PCB, and a cooler shroud with typical NVIDIA styling. The demo rig was seen running Unigine Heaven in a loop, showing off the card's advanced tessellation capabilities in DirectX 11 mode. The most recent report suggests that market availability can be expected in March this year. No performance figures have been made public as yet.




A short video clip after the break.












*View at TechPowerUp Main Site*


----------



## 3volvedcombat (Jan 7, 2010)

This is good, very good. I'm sitting here staring at my GTX 260s, wondering if they should go on the [FS] forum because of the proof I've just seen.


----------



## Imsochobo (Jan 7, 2010)

Wondering if NVIDIA's marketing team takes three years to choose what kind of breakfast to buy.

Because their naming schemes change every year....

So is GF100 entry-level? Best? Low-end... uhm?

Good job getting something to show this round, NVIDIA!


----------



## btarunr (Jan 7, 2010)

Imsochobo said:


> Wondering if NVIDIA's marketing team takes three years to choose what kind of breakfast to buy.
> 
> Because their naming schemes change every year....



GF100 is an internal codename. Terms like G92, GT200, or GF100 aren't meant to be marketing names anyway.


----------



## Imsochobo (Jan 7, 2010)

btarunr said:


> GF100 is an internal codename. Terms like G92, GT200, or GF100 aren't meant to be marketing names anyway.



Ohh.

Well, then they had trouble finding internal codenames.


----------



## Airbrushkid (Jan 7, 2010)

Why? You'll still have to wait till March. Here's what I have:

TNT-2 
Geforce 3
Geforce 5600
Geforce 6600 gt
Geforce 6800 OC
Geforce 8600 gts
Geforce 8800 gts 320
Geforce 9800 gt 1 gig
GTX 260 216
GTX 285 SSC

No plans on selling. When the new card comes out I may buy.





3volvedcombat said:


> This is good, very good. I'm sitting here staring at my GTX 260s, wondering if they should go on the [FS] forum because of the proof I've just seen.


----------



## BUCK NASTY (Jan 7, 2010)

3volvedcombat said:


> This is good, very good. I'm sitting here staring at my GTX 260s, wondering if they should go on the [FS] forum because of the proof I've just seen.


They should get bumped to the dedicated folding rig.


----------



## btarunr (Jan 7, 2010)

Imsochobo said:


> Ohh.
> 
> Well, then they had trouble finding internal codenames.



Codenames aren't meant for consumers anyway. They won't feature on spec sheets or boxes. Hence, they can call their stuff whatever they want, and it would still be a non-issue for the consumer.


----------



## cadaveca (Jan 7, 2010)

Anyone else notice that it looks like they have 3x PCI-E connector wires going to 2x PCI-E plugs? (Note the number of yellow 12 V wires.) Something is fishy here...


----------



## btarunr (Jan 7, 2010)

cadaveca said:


> Anyone else notice that it looks like they have 3x PCI-E connector wires going to 2x PCI-E plugs? (Note the number of yellow 12 V wires.) Something is fishy here...



No, nothing fishy.







The 8-pin connector sits mid-cable (a single line, in the middle of which is the connector).


----------



## AlCabone (Jan 7, 2010)

Airbrushkid said:


> Why? you'll still have to wait til March. I have as follows -
> 
> TNT-2
> Geforce 3
> ...



Do you actually use all of those?


----------



## cadaveca (Jan 7, 2010)

btarunr said:


> No, nothing fishy.
> 
> http://img.techpowerup.org/100107/btalkjhas37.jpg
> 
> The 8-pin connector sits mid-cable (a single line, in the middle of which is the connector).



If you say so. It just seems to me this is one of the early samples and has high power draw. 8-pin power only requires 3x 12 V wires, not 6, as is shown.

No big deal... it just means production cards may be better/faster.

But yeah, I hear what you are saying... from the 8-pin, a 6-pin hangs. But then why didn't they use just the single cable?

Good idea of the length here too... that's a Raven RV01 case.


----------



## EastCoasthandle (Jan 7, 2010)

Is that 3 sets of bundles power cords I see


----------



## btarunr (Jan 7, 2010)

cadaveca said:


> If you say so. It just seems to me this is one of the early samples and has high power draw. 8-pin power only requires 3x 12 V wires, not 6, as is shown.
> 
> No big deal... it just means production cards may be better/faster.
> 
> But yeah, I hear what you are saying... from the 8-pin, a 6-pin hangs. But then why didn't they use just the single cable?








Capiche?

It's just the way the PSU's cables were designed. The "second" connector in the diagram above is what went into the card in the picture above. There's nothing more to it than this.


----------



## cadaveca (Jan 7, 2010)

Capisco!!

Again, then why didn't they use just the single cable?


----------



## btarunr (Jan 7, 2010)

cadaveca said:


> Capisco!!
> 
> Again, then why didn't they use just the single cable?



It does not make a difference.


----------



## cadaveca (Jan 7, 2010)

To me, it does, dependent on the PSU used.


Meh... just something to talk about... clearly this is not a full production card, as release is still a couple of months away.


----------



## btarunr (Jan 7, 2010)

cadaveca said:


> To me, it does, dependent on the PSU used.



I'll say it again: it does not make a difference. Besides, they are not doing a performance evaluation there, so there's no scope to even speculate on something this trivial.


----------



## yogurt_21 (Jan 7, 2010)

Interesting, given that there were some rumors regarding the tessellation capabilities.

@ the naming scheme: I think this is supposed to be the 10th rendition of the GeForce series, hence the GF on the tag, which means they are more than likely counting the GT200 as GeForce 9 and omitting the 9800s, as the G92 first bore an 8800 designation.

It likely means NVIDIA is returning to a normal card-progression internal naming scheme.

Truly speculation, but counting by cores without revisions, the series was only up to 6 with the GT200, which would make the GF100 the 7th.


----------



## Benetanegia (Jan 7, 2010)

cadaveca said:


> Capisco!!
> 
> Again, then why didn't they use just the single cable?



Probably just for better cable management. I've done that with some of my cables.

Tom's Hardware on Fermi - http://www.tomshardware.com/reviews/ces-2010-fermi,2527-4.html



> Fermi Graphics: Really? Really!
> 
> I was just about to leave the event when Ken Brown, Nvidia PR guy (not to mention former executive editor of Computer Gaming World back in the day), tapped me on the shoulder and asked me if I wanted to see a Fermi-based GPU.
> 
> ...



He's claiming Fermi to be faster than the HD5970, and that's from an NVIDIA guy this time. Real or not, it's one step closer to being an official statement, not rumors or fakes (now, were they really fakes?). I'll choose to believe him, because some months ago I made my own calculations based on the specs and reached the same conclusion, as some of you may remember. We'll see, but I'm optimistic.


----------



## Imsochobo (Jan 7, 2010)

btarunr said:


> Codenames aren't meant for consumers anyway. They won't feature on spec sheets or boxes. Hence, they can call their stuff whatever they want, and it would still be a non-issue for the consumer.



I was joking around.

My head gets dizzy trying to keep track of what's what.

And now it seems ATI is doing the same; still not as bad as NVIDIA's naming has been lately, but they have started, so they might follow in those footsteps.


----------



## Animalpak (Jan 7, 2010)

Whoooo, nice. Things are getting real this time.


----------



## HalfAHertz (Jan 7, 2010)

Finally! May the price wars commence!


----------



## phanbuey (Jan 7, 2010)

price wars... mmm price wars.  Can't wait.


----------



## 20mmrain (Jan 7, 2010)

I too thought it looked like 3 wires at first, until I looked again. It is only two: 1x 6-pin + 1x 8-pin. It must just be the way they had the wires situated.

Also, did anyone notice that the guy running the demo happened to keep switching the modes during the really intense tessellation parts? It was either him doing it, or that is the way they had it set up to run.
In the parts that did run with full tessellation, it did look a little choppy at times. I do think that it looked a little smoother than my single 5870... but not by much. Not enough to make me really impressed.

So let's say for a second that this thing really does beat a 5870 by 36%. Well, I wouldn't be surprised; it's only coming out half a year later. If you ask me, that number wouldn't be that impressive for 6 months later.

But I still like the idea of the technology they are using. I also think it will be a great addition to the world of GPUs. It will keep things moving forward, that is for sure!

But if I were that confident in its performance and I were NVIDIA, I would have left the FPS counter on the bottom and the top... still left out there for all to see. That they didn't makes me wonder.


----------



## erocker (Jan 7, 2010)

20mmrain said:


> I too thought it looked like 3 wires at first, until I looked again. It is only two: 1x 6-pin + 1x 8-pin. It must just be the way they had the wires situated.
> 
> Also, did anyone notice that the guy running the demo happened to keep switching the modes during the really intense tessellation parts? It was either him doing it, or that is the way they had it set up to run.
> In the parts that did run with full tessellation, it did look a little choppy at times. I do think that it looked a little smoother than my single 5870... but not by much. Not enough to make me really impressed.
> ...



I don't know. If the card really beats a 5870 by 36%, wouldn't NVIDIA be touting it left and right? I'm hoping to see some concrete numbers by the end of CES. Really, we aren't seeing much that we haven't already seen.


----------



## 20mmrain (Jan 7, 2010)

> I don't know. If the card really beats a 5870 by 36%, wouldn't NVIDIA be touting it left and right? I'm hoping to see some concrete numbers by the end of CES. Really, we aren't seeing much that we haven't already seen.



Well, here is the video I was looking at; let me know what you think. I don't know if you have seen this one.

http://www.youtube.com/watch?v=gkI-ThRTrPY

But like I said, it looks smoother than my 5870, but he was also toggling through the tessellation. Plus, like I said, they had 6 months longer. Not that that's an excuse... my whole point, whether I am a 5870 owner or not, is that 6 months later and only 36% better (if that's what it is) is not that impressive. IMO.


----------



## EastCoasthandle (Jan 7, 2010)

erocker said:


> I don't know. If the card really beats a 5870 by 36%, wouldn't NVIDIA be touting it left and right? I'm hoping to see some concrete numbers by the end of CES. Really, we aren't seeing much that we haven't already seen.



Yes, they would, as they did with the 8800 GTX back then, for example. The reason for that trend is that they would have headroom to overclock it, making that the new standard clock should AMD be able to match or beat their part. They are being very conservative by not providing this information, especially when they are possibly 2 quarters behind before the cards are available for purchase. IMO, I call this a red flag. But perhaps before the day is over we'll actually get performance numbers.


----------



## cadaveca (Jan 7, 2010)

Realistically, it can only hurt them to release actual performance numbers, as the time between now and release will give AMD a set goal to beat with HD5xxx refreshes. I'd prefer to see it running a myriad of apps with stability, and no more.


----------



## gumpty (Jan 7, 2010)

Benetanegia said:


> Tom's Hardware on Fermi - http://www.tomshardware.com/reviews/ces-2010-fermi,2527-4.html
> 
> 
> He's claiming Fermi to be faster than HD5970 and that's from a Nvidia guy this time.



No, he didn't. He claimed it was faster than AMD's fastest GPU, not AMD's fastest graphics card. He was claiming it was faster than a 5870.


----------



## a_ump (Jan 7, 2010)

Eh, I didn't expect performance numbers at all; I was hoping to see official specs, clock speeds and whatnot. But now I don't expect to see anything until the day of release...


----------



## EastCoasthandle (Jan 7, 2010)

I believe the very opposite of that. If they truly have the lead they are claiming, they could always compensate by using the headroom to just increase the clock rate. Which is something they've always done.


----------



## 20mmrain (Jan 7, 2010)

> Realistically, it can only hurt them to release actual performance numbers, as the time between now and release will give AMD a set goal to beat with HD5xxx refreshes. I'd prefer to see it running a myriad of apps with stability, and no more.



You do have a point there. But still, think about it: they released their GTX 200 series first and are releasing Fermi last. They have had hella more time to work on their card, so it should be faster.

But it still seems fishy to me.
If they come out with a much faster, awesome card... I am at peace with that. I think that could only help the consumer in the end.
My whole point, though, is that everything they have been doing with this card just seems fishy.

But looking at it from this standpoint too... if I were NVIDIA and I had the best card the world has ever seen, I might not want to say anything either. Especially if I thought that it might be the undoing of my competition! I just hope that it doesn't go that far!


----------



## cadaveca (Jan 7, 2010)

Heh... to me, I think Jen-Hsun has been seen at the poker tables in Vegas.


----------



## Steevo (Jan 7, 2010)

I am sure the NV card will be faster than a 5870, but I'm also sure it will cost at least 50% more. The question is how much, how much faster exactly, and how much more money exactly. They need to get some figures out there fast, as ATI still has the fastest/only DX11 cards, and they are taking more of the market share every day. Plus, with a huge die on the NV card, the chances of yield problems at the foundry, and thus short supply, will only cause more people to choose the in-stock ATI variant if NV doesn't have a large performance/price lead.


I know I can't wait to see what this card brings, for folding, GTA4, and price. If it is right, I'm going green this time.


----------



## 20mmrain (Jan 7, 2010)

> Heh... to me, I think Jen-Hsun has been seen at the poker tables in Vegas.



You too? I thought that was him, but I couldn't tell without him holding his fake tessellation card.


----------



## [H]@RD5TUFF (Jan 8, 2010)

I'm still waiting for someone to proclaim this as fake and say there has to be an ATI card doing the DX11 demo.


----------



## TheMailMan78 (Jan 8, 2010)

> From the pictures one can note the following facts:
> • There is no backplate like on the Tesla Fermis
> • Given the holes in the PCB the temperature regulation seems to be more difficult - but this could also be a preparation for future SLI versions
> • 1x 6-pin and 1x 8-pin power connector: according to the PCI-E specifications, the card is allowed to draw up to 300 W



Is 300 W the max for PCI-E, or just for those connections? I'm confused.

Source


----------



## 20mmrain (Jan 8, 2010)

> I'm still waiting for someone to proclaim this as fake and say there has to be an ATI card doing the DX11 demo.



It was! Didn't you know ATI had a GeForce version?


----------



## Benetanegia (Jan 8, 2010)

TheMailMan78 said:


> Is 300 W the max for PCI-E, or just for those connections? I'm confused.
> 
> Source



It's just saying that, according to the PCI-E specifications...


```
8-pin  = 150 W
6-pin  =  75 W
MB PCI-E slot = 75 W

150 + 75 + 75 = 300 W
```

...the card is allowed to draw up to 300 W.

According to that, the card will draw something between 225 W and 300 W, UNLESS it's like the Tesla cards, which also have 1x 8-pin and 1x 6-pin but only need 150 W external. That is, you can use 2x 6-pin OR 1x 8-pin. In that case power consumption would be up to 225 W, like in the Tesla cards. IMO the GeForce card consumes 250-ish at the moment, and that's why NVIDIA PR guys told the press at the event that the configuration could change in the final product. If it were something like 275-300 W they wouldn't be able to change anything.
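For what it's worth, the connector arithmetic above can be sketched in a few lines of Python (the 75/150 W figures are the PCI-E spec ceilings quoted above; `max_board_power` is just a throwaway helper, not anything official):

```python
# PCI-E power budget: each source of power has a fixed ceiling per the spec.
PCIE_SLOT_W = 75    # motherboard PCI-E slot
PIN6_W = 75         # 6-pin PCI-E connector
PIN8_W = 150        # 8-pin PCI-E connector

def max_board_power(connectors):
    """Upper bound on card power draw: slot plus every external connector."""
    return PCIE_SLOT_W + sum(connectors)

# GF100 as shown at CES: 1x 6-pin + 1x 8-pin
print(max_board_power([PIN6_W, PIN8_W]))   # 300

# Tesla-style alternative: only 1x 8-pin (or 2x 6-pin) actually used
print(max_board_power([PIN8_W]))           # 225
```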


----------



## fullinfusion (Jan 8, 2010)

I shall keep my comments to myself, lol... go green team, go!
Maybe next time.


----------



## OneCool (Jan 8, 2010)

So this one wasn't put together with wood screws?








baaaaaaaa hahahaha


----------



## kid41212003 (Jan 8, 2010)

Benetanegia said:


> He's claiming Fermi to be faster than the HD5970, and that's from an NVIDIA guy this time. Real or not, it's one step closer to being an official statement, not rumors or fakes (now, were they really fakes?). I'll choose to believe him, because some months ago I made my own calculations based on the specs and reached the same conclusion, as some of you may remember. We'll see, but I'm optimistic.



He said "GPU", so I assume he meant a single-GPU card.


----------



## btarunr (Jan 8, 2010)

AMD does occasionally refer to its dual-GPU cards as GPUs.


----------



## overclocking101 (Jan 8, 2010)

Mmmmmm, I can't wait! Fermi, here I come!!!


----------



## Benetanegia (Jan 8, 2010)

And they said "fastest GPU". No need to remark "fastest" if they were talking about the HD5870. No one is going to think they were referring to the HD5850.

And like bta said, ATI themselves refer to their dual-GPU cards as simply GPUs. Ever since the HD3870 X2 they have tried to impose the idea that the X2 is the high end and the single-GPU cards are the performance-level cards. They don't want any differentiation based on the number of GPUs; the X2 card is just their top card (even the new name, HD5970, points to this), hence the fastest GPU.


----------



## [H]@RD5TUFF (Jan 8, 2010)

20mmrain said:


> It was! Didn't you know ATI had a GeForce version?



No I didn't silly me.


----------



## Weer (Jan 8, 2010)

btarunr said:


> I'll say it again: it does not make a difference. Besides, they are not doing a performance evaluation there, so there's no scope to even speculate on something this trivial.



Gosh, for some reason, I've come to respect you.


----------



## Thrackan (Jan 8, 2010)

As far as Tom's Hardware is concerned, I think they lost the status of "independent reviewers" a long time ago, especially when it comes to GFX cards. Their GFX charts constantly favor the green team, to an absurd extent.

Let me see some actual test results and we can start talking again, Tom!


----------



## Bjorn_Of_Iceland (Jan 8, 2010)

Thrackan said:


> As far as Tom's Hardware is concerned, I think they lost the status of "independent reviewers" a long time ago, especially when it comes to GFX cards. Their GFX charts constantly favor the green team, to an absurd extent.


Any direct evidence for that?


----------



## Thrackan (Jan 8, 2010)

Bjorn_Of_Iceland said:


> Any direct evidence for that?



Just watch the charts and use common sense and plenty of other reviews.


----------



## TheMailMan78 (Jan 8, 2010)

TPU is the only review source you need. I've read tons of GPU reviews on here and they have yet to mislead me. If W1zz says the card is good, then it's good. If he says it's "not great", then you'd better believe it's shit.


----------



## btarunr (Jan 8, 2010)

Looky












Images courtesy Hardware Upgrade Italia.

http://www.youtube.com/watch?v=3-1GMbqzGX0


----------



## phanbuey (Jan 8, 2010)

It's so weird to keep seeing pictures of the actual card in use but NOT ONE LEAKED BENCH. Weird and frustrating.


----------



## kid41212003 (Jan 8, 2010)

It crashed....


----------



## Thrackan (Jan 8, 2010)

Is it me, or is that HDMI plug hardly usable?


----------



## phanbuey (Jan 8, 2010)

Thrackan said:


> Is it me, or is that HDMI plug hardly usable?



I think it's just the angle of the shot... but a thick connector probably wouldn't fit.


----------



## wolf (Jan 8, 2010)

Did I hear correctly in the vid you posted, btarunr: a chicken gun??


----------



## btarunr (Jan 8, 2010)

wolf said:


> Did I hear correctly in the vid you posted, btarunr: a chicken gun??



I thought it was a duck gun.


----------



## Bjorn_Of_Iceland (Jan 8, 2010)

pfft. thought so.


----------



## newtekie1 (Jan 8, 2010)

cadaveca said:


> Capisco!!
> 
> Again, then why didn't they use just the single cable?



Because you aren't supposed to do that?

When you have a single cable with both an 8-pin and a 6-pin, there is almost always a warning somewhere not to use both.

An 8-pin is rated for 150 W, a 6-pin for 75 W. When a power supply has both on a single cable, that cable is only rated for 150 W; they just added the 6-pin for people that need two 6-pins, which eliminates the hassle of an 8-pin to 6-pin adapter. When you use the 8-pin you are already using all 150 W that the cable is rated for, so you can't use the 6-pin.
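As a quick sanity check of the point above, here's a tiny sketch (the 150 W single-cable rating is the assumption from this post; real PSU cable ratings vary by model):

```python
# A daisy-chained 8-pin + 6-pin on one cable shares that one cable's rating.
CABLE_RATING_W = 150   # assumed rating of the single cable (per the post)

def draw_ok(loads_w):
    """True if the combined draw on one cable stays within its rating."""
    return sum(loads_w) <= CABLE_RATING_W

print(draw_ok([150]))        # True: the 8-pin alone uses the full 150 W
print(draw_ok([150, 75]))    # False: 8-pin plus 6-pin exceeds the cable rating
```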


----------



## OneCool (Jan 8, 2010)

kid41212003 said:


> It crashed....



Thats about all I got out of it too


----------



## pantherx12 (Jan 8, 2010)

btarunr said:


> Looky
> 
> 
> 
> http://img.techpowerup.org/100108/fermi_geforce_triple_sli.jpg



What a mess


----------



## Bo_Fox (Jan 8, 2010)

newtekie1 said:


> Because you aren't supposed to do that?
> 
> When you have a single cable with both an 8-pin and a 6-pin, there is almost always a warning somewhere not to use both.
> 
> An 8-pin is rated for 150 W, a 6-pin for 75 W. When a power supply has both on a single cable, that cable is only rated for 150 W; they just added the 6-pin for people that need two 6-pins, which eliminates the hassle of an 8-pin to 6-pin adapter. When you use the 8-pin you are already using all 150 W that the cable is rated for, so you can't use the 6-pin.



+1 on that. A standard 12 V wire is rated for 2 A of current (about 25 W per wire; 3x 12 V wires for 75 W on a 6-pin PCI-E connector). Unless the PSU has some overcurrent protection, the wires will start melting if all of the connectors are plugged into those tri-SLI cards, pulling 4 A or more per wire.

Heck, I do not know... perhaps newer PSUs allow each 12 V wire to be fed up to 4 A, with higher-quality wires capable of handling the load?
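The per-wire math above can be sketched as follows (the 2 A per-wire rating is this post's assumption, not a spec figure; actual ampacity depends on wire gauge):

```python
# Per-wire power on the 12 V rail: P = V * I.
V_RAIL = 12.0          # volts on the yellow 12 V wires
AMPS_RATED = 2.0       # assumed safe current per wire (from the post)

def watts_per_wire(amps):
    """Power carried by one 12 V wire at the given current."""
    return V_RAIL * amps

# 3 wires at the rated 2 A: roughly the 75 W a 6-pin is specced to deliver
print(3 * watts_per_wire(AMPS_RATED))   # 72.0

# At 4 A per wire the same 3 wires would have to carry double that
print(3 * watts_per_wire(4.0))          # 144.0
```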


----------



## [H]@RD5TUFF (Jan 8, 2010)

pantherx12 said:


> What a mess



Not the cleanest setup I've ever seen, no, lol.


----------



## HalfAHertz (Jan 8, 2010)

btarunr said:


> I thought it was a duck gun.



Well, you do have to be a quack to come up with such a crazy benchmark.


----------



## TheMailMan78 (Jan 8, 2010)

btarunr said:


> Looky
> 
> http://img.techpowerup.org/100108/fermi_geforce_back.jpg
> 
> ...



I love the SLI ribbon. I wish they made a crossfire cable like that.


----------



## Steevo (Jan 8, 2010)

I'm counting 7 yellow wires in the first post's images. Can anyone confirm? I blew the image up and used a tool to measure the wires and compared. The two at the bottom of the image look like one, but are thicker than one, and there's a slight shadow.


----------



## TheMailMan78 (Jan 8, 2010)

I blew up the image and cleaned it up some. I hope this helps. I rushed but if need be I can clean it up more.


----------



## TheMailMan78 (Jan 8, 2010)

I found some more.


----------



## Steevo (Jan 8, 2010)

I see 8 yellow and blue lines to feed this monster. Perhaps they realized they needed the extra juice, and since it would have been such a bad look to have 8+8, they ran extra power lines to the connector to keep it as an 8+6 connector?

So we now have the possibility of more than 300 W to power this beast?


Really, considering that it is supposed to be more powerful than an energy-sipping 5870, it isn't a far-fetched idea. More computational power means more electrical power, which means more heat. So to run one of these you will need a new PSU to run it efficiently.

And is it me, or is that card mounted vertically? Do they really need that much extra air movement to use an ATI-style blower, and are they forced to use draft from the direction of the card to keep it cool? Then again, if it is dumping 350 W or more of heat.........


----------



## [H]@RD5TUFF (Jan 8, 2010)

Steevo said:


> I see 8 yellow and blue lines to feed this monster. Perhaps they realized they needed the extra juice, and since it would have been such a bad look to have 8+8, they ran extra power lines to the connector to keep it as an 8+6 connector?
> 
> So we now have the possibility of more than 300 W to power this beast?
> 
> ...



That's simply the case; I doubt there is any real reason, as you can see they also run 3 of them in SLI in a horizontal position.

It's not even a production card and people are already trying to find problems with it. Sad, very sad.

Are we really still debating this... :shadedshu


----------



## Steevo (Jan 9, 2010)

When they have extra power wires going to their cards, and they are supposed to be making the retail card as we speak, it is a problem. When the cards are obviously drawing more power than the original connector can provide, it does spell issues for users with standard connectors. Missing 50 W of power can cause some issues.


----------



## EastCoasthandle (Jan 9, 2010)

Read this

Below is what they are claiming:

Fermi TDP: 300 Watt

Honestly, all I can say is: wait for some official word on this.


----------



## TheMailMan78 (Jan 9, 2010)

EastCoasthandle said:


> Read this
> 
> Below is what they are claiming:
> 
> ...








If any of that is true, NVIDIA is pulling an HD2900 on us. If so, I would skip that garbage........ if it's true.


----------



## EastCoasthandle (Jan 9, 2010)

It is said over at Beyond3D that the Heaven benchmark crashed on Fermi (or was it their PhysX demo, Supersonic Sled... not sure).


----------



## jimmyz (Jan 9, 2010)

TheMailMan78 said:


> If any of that is true, NVIDIA is pulling an HD2900 on us. If so, I would skip that garbage........ if it's true.
> 
> Edit: Damn, it's in French. I can't seem to copy the text.



Interesting you mention the 2900, as the 2900 was ATI's first card with tessellation, which was originally in the DX10 specs. When they were late, MS dropped several of the requirements for DX10 in order to have cards ready for launch. Now ATI is into its stride with tessellation hardware and other DX11 features, where this is NV's first crack at full support. I think the scope of the chip was just too ambitious, and to expect (or actually demand) 0% leakage is being overly optimistic.
I think it will turn into a 2900 scenario, but how NV handles it will determine how well they succeed in future generations. If they kick the dirt and stomp and blame DX11 or TSMC, they will be destined for a few lean generations. If they suck it up and learn from the mistakes, then they can have a card out by next spring with 2x the power.


----------



## TheMailMan78 (Jan 9, 2010)

Honestly I want Nvidia to come with it. Honestly we all would win if they do. We need a good price war


----------



## newtekie1 (Jan 9, 2010)

Steevo said:


> I see 8 yellow and blue lines to feed this monster. Perhaps they realized they needed the extra juice, and since it was such a bad thing to ahve 8+8 they did extra power lines to the connector to keep it as a 8+6 connector?
> 
> So we now have the possibility of more than 300W to power this beast?
> 
> ...



Again, it is an 8-pin with a 6-pin coming out of it, likely tied back to keep it neat.

Judging by the stickers labelling each connector, it looks like a Silverstone power supply, and if you look at some Silverstone power supplies, it is actually pretty common for them to run a 6-pin out of the 8-pin. Look at the OP1000-P if you want an example: in the shot of the 8-pin there are clearly more cables than needed, because the 6-pin runs out of it. There isn't sleeving like in the pics, but for all we know nVidia is using a power supply from Silverstone that hasn't even hit the market yet.


----------



## jimmyz (Jan 9, 2010)

TheMailMan78 said:


> Honestly I want Nvidia to come with it. Honestly we all would win if they do. We need a good price war



Absolutely, I want it to be good too. I was just expanding on the current events in the thread. The similarity that both launches were the first to have hardware dedicated to tessellation seems like more than a coincidence. I think it is probably the Achilles' heel in both of these examples. As we know, if part of a chip doesn't work properly, it generally has leakage issues.


----------



## [I.R.A]_FBi (Jan 9, 2010)

bring it nv so i can get a 200 dollar 5850


----------



## Steevo (Jan 9, 2010)

Holla.


I still would buy this green giant if it delivers on performance and price. I have water cooling and it won't make my temps rise enough to worry about. But I'm not paying $600 for a single-GPU solution that only performs as well as a $385 solution from another company.


I really think NV has some power and heat issues that ATI took their loss on long ago by moving to a new architecture. Much like a car where throwing more displacement and money at it makes it go faster, while a better-optimized and better-designed car will still perform without the issues, NV may yet surprise us with a turd.


----------



## HalfAHertz (Jan 9, 2010)

And I think you've been dipping a bit too much into the red Kool-Aid. Honestly, let's wait for some benches first and then start speculating about whether this card will melt the polar ice caps.


----------



## newtekie1 (Jan 9, 2010)

Steevo said:


> Holla.
> 
> 
> I still would buy this green giant if it delivers on performance and price. I have water cooling and it won't make my temps rise enough to worry about. But I'm not paying $600 for a single-GPU solution that only performs as well as a $385 solution from another company.
> ...



This thing is definitely going to be putting out more heat at full blast than an HD5870; I expect power and heat to be similar to an HD5970, though I also expect performance to be similar to that as well.

I also know that there won't be just one model, and I wouldn't be surprised if we saw cut-down lower models performing similar to the HD5870 and having similar heat and power characteristics.

I don't believe watercooling will be necessary for single or even dual card configurations; triple SLI (and CrossFire) has always presented heat issues because the cards are so close together. And of course they watercooled other parts of the computer too, so it might have just been done to attract attention to the rig.


----------



## Bo_Fox (Jan 9, 2010)

We just have to wait and see.  If it can run under 90 degrees C at max. load with stock cooling and stock fan speed, I'll be happy.  If it has low idle power consumption like the GT200 series, I'll be happy.  If it does not eat any more power than a 5970 at load, I'll be happy.  If it can overclock by at least 10% without a voltage bump, I'll be happy.  If it beats a 5870 by at least say, 22%, I'll be happy.

NV has never fallen behind ATI since the R300 days, and that was only because there NV got a bit lazy with the Geforce 4 series after having no real competition for a couple years, and could not prepare their FX series in time.  Even if the Fermi is a kinda new architecture, NV must have been preparing for this for a while.  It's just TSMC process to blame, I would like to think.

(NV did fall a bit behind ATI for about 6 months in 1H 2006 after the X1900 XTX took them by surprise, but not by much more than 10% anyway, and that was after having a 100% advantage over ATI with the 7800 GTX for nearly 6 months in 2H 2005.)


----------



## Bo_Fox (Jan 9, 2010)

It's great that there are new PSUs with high-quality yellow/blue 12V wires that can probably each handle 5A or even 6A of current (75W).  No biggie, but that's good to see.
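For what it's worth, the arithmetic behind those figures is just P = V × I on the 12 V rail; a quick sanity check (the 75 W figure also happens to match the PCI-E 6-pin connector's rated limit):

```python
# Power carried per wire at the currents mentioned above, on the ATX 12 V rail.
RAIL_VOLTS = 12.0

for amps in (5.0, 6.0):
    watts = RAIL_VOLTS * amps  # P = V * I
    print(f"{amps:.0f} A x {RAIL_VOLTS:.0f} V = {watts:.0f} W")
# 5 A x 12 V = 60 W
# 6 A x 12 V = 72 W
```

So 6 A per wire works out to 72 W, which is roughly the 75 W the poster cites.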


----------



## EastCoasthandle (Jan 9, 2010)

> ...Element V Nvidia Edition chassis also incorporated graphic card “air duct” system engineered by Thermaltake and Nvidia to provide added cooling for high-performance 3-way SLI or Quad SLI setup based on Nvidia’s next generation of enthusiast graphic card.  The proprietary “air duct” system brings cool and fresh air directly from the outside of the chassis and accelerates it to graphic card’s intake to increase heat displacement and achieve optimal cooling efficiency.  Without Nvidia SLI certified chassis, system powered by the next generation of high-performance graphic cards may not be able to operate at their highest setting due to inadequate cooling...


source


HUH!? OK, I don't recall multi-video-card setups needing specific cooling solutions before. Is this marketing, or is it really needed? Hard to say.


----------



## Hayder_Master (Jan 10, 2010)

Why three PCI-E power connectors? It seems like 3x 8-pin power. With quad SLI you'd need a generator, not a PSU.


----------



## btarunr (Jan 10, 2010)

hayder.master said:


> Why three PCI-E power connectors? It seems like 3x 8-pin power. With quad SLI you'd need a generator, not a PSU.



Where do you see three connectors?


----------



## TAViX (Jan 10, 2010)

That picture is tricky, but I've finally understood it. There aren't 2 cables in 1 connector; it's just that the connector is in series with another one, and that cable is bent, so it gives the impression of 3 cables there, haha! Nice illusion. 


Regarding the card, it will definitely be faster than the 5870; the question is how much more expensive it will be, and how much more power-hungry.


----------



## Animalpak (Jan 10, 2010)

Rotated like this it's much better and gives a solid idea of what it looks like.


----------



## Hayder_Master (Jan 10, 2010)

Animalpak said:


> Rotated like this it's much better and gives a solid idea of what it looks like.
> 
> http://img.techpowerup.org/100110/61b.jpg





btarunr said:


> Where do you see three connectors?





This reversed picture clears things up; it's three tied cables, not three PCI-E power connectors. Now it seems to be 8-pin + 6-pin power.


----------



## Benetanegia (Jan 10, 2010)

TAViX said:


> That picture is tricky, but I've finally understood it. There aren't 2 cables in 1 connector; it's just that the connector is in series with another one, and that cable is bent, so it gives the impression of 3 cables there, haha! Nice illusion.
> 
> 
> Regarding the card, it will definitely be faster than the 5870; the question is how much more expensive it will be, and how much more power-hungry.








http://forums.techpowerup.com/showpost.php?p=1706875&postcount=14

btarunr addresses that on the first page!!!! Everything is well explained from the start. The discussion started, finished, and started again. If only people actually read before posting... :shadedshu

This isn't only directed at you, TAViX; it's for all the people still commenting on the cables.


----------



## [I.R.A]_FBi (Jan 10, 2010)

Benetanegia said:


> http://img.techpowerup.org/100107/bta198172he.jpg
> 
> http://forums.techpowerup.com/showpost.php?p=1706875&postcount=14
> 
> ...



you seem hurt


----------



## Benetanegia (Jan 10, 2010)

[I.R.A]_FBi said:


> you seem hurt



I am. Do you like seeing the same thing over and over again in the SAME THREAD? It's a pain, and I had enough of that in the HD5870 below-expectations thread, thank you. (Bo Fox, if you read this, no offense )


----------



## LAN_deRf_HA (Jan 10, 2010)

Based on everything I've read so far, it seems realistic that this card will be at most 10% faster than a 5870/295. It will most likely have very limited overclocking potential, bumping from a probable 600 MHz base clock to a 680-700 MHz max overclock. I'd say in that scenario a 5870 overclocked to 1000 MHz could probably match a 380 GTX (guessing on the name) overclocked to 700 MHz. 

So we'll most likely see the same overclocked performance from both cards. Given that, despite the 380 GTX being pricier, I'd still buy it for the ability to max out GTA (1.5 GB of memory standard) and for NVIDIA's vastly better drivers. It just seems like a better package to me.


----------



## [H]@RD5TUFF (Jan 10, 2010)

Steevo said:


> When they have extra power wires going to power their cards, and they are supposed to be making the card as we speak to go on the shelf. It is a problem, when the cards are obviously drawing more power than the original connector can provide, it does spell issues for users with standard connectors. Missing 50W of power can cause some issues.



You're really willing to make all these allegations and draw all these conclusions based on one picture...

Let it go and wait for a production model and a review before you go slinging mud and condemning this product. Sad, very sad. :shadedshu


----------



## wolf (Jan 11, 2010)

In a way it's great that they are keeping real performance numbers very secret. I suppose they want to catch ATi off guard when they release it, so that they can have the crown back for at least a few months before a refreshed 5k ATi card. If we knew now what it was going to produce, ATi could _*maybe*_ have something equal or better out when Fermi is released.

I still get the feeling that even _*if*_ Fermi is mind-blowingly awesome, a heavily OC'd 5970, or TriFire with 5850s, will still beat one of them. But what I really want is lower prices all round, so I can either buy a Fermi card, or another 5870, or maybe a 5970


----------



## HalfAHertz (Jan 11, 2010)

wolf said:


> In a way it's great that they are keeping real performance numbers very secret. I suppose they want to catch ATi off guard when they release it, so that they can have the crown back for at least a few months before a refreshed 5k ATi card. If we knew now what it was going to produce, ATi could _*maybe*_ have something equal or better out when Fermi is released.
> 
> I still get the feeling that even _*if*_ Fermi is mind-blowingly awesome, a heavily OC'd 5970, or TriFire with 5850s, will still beat one of them. But what I really want is lower prices all round, so I can either buy a Fermi card, or another 5870, or maybe a 5970



Has anyone seen this? 

http://www.fudzilla.com/content/view/17166/1/

I think the "refresh" is already here


----------



## TheMailMan78 (Jan 11, 2010)

HalfAHertz said:


> Has anyone seen this?
> 
> http://www.fudzilla.com/content/view/17166/1/
> 
> I think the "refresh" is already here



Wow that MSI one looks like shit!


----------



## kid41212003 (Jan 11, 2010)

That's nothing related to Fermi.... I was expecting something like "MSI Lightning GTX380 Pictured".


----------



## TheMailMan78 (Jan 11, 2010)

kid41212003 said:


> That's nothing related to Fermi.... I was expecting something like "MSI Lightning GTX380 Pictured".



It doesn't matter. That's not a refresh anyway.


Hmmmmmm, some of my posts have been deleted. I've been visited by the "Post Delete Fairy". Most of the time I wake up with an infraction under my pillow when he comes around!


----------



## TAViX (Jan 11, 2010)

HalfAHertz said:


> Has anyone seen this?
> 
> http://www.fudzilla.com/content/view/17166/1/
> 
> I think the "refresh" is already here



Well, the card is just a tuned 5870 and will probably be as fast as the future 5890 card...


----------



## wolf (Jan 11, 2010)

A refresh to me is an official ATi card that supersedes a 5870 or 5970, not just an OC model. Not to mention that 2 GB of RAM would have been nice to see, IMO, given it was done with a 4890.


----------



## HalfAHertz (Jan 11, 2010)

Well, we could start arguing that the 4890 was just an OC'd 4870 with a pinch of awesome sauce added.


----------



## zithe (Jan 11, 2010)

Hopefully they pick a plastic that doesn't look as cheap in the final product.


----------

