Thursday, January 7th 2010
NVIDIA GF100 Graphics Card Chugs Along at CES
NVIDIA's next-generation graphics card based on the Fermi architecture, whose consumer variant is internally referred to as GF100, got down to business at the CES event being held in Las Vegas, USA, performing a live demonstration of its capabilities. The demo PC houses one such accelerator, which resembles the card from past sightings, so it is safe to assume this is what NVIDIA's reference GF100 design will look like. The accelerator draws power from 6-pin and 8-pin power connectors. It has no noticeable back-plate, a black PCB, and a cooler shroud with typical NVIDIA styling. The demo rig was seen running Unigine Heaven in a loop, showing off the card's advanced tessellation capabilities in DirectX 11 mode. The most recent report suggests that market availability can be expected in March this year. No performance figures have been made public as yet. A short video clip after the break.
Source:
PCWatch
105 Comments on NVIDIA GF100 Graphics Card Chugs Along at CES
Because their naming schemes change every year....
So is GF100 entry-level? Best? Low-end.... uhm?
Gj getting something to show this round Nvidia!
well, they had issues finding internal codenames then :P
TNT-2
Geforce 3
Geforce 5600
Geforce 6600 gt
Geforce 6800 OC
Geforce 8600 gts
Geforce 8800 gts 320
Geforce 9800 gt 1 gig
GTX 260 216
GTX 285 SSC
No plans on selling. When the new card comes out I may buy.
The 8-pin connector sits mid-cable (a single line, in the middle of which is a connector).
No big deal... just means maybe production cards will be better/faster.
but yeah, I hear what you are saying... a 6-pin hangs from the 8-pin. But then why didn't they use just a single cable?
Good idea of length here too...that's a Raven RV01 case.
Capiche?
It's just the way the PSU's cables were designed. The "second" connector in the diagram above is what went into the card in the picture above. There's nothing more to it than this.
Again: why didn't they just use a single cable?
Meh... just something to talk about... :laugh:... clearly this is not a full production card, as there are still a few months left until release.
@ the naming scheme: I think this is supposed to be the 10th rendition of the GeForce series, hence the GF on the tag, which means they are more than likely counting the GT200 as GeForce 9 and omitting the 9800s, as the G92 first bore an 8800 designation.
It likely means NVIDIA is returning to a normal card-progression internal naming scheme.
Pure speculation, though, especially going by cores: without revisions, the series was only up to 6 on the GT200, which would make the GF100 the 7th.
Tom's Hardware on Fermi - www.tomshardware.com/reviews/ces-2010-fermi,2527-4.html He's claiming Fermi to be faster than the HD5970, and that's from an Nvidia guy this time. Real or not, it's one step closer to being an official statement, not rumors or fakes (now, were they really fakes?). I'll choose to believe him, because some months ago I made my own calculations based on the specs and reached that conclusion, as some of you may remember. We'll see, but I'm optimistic.
My head gets dizzy just trying to keep track of what's what.
And now it seems ATI is doing the same. Still not as bad as Nvidia's naming has been lately, but they have started, so they might follow in its footsteps :P
Also, did anyone notice that the guy running the demo kept switching modes during the really intense tessellation parts? It was either him doing it, or that's the way they had it set up to run.
In the parts that did run with full tessellation, it looked a little choppy at times. I do think it looked a little smoother than my single 5870... but not by much. Not enough to really impress me.
So let's say for a second that this thing really does beat a 5870 by 36%. Well, I wouldn't be surprised, given that it's only coming out half a year later. If you ask me, that number wouldn't be that impressive for six months later.
But I still like the idea of the technology that they are using. I also think it will be a great addition to the world of GPU's. It will keep things moving forward that is for sure!
But if I were Nvidia and that confident in its performance, I would have left the FPS counters at the bottom and the top out there for all to see. The fact that they didn't makes me wonder.