Friday, March 26th 2010

NVIDIA Preparing First Fermi-Derivative Performance GPU, GF104

With the launch of GeForce GTX 400 series enthusiast-grade graphics cards based on the GF100 GPU just a stone's throw away, work is said to be underway at NVIDIA on a new performance GPU to succeed the G92 and its various derivatives, according to German tech portal 3DCenter.org. Codenamed GF104, the new GPU targets the performance/price sweet spot the way G92 did in its day with the GeForce 8800 GT and 8800 GTS-512, which delivered near high-end performance at "unbelievable" price points. The GF104 is a derivative of the Fermi architecture, using a physically scaled-down version of the GF100 design. It is said to pack 256 CUDA cores, 32 ROPs, 32 TMUs, and a 256-bit wide GDDR5 memory interface. The resulting more compact die could achieve higher clock speeds, much as G92 did compared to G80.

GF104 is believed to form three SKUs spanning the performance-through-mainstream market segments: the GeForce GTS 450, GeForce GTS 440, and GeForce GTS 430 (likely names), in descending order of performance. Among these, the GTS 450 enables all of GF104's features and specifications, with well over 700 MHz core, 1500 MHz shader, and 1800 MHz memory clock speeds. This part could be priced around the 240 EUR mark, targeting performance levels of the ATI Radeon HD 5830. A notch lower, the GTS 440 keeps all of the hardware inside the GF104 enabled, but runs around 20% lower clock speeds, priced between 160 and 180 EUR. At the bottom is the so-called GTS 430, which could disable a few of the GPU's components, leaving 192 CUDA cores and a 192-bit GDDR5 memory interface, priced under $150. The lower two SKUs are intended to compete with the Radeon HD 5700 series. The source says the new SKUs could be out this summer.
Source: 3DCenter.org

36 Comments on NVIDIA Preparing First Fermi-Derivative Performance GPU, GF104

#1
DaMulta
My stars went supernova
HD 2900XT all over again?????
Posted on Reply
#2
shevanel
this answers my question... the one i had about which half-assed approach NV would take to compete in the mainstream market. although i will admit this sounds better than i had imagined. :roll:
Posted on Reply
#3
Binge
Overclocking Surrealism
wtf GF104? lammemeeeeeeeeeeee
Posted on Reply
#4
btarunr
Editor & Senior Moderator
Bingewtf GF104? lammemeeeeeeeeeeee
GF104 is a GPU codename like GF100. It won't be printed on boxes so nothing lame about the name.
Posted on Reply
#5
Unregistered
WTF!!!!! is this what nvidia promised????:rolleyes:


it's really lame, i want GTX 480 benched right now!!!!!:cry:
Posted on Reply
#6
roast
I'm expecting the GTX 470 and 480 to be fails, but on that note, with these GF104 chips I think NV could be on to a real winner. This could replace all of the G92 cards around, and if NV plans it right and sets the right price, this would be every OEM's new favorite chip.... rebranding every two months...

I'm not saying I'm happy about it, but it would be a great money spinner for nVidia
Posted on Reply
#7
kajson
first actually launch the main product, then start boasting about cheaper/better derivatives......

Or maybe they'll just postpone the main product indefinitely and move straight to the derivatives
Posted on Reply
#8
KainXS
well I can only expect that they will remove half the SMs completely and 2 memory controllers, but keep the layout of the SMs the same. If they did this then the GF104 would not have 64 TMUs, it would have 32; not even the GTX 480 has 64 TMUs. But since these chips will be so scalable, they could just drop SMs and keep adding new cards by mixing different numbers of SMs with different numbers of memory controllers.

woops read wrong it says 32
Posted on Reply
#9
EchoMan
So who would buy a gts 450 @ 240 EUR when you can buy a 5850 right now for less?
Posted on Reply
#10
Divide Overflow
btarunrGF104 is a GPU codename like GF100. It won't be printed on boxes so nothing lame about the name.
Actual product name GF404?
Posted on Reply
#11
TheGuruStud
EchoManSo who would buy a gts 450 @ 240 EUR when you can buy a 5850 right now for less?
Yeah, the price and perf is

They're gonna have to do better than that to compete.
Posted on Reply
#12
monte84
wasn't today supposed to be the release date?
Posted on Reply
#13
springs113
monte84wasn't today supposed to be the release date?
i thought the same over @ xtremes....they have posted some numbers and i must say ATI win NV Fail
Posted on Reply
#14
madswimmer
today is the release at pax east, supposed to have 5 ct
Posted on Reply
#15
Fourstaff
Divide OverflowActual product name GF404?
Error code 404, more like
Posted on Reply
#16
Roph
Nvidia are actually... replacing G92? Not renaming it again? G92 will finally die?

Credit where credit is due :toast:
FourstaffError code 404, more like
Posted on Reply
#17
gvblake22
If these tables prove accurate, these will be the first mid-range video cards not to feature a 128-bit memory bus: 192-bit for the lowest part and 256-bit for the middle and highest mid-range parts. I'm actually looking forward to the GTS 450!
Posted on Reply
#18
Imsochobo
gvblake22If these tables prove accurate, these will be the first mid-range video cards to not feature a 128-bit memory bus. The lowest is 192-bit and 256-bit for the middle and highest mid-range parts. I'm actually looking forward to the GTS450!
HD 5770.......
Posted on Reply
#19
PCpraiser100
It's a sign that Nvidia is hyping the party up to confuse people. Fanboys are having techgasms atm while we OC our HD 5000 cards and previous-gen cards at this time.
Posted on Reply
#20
erocker
*
People, right in the pictures it shows GTS 430, GTS 440 and GTS 450. I think you can come up with a good guess on what cards they will replace. ;)
Posted on Reply
#22
gvblake22
ImsochoboHD 5770.......
I may be reading this wrong, but it looks like it says Bus Width: 128 Bit...

Posted on Reply
#23
Imsochobo
gvblake22I may be reading this wrong, but it looks like it says Bus Width: 128 Bit...

tpucdn.com/reviews/HIS/HD_5770/images/gpuz_oc.gif
Sorry mate, late at night after a long day, or three long weeks with work every day, starting to get to me, finally holiday.

Read your comment and the specs wrong, was sure I read 128-bit, confused me :S

Still, these will be expensive, and yet all of them slower than the 5850, at the same production price...
No 64-bit float like the big bros.

What is there to them?
Posted on Reply
#24
gvblake22
ImsochoboSorry mate, late at night after a long day, or three long weeks with work every day, starting to get to me, finally holiday.
HAHA, no worries bro. ;)
ImsochoboStill, these will be expensive, and yet all of them slower than the 5850, at the same production price...
No 64-bit float like the big bros.
How do you know the production costs already? The target prices quoted in the article could be inflated by nVidia to start with and not indicative of actual production cost, leaving room for price drops and such. Either way though, those prices seem a bit on the high side, especially considering the very well received 5850 will keep dropping in price bit by bit until these things actually reach the market. Only time will tell I guess!
Posted on Reply
#25
Imsochobo
gvblake22HAHA, no worries bro. ;)

How do you know the production costs already?
GF100 die.

50% or maybe less, I dunno how much space the 2x float takes, but I reckon it's roughly around 270-330 mm².

ATI has 334 mm² for the 5870.

GTX 285 has ~484 mm².

The estimate suggests a fairly high production cost for the performance.

The question remains: is the GPGPU stuff that Nvidia touts the big deal, or won't it be much use for us?

Surely they HAVE to do something, cause they are getting, well, smashed between AMD and Intel, both making on-die NB, SB and GPU, making Nvidia chipsets with IGP obsolete!
Nvidia chipsets all in all.

That's huge income. Imagine, only high-end laptops would have Nvidia, that's just about 20%? Ouch...
Nvidia's GPGPU (Tesla) really is a great income source!

Will it affect us customers? The design cost for two separate designs (archs) really isn't worth it either.

Really no hard facts on the die size, just estimates, but well, at least 260 mm², unless Nvidia has a magic way to shrink the size of a CUDA core.
Posted on Reply