Tuesday, March 16th 2010

GeForce GTX 480 has 480 CUDA Cores?

In several of its communications about Fermi as a GPGPU product (the next-generation Tesla series) and the GF100 GPU, NVIDIA stated that the GF100 has 512 physical CUDA cores (shader units) on the die. In the run-up to the launch of the GeForce 400 series, however, it appears that the GeForce GTX 480, the higher-end part in the series, will have only 480 of its 512 physical CUDA cores enabled, sources at add-in card manufacturers confirmed to Bright Side of News. This means that 15 of the 16 SMs will be enabled. The card has a 384-bit GDDR5 memory interface holding 1536 MB of memory.

This could be seen as a move to keep the chip's TDP down and help with yields. It is unclear whether this is a late change; if it is, benchmark scores could differ from earlier leaks when the product is finally reviewed at launch. The publication believes the GeForce GTX 480 targets a price point of around $449-$499, while the GeForce GTX 470 is expected to be priced at $299-$349. The GeForce GTX 470 has 448 CUDA cores and a 320-bit GDDR5 memory interface holding 1280 MB of memory. In another report, by Donanim Haber, the TDP of the GeForce GTX 480 is expected to be 298 W, with the GeForce GTX 470 at 225 W. NVIDIA will unveil the two cards on March 26.
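The reported figures are internally consistent: GF100's 512 physical cores across 16 SMs work out to 32 CUDA cores per SM, and the memory sizes correspond to 256 MB per 64-bit channel of the bus. A minimal sketch of that arithmetic (the per-SM and per-channel figures are implied by the numbers above, not taken from an official spec sheet):

```python
# Illustrative arithmetic for the reported GF100 configurations.
# The 32-cores-per-SM and 256-MB-per-channel figures are implied by the
# numbers in the report above, not an official NVIDIA breakdown.

CORES_PER_SM = 32            # 512 physical cores / 16 SMs
MB_PER_64BIT_CHANNEL = 256   # 1536 MB / 6 channels on a 384-bit bus

def gf100_config(enabled_sms: int, bus_width_bits: int) -> dict:
    """CUDA core count and memory size implied by SM count and bus width."""
    channels = bus_width_bits // 64
    return {
        "cuda_cores": enabled_sms * CORES_PER_SM,
        "memory_mb": channels * MB_PER_64BIT_CHANNEL,
    }

print("GTX 480:", gf100_config(enabled_sms=15, bus_width_bits=384))
# -> GTX 480: {'cuda_cores': 480, 'memory_mb': 1536}
print("GTX 470:", gf100_config(enabled_sms=14, bus_width_bits=320))
# -> GTX 470: {'cuda_cores': 448, 'memory_mb': 1280}
```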
Sources: Bright Side of News, DonanimHaber

79 Comments on GeForce GTX 480 has 480 CUDA Cores?

#1
tkpenalty
nvidia's naming schemes useful for once? :rolleyes:
#2
Deleted member 3
tkpenalty: nvidia's naming schemes useful for once? :rolleyes:
Of course not, but it's not like AMD, Intel, or any other company has decent naming schemes. For some reason naming has to be as cryptic as possible.
#3
Kenshai
Just judging by the benchmarks released (I know they probably aren't accurate), if the GTX 470 is at the $300 price point, that's a pretty good price-to-performance card.
#4
Cleorina
512SP? Cool:roll: but 298W:banghead:
#5
slyfox2151
Sweet,

just gotta wait till the GTX 480 hits $400 AUD :D then I'll be happy
#6
wolf
Better Than Native
slyfox2151: Sweet, just gotta wait till the GTX 480 hits $400 AUD :D then I'll be happy
When that day comes I'll own two; until then my 5800s kick serious ass, especially @ 975/1250.

Odd to me, though, that the high-end part still has one cluster disabled, and the 470 only two disabled. Fermi, you are a troubled chip, aren't you?
#7
slyfox2151
wolf: When that day comes I'll own two; until then my 5800s kick serious ass, especially @ 975/1250.

Odd to me, though, that the high-end part still has one cluster disabled, and the 470 only two disabled. Fermi, you are a troubled chip, aren't you?
Just means the GTX 600 series will be epic :D

I hope... with the smaller process, 28 nm?
#8
afw
Hmm... does the price suggest that the GTX 480 will not beat the 5870 by a huge margin?
#9
wolf
Better Than Native
slyfox2151: Just means the GTX 600 series will be epic :D

I hope... with the smaller process, 28 nm?
yeah 32 or 28, something tells me Nvidia might hit 32 while ATI jumps for 28.

I've said it once and I'll say it again;

BRING ON THE REVIEWS
#10
Lionheart
Well, that's rather annoying. NVIDIA, you have disappointed me, but I'm gonna wait till next week Friday for the reviews and benchies on your new shiny cards :D
#11
kajson
If that card really has some of its cores locked up, it will be really hard to justify asking 50% more money for it on just 10% more cores/clocks and a bit more memory.

But indeed let's wait for the benchies..
#12
Steevo
Mwahahaha.....so all the green team who have been bashing ATI for weeks on the supposed performance have been fed bullshit lies on crap toast.
#13
alwayssts
wolf: yeah 32 or 28, something tells me Nvidia might hit 32 while ATI jumps for 28.

BRING ON THE REVIEWS
I think they'll both jump to 28 nm and have products with around the same 'listed' transistor count. I also believe they both will target a 1 GHz clock (2000 shaders for nVIDIA) with 7 Gbps memory on a 256-bit bus, both with 512 cores (that's 2560 SP for ATi). I think both products will be around 50% faster and perform relatively close to one another, especially if ATi adds more transistors to the fixed-function tessellation unit (it needs 2x performance). That, of course, is just me thinking, but I think it's plausible enough.

28nm is going to bring about a return of RV770/G92 for both companies. It should be interesting.
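For reference on the memory side of that speculation, peak bandwidth follows directly from per-pin data rate and bus width. A quick sketch using the hypothetical 7 Gbps / 256-bit combination from the post above (speculation, not an announced spec):

```python
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s from per-pin data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

# 7 Gbps on a 256-bit bus, as speculated above:
print(peak_bandwidth_gbs(7.0, 256))  # 224.0 GB/s
```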
#14
[I.R.A]_FBi
alwayssts: 28nm is going to bring about a return of RV770/G92 for both companies. It should be interesting.
whaddya mean?
#15
Steevo
How do we know what ATI needs? There aren't even competing cards out yet. Every one of the green team keeps assuming this card will be so awesome, yet it has been how long, and all we keep getting is spin, more spin, and more spin on how awesome it is and how much better it will be. If you really believe all this and feel that good, then keep on breathing the fumes, man.
#16
Imsochobo
alwayssts: I think they'll both jump to 28 nm and have products with around the same 'listed' transistor count. I also believe they both will target a 1 GHz clock (2000 shaders for nVIDIA) with 7 Gbps memory on a 256-bit bus, both with 512 cores (that's 2560 SP for ATi). I think both products will be around 50% faster and perform relatively close to one another, especially if ATi adds more transistors to the fixed-function tessellation unit (it needs 2x performance). That, of course, is just me thinking, but I think it's plausible enough.

28nm is going to bring about a return of RV770/G92 for both companies. It should be interesting.
I think you're far off.
Nvidia will go for 1024 shaders or some odd number; they're fans of that: 96, 112, 216 shaders and so on, and 384- and 448-bit memory buses...
Nvidia's design is already very complex.

Nvidia hasn't really set a trend, I think.

ATI:
55 nm -> 40 nm = twice the shaders at roughly +30% die size (loads of added functionality!)
40 nm -> 28 nm = no added functionality, a 5-10% die size increase, 3200 shaders, same ratio for ROPs, texture units and such.

Depends on whether ATI is certain that their architecture will still scale well!
ATI's tessellation unit isn't very bad really, not versus die size; ATI has proven to have a very good design when it comes to performance versus die size: cheaper to make, cheaper to sell if the competition is there.
#18
human_error
If NVIDIA really wanted to make sense with their names, shouldn't they call the 470 the 448 then? :p Still, I'm not surprised they have done this - increasing yields through redundancy is usually employed when designing silicon, only this is a reaction to poor yields as opposed to redundancy intended during the design phase. I wonder if you could unlock that disabled SIMD unit to get the full 512 cores working, as long as it isn't damaged :D
Imsochobo: Depends on whether ATI is certain that their architecture will still scale well!
ATi is already working on their next architecture - the 6k series was slated to have a different architecture from the HD 2k/3k/4k/5k series. Due to NVIDIA being so late to the party, ATi may decide to delay the 6k series release until this time next year and release a refresh of the 5k series in September; then again, they may decide to really hit NVIDIA hard and stick to their original September launch of the 6k series.
#19
tofu
Or perhaps nVidia is so confident that Fermi will perform well at its price point that they decided to disable a cluster to improve yields and pave the way to release a GTX 485 later on.

Just throwing some thoughts out there.
#20
phanbuey
human_error: ATi is already working on their next architecture - the 6k series was slated to have a different architecture from the HD 2k/3k/4k/5k series. Due to NVIDIA being so late to the party, ATi may decide to delay the 6k series release until this time next year and release a refresh of the 5k series in September; then again, they may decide to really hit NVIDIA hard and stick to their original September launch of the 6k series.
That would be a mistake on their part, IMO... I think that's what NVIDIA tried to do with the G92, and it bit them in the proverbial ass pretty hard.

I think they should pay a bit more heed to Murphy's law and make the product for September. But who knows what will happen... they probably have people much smarter than me sitting in a room somewhere thinking about this all day long.
#21
freaksavior
To infinity ... and beyond!
What annoys me the most (sorry to those who do this) is when a series of cards isn't even out and people start talking about how their "next" version is going to be so much better.

Seriously, let's wait for what's not even out first.
#22
BraveSoul
With those prices, the GTX 470 seems like a good deal.
The folding power it will bring :rolleyes:
#23
theonedub
habe fidem
BraveSoul: With those prices, the GTX 470 seems like a good deal.
The folding power it will bring :rolleyes:
Those prices look very reasonable :D I think I better sell my GTX 275s soon! Someone needs to take the plunge first, I want to know about heat, shader clocks, and PPD :rockout:
#24
jasper1605
phanbuey: That would be a mistake on their part, IMO... I think that's what NVIDIA tried to do with the G92, and it bit them in the proverbial ass pretty hard.

I think they should pay a bit more heed to Murphy's law and make the product for September. But who knows what will happen... they probably have people much smarter than me sitting in a room somewhere thinking about this all day long.
Not to mention getting paid a lot more to sit thinking all day much like we do (just without the money) lol
#25
KainXS
I'm trying to think of nice things to say about the 480, but when I think about the shader clocks being permanently linked to the core, since the core clock is half the shader clock...

Well, in a little while we'll know for sure whether overclocking sucks or not.

Hopefully it can be unlinked like on the other cards, but with these new cards you never know T.T
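On the clock linkage mentioned above: if the core domain really runs at a fixed half of the shader clock, any shader overclock drags the core clock along with it. A trivial sketch assuming that 1:2 ratio (as reported in this thread, not a confirmed spec) and hypothetical shader clocks:

```python
def linked_core_clock(shader_mhz: float, ratio: float = 0.5) -> float:
    """Core clock implied by a fixed core:shader ratio (reportedly 1:2 on GF100)."""
    return shader_mhz * ratio

# Hypothetical shader overclocks: the core clock moves with them.
for shader in (1400.0, 1500.0, 1600.0):
    print(shader, "->", linked_core_clock(shader))
# 1400.0 -> 700.0, 1500.0 -> 750.0, 1600.0 -> 800.0
```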