# GeForce GTX 480 has 480 CUDA Cores?



## btarunr (Mar 16, 2010)

In several of its communications about Fermi as a GPGPU product (the next-gen Tesla series) and the GF100 GPU, NVIDIA said GF100 has 512 physical CUDA cores (shader units) on the die. In the run-up to the launch of the GeForce 400 series, however, it appears the GeForce GTX 480, the higher-end part in the series, will ship with only 480 of those 512 CUDA cores enabled, sources at add-in card manufacturers confirmed to Bright Side of News. That means 15 of the chip's 16 streaming multiprocessors (SMs) will be active. The card has a 384-bit GDDR5 memory interface holding 1536 MB of memory.

This could be seen as a move to keep the chip's TDP down and help with yields. It's unclear whether this is a late change; if it is, benchmark scores could differ when the product is finally reviewed at launch. The publication believes the GeForce GTX 480 targets a price point of around $449-499, while the GeForce GTX 470 is expected to be priced at $299-349. The GTX 470 has 448 CUDA cores and a 320-bit GDDR5 memory interface holding 1280 MB of memory. In a separate report by Donanim Haber, the GTX 480's TDP is put at 298 W, with the GTX 470 at 225 W. NVIDIA will unveil both cards on March 26.
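As a sanity check on the rumored configurations: the core counts follow directly from the number of enabled SMs, since GF100 packs 32 CUDA cores per SM. A minimal sketch of that arithmetic (the SM counts are the rumored figures above, not confirmed specs):

```python
# GF100 organizes its CUDA cores into Streaming Multiprocessors (SMs),
# 32 cores per SM. Disabling SMs is how the rumored parts are cut down.
CORES_PER_SM = 32

def cuda_cores(enabled_sms: int) -> int:
    """CUDA core count for a GF100 part with the given number of enabled SMs."""
    return enabled_sms * CORES_PER_SM

full_die = cuda_cores(16)  # full die: 16 SMs
gtx_480 = cuda_cores(15)   # rumored GTX 480: one SM disabled
gtx_470 = cuda_cores(14)   # rumored GTX 470: two SMs disabled

print(full_die, gtx_480, gtx_470)  # 512 480 448
```

The 512/480/448 figures in the reports are consistent with one another under this scheme.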





*View at TechPowerUp Main Site*


----------



## tkpenalty (Mar 16, 2010)

nvidia's naming schemes useful for once?


----------



## Deleted member 3 (Mar 16, 2010)

tkpenalty said:


> nvidia's naming schemes useful for once?



Of course not, but it is not like AMD, Intel or whatever company have decent naming schemes. For some reason naming has to be as cryptic as possible.


----------



## Kenshai (Mar 16, 2010)

Just judging by the released benchmarks (I know they probably aren't accurate), if the GTX 470 is at the $300 price point, that's a pretty good price-to-performance card.


----------



## Cleorina (Mar 16, 2010)

512SP? Cool but 298W


----------



## slyfox2151 (Mar 16, 2010)

Sweet,

just gotta wait till the GTX 480 hits $400 AUD, then I'll be happy


----------



## wolf (Mar 16, 2010)

slyfox2151 said:


> Sweet,
> 
> just gotta wait till the GTX 480 hits $400 AUD, then I'll be happy



when that day comes I'll own two; until then my 5800s kick serious ass, especially @ 975/1250.

Odd to me though that the high-end part still has one cluster disabled, and the 470 only two disabled. Fermi, you are a troubled chip, aren't you.


----------



## slyfox2151 (Mar 16, 2010)

wolf said:


> when that day comes I'll own two; until then my 5800s kick serious ass, especially @ 975/1250.
> 
> Odd to me though that the high-end part still has one cluster disabled, and the 470 only two disabled. Fermi, you are a troubled chip, aren't you.



just means the GTX 600 series will be epic 

I hope... with the smaller process, 28 nm?


----------



## afw (Mar 16, 2010)

Hmm ... Does the price suggest that the GTX 480 will not beat 5870 by a huge margin ... ??


----------



## wolf (Mar 16, 2010)

slyfox2151 said:


> just means the GTX 600 series will be epic
> 
> I hope... with the smaller process, 28 nm?



yeah 32 or 28, something tells me Nvidia might hit 32 while ATI jumps for 28.

I've said it once and I'll say it again;

BRING ON THE REVIEWS


----------



## Lionheart (Mar 16, 2010)

Well, that's rather annoying. Nvidia, you have disappointed me, but I'm gonna wait till next week Friday for the reviews and benchies of your new shiny cards


----------



## kajson (Mar 16, 2010)

If that card really has some of its cores locked away, it will be really hard to justify asking 50% more money for just ~10% more cores/clocks and a bit more memory. 

But indeed let's wait for the benchies..


----------



## Steevo (Mar 16, 2010)

Mwahahaha..... so all the green team members who have been bashing ATI for weeks over the supposed performance have been fed bullshit lies on crap toast.


----------



## alwayssts (Mar 16, 2010)

wolf said:


> yeah 32 or 28, something tells me Nvidia might hit 32 while ATI jumps for 28.
> 
> BRING ON THE REVIEWS



I think they'll both jump to 28nm, and have products with around the same 'listed' transistors.  I also believe they both will target a 1ghz (2000 shader for nVIDIA) clock with 7gbps memory on a 256-bit bus, both with 512 cores (that's 2560sp for ATi).  I think both products will have performance around 50% faster and perform relatively close to one another...especially if ATi adds more transistors to the fixed function tessellation unit (it needs 2x performance).  That of course, is just me thinking, but I think it's plausible enough.

28nm is going to bring about a return of RV770/G92 for both companies.  It should be interesting.


----------



## [I.R.A]_FBi (Mar 16, 2010)

alwayssts said:


> 28nm is going to bring about a return of RV770/G92 for both companies.  It should be interesting.



whaddya mean?


----------



## Steevo (Mar 16, 2010)

How do we know what ATI needs? There aren't even competing cards out yet. Every one of the green team keeps assuming this card will be so awesome, yet it has been how long? And all we keep getting is spin, more spin, and more spin on how awesome it is and how much better it will be. If you really believe all this and feel that good, then keep on breathing the fumes, man.


----------



## Imsochobo (Mar 16, 2010)

alwayssts said:


> I think they'll both jump to 28nm, and have products with around the same 'listed' transistors.  I also believe they both will target a 1ghz (2000 shader for nVIDIA) clock with 7gbps memory on a 256-bit bus, both with 512 cores (that's 2560sp for ATi).  I think both products will have performance around 50% faster and perform relatively close to one another...especially if ATi adds more transistors to the fixed function tessellation unit (it needs 2x performance).  That of course, is just me thinking, but I think it's plausible enough.
> 
> 28nm is going to bring about a return of RV770/G92 for both companies.  It should be interesting.



I think you're far off.
Nvidia will go with 1024 shaders or some other odd number; they're fans of those: 96, 112, 216 shaders and so on, plus 384-bit and 448-bit memory buses.
Nvidia's design is already very complex.

Nvidia hasn't really set a trend, I think.

ATI:
55nm -> 40nm = twice the shaders at +30% die size (with loads of added functionality!)
40nm -> 28nm = no added functionality, a 5-10% die size increase, 3200 shaders, and the same ratio for ROPs, texture units and such.

It depends on whether ATI is certain their architecture will still scale well!
ATI's tessellation unit isn't bad at all relative to die size; ATI has proven to have a very good design in terms of performance versus die size: cheaper to make, and cheaper to sell if the competition is there.


----------



## pr0n Inspector (Mar 16, 2010)

more bullshit rumours:shadedshu.


----------



## human_error (Mar 16, 2010)

If NVIDIA really wanted their names to make sense, shouldn't they call the 470 the 448 then?  Still, I'm not surprised they have done this: increased yield through redundancy is usually employed when designing silicon, only here it's a reaction to poor yields rather than redundancy intended from the design phase. I wonder if you could unlock that disabled SM to get the full 512 cores working, as long as it isn't damaged 



Imsochobo said:


> It depends on whether ATI is certain their architecture will still scale well!



ATi is already working on their next architecture; the 6k series was slated to have a different architecture to the HD 2k/3k/4k/5k series. With Nvidia being so late to the party, ATi may decide to delay the 6k series until this time next year and release a refresh of the 5k series in September; then again, they may decide to really hit Nvidia hard and stick to their original September launch of the 6k series.


----------



## tofu (Mar 16, 2010)

Or perhaps nVidia is so confident that Fermi will perform well at its price point that they decided to disable a cluster to improve yields and pave the way for releasing a GTX 485 later on. 

Just throwing some thoughts out there.


----------



## phanbuey (Mar 16, 2010)

human_error said:


> ATi is already working on their next architecture; the 6k series was slated to have a different architecture to the HD 2k/3k/4k/5k series. With Nvidia being so late to the party, *ATi may decide to delay the 6k series until this time next year and release a refresh of the 5k series in September*; then again, they may decide to really hit Nvidia hard and stick to their original September launch of the 6k series.




That would be a mistake on their part IMO... I think that's what Nvidia tried to do with the G92, and it bit them in the proverbial ass pretty hard.

I think they should pay a bit more heed to Murphy's law and make the product for September.  But who knows what will happen... they probably have people much smarter than me sitting in a room somewhere thinking about this all day long.


----------



## freaksavior (Mar 16, 2010)

What annoys me the most (sorry to those who do this) is when a series of cards isn't even out and people start talking about the "next" version and how it's going to be so much better. 

Seriously, let's wait for what's not even out yet first.


----------



## BraveSoul (Mar 16, 2010)

with those prices, gtx470 seems like a good deal, 
the folding power it will bring


----------



## theonedub (Mar 16, 2010)

BraveSoul said:


> with those prices, gtx470 seems like a good deal,
> the folding power it will bring



Those prices look very reasonable  I think I'd better sell my GTX 275s soon! Someone needs to take the plunge first; I want to know about heat, shader clocks, and PPD


----------



## jasper1605 (Mar 16, 2010)

phanbuey said:


> That would be a mistake on their part IMO... I think that's what Nvidia tried to do with the G92, and it bit them in the proverbial ass pretty hard.
> 
> I think they should pay a bit more heed to Murphy's law and make the product for September.  But who knows what will happen... they probably have people much smarter than me sitting in a room somewhere thinking about this all day long.



Not to mention getting paid a lot more to sit thinking all day much like we do (just without the money) lol


----------



## KainXS (Mar 16, 2010)

I'm trying to think of nice things to say about the 480, but when I think about the shader clocks being permanently linked to the core, since the core clock is half the shader clock... 

well, in a little while we'll know for sure whether overclocking sucks or not

hopefully it can be unlinked like on the other cards, but with these new cards you never know T.T


----------



## qubit (Mar 16, 2010)

The more news tidbits that are released, the more this chip sounds like another HD 2900 disappointment. 

As Wolf said: bring on the reviews.


----------



## Benetanegia (Mar 16, 2010)

KainXS said:


> I'm trying to think of nice things to say about the 480, but when I think about the shader clocks being permanently linked to the core, since the core clock is half the shader clock...



It's not exactly like that, IIRC; the shaders are not linked to the core clock. It's somewhat difficult to say whether the new approach makes the chip worse or better, though. Everything inside the GPC (TMUs, tessellator, setup engine, rasterizer) runs at half the speed of the SPs (the hot clock), but everything outside (ROPs, main scheduler, L2 cache) runs at the core clock.

IMO it's mostly a good thing, because the only significant move (to the higher clock) is the TMUs. The setup engine, rasterizer and tessellators are supposedly much smaller than the SPs, TMUs or ROPs, so they should not keep the shader domain from reaching higher clocks, nor affect the temperatures and stability of the GPC, IMO. The units that are supposed to be more sensitive to clocks, like the ROPs and L2, remain at the slower core clock.


----------



## newconroer (Mar 16, 2010)

Steevo said:


> How do we know what ATI needs? There aren't even competing cards out yet. Every one of the green team keeps assuming this card will be so awesome, yet it has been how long? And all we keep getting is spin, more spin, and more spin on how awesome it is and how much better it will be. If you really believe all this and feel that good, then keep on breathing the fumes, man.



Putting the green versus red bit aside, Steve unlocks an even greater point :
The architecture of both GPUs and the software they run hasn't changed fundamentally. So until that does, all of the hype on upcoming GPUs as we know them, is neither here nor there.
What we'll get, is another powerful card, that still falls down like all the rest, in all the same places, for all the same reasons - of which are the same reasons that have existed for the last decade or more. 

To me that's not impressive. I like big, I like powerful; it's how I like my vehicles, but not my computer components. I'm tired of getting bigger cases, bigger motherboards, bigger radiators and bigger PSUs, only to have the overrated max FPS of a game plummet straight back down to twenty-five frames because another character strolled onto the screen and all these supposed DirectX special effects, which unfortunately we cannot actually see, have just sucked away the performance.

Don't get me wrong, I'm pro Direct X. When people were crying and whining about DX10 being a failure, I wasn't. I understood, I got it. Had you tried to run a lot of the background processing of DX10 on DX9(if it was possible) it wouldn't be a pretty sight, and DX11 brings some much needed tools for developers.

But I'm just not 'pro waiting six months or more every year' to see these 'fabled' graphics processors be put on a pedestal, and be released and yet don't provide anything really tangible over the last generation.

Consider that on brute power and computational flexibility alone, something like a GTX 295 or 4870X2 should be MORE than enough for modern games, and they usually are. Heck, I can run the X2 at clocks of 500/500 in about 90% of modern games and still have over 50fps. But then you get those moments where it all comes crashing down, and no matter how powerful the cards, it never ends.


----------



## qubit (Mar 16, 2010)

newconroer said:


> Putting the green versus red bit aside, Steve unlocks an even greater point :
> The architecture of both GPUs and the software they run hasn't changed fundamentally. So until that does, all of the hype on upcoming GPUs as we know them, is neither here nor there.
> What we'll get, is another powerful card, that still falls down like all the rest, in all the same places, for all the same reasons - of which are the same reasons that have existed for the last decade or more.
> 
> ...



I know what you mean by the same old same old, man. I hate those frame rate drops too. However, bear in mind that quite often that low-fps bottleneck can also be at the CPU, not necessarily the GPU. Keeping those frame rates high and consistent is a big challenge when designing a game and, unfortunately, it's not possible to prevent high-complexity/detail scenes from tanking the frame rate sometimes. This is why I love running my old DX7 games on my big, grossly overpowered rig: even the lowest points are doing over 100fps (if vsync is unlocked) and the game runs smooth as butter 100% of the time.


----------



## the54thvoid (Mar 16, 2010)

freaksavior said:


> What annoys me the most (sorry to those who do this) is when a series of cards isn't even out and people start talking about the "next" version and how it's going to be so much better.
> 
> Seriously, let's wait for what's not even out yet first.




+1 dude.


----------



## phanbuey (Mar 16, 2010)

newconroer said:


> Putting the green versus red bit aside, Steve unlocks an even greater point :
> The architecture of both GPUs and the software they run hasn't changed fundamentally. So until that does, all of the hype on upcoming GPUs as we know them, is neither here nor there.
> What we'll get, is another powerful card, that still falls down like all the rest, in all the same places, for all the same reasons - of which are the same reasons that have existed for the last decade or more.
> 
> ...




I see your point, but at the same time I don't.  Yeah, they all come crashing down, but crashing down for an X2 or 295 is like 25 fps, which is annoying but still OK.  Crashing down for a single 260 or 4870 is like 15 fps... which is just jerky enough to send me into epileptic shock.

plus

All brand new architectures are ahead of their time, because no developer will spend oodles of money and time to develop a game for a hardware feature that 0.0001% of the gaming market has.  (Sometimes a token game comes out with ATI/NV sponsorship, but that changes nothing.)

It's just exciting because these cards do bring something new to the table... unlike the 4xxx or GT200 or G92 or RV670; it's been a long time since that has happened.


----------



## 1freedude (Mar 16, 2010)

Fermi is meant for parallel processing.

They disabled an SM to control TDP.


----------



## saikamaldoss (Mar 16, 2010)

qubit said:


> The more news tidbits that are released, the more this chip sounds like another HD 2900 disappointment.
> 
> As Wolf said: bring on the reviews.



Don't worry, there are some GREEN web sites like GURU3D that will do some 8xAA benchmarks and say ATI was beaten by 10%, but they will not try 8xQ, which is the real 8x for Nvidia 

and there are fools who support that.....


----------



## v12dock (Mar 16, 2010)

bright side of the news is more like green side of the news


----------



## OnBoard (Mar 16, 2010)

tofu said:


> Or perhaps nVidia is so confident that Fermi will perform well at its price point that they decided to disable a cluster to improve yields and pave the way for releasing a GTX 485 later on.
> 
> Just throwing some thoughts out there.



I think they'll first make a GTX 495 with two GTX 470s, but with 512 cores. Then later, with better yields, a GTX 485 with some OC.


----------



## Marineborn (Mar 16, 2010)

saikamaldoss said:


> Don't worry, there are some GREEN web sites like GURU3D that will do some 8xAA benchmarks and say ATI was beaten by 10%, but they will not try 8xQ, which is the real 8x for Nvidia
> 
> and there are fools who support that.....



+1, team green will release their next set of cards in 4 yrs, lol


----------



## Edito (Mar 16, 2010)

Where did all the hate for nVidia's Fermi go??? Anyone??? lol. Don't get me wrong, I like ATI, but I knew this would happen. Nice move, Nvidia; the good price is the key. And nice work, ATI/AMD, on keeping Nvidia's prices in check...


----------



## eidairaman1 (Mar 17, 2010)

So where's the board at then, huh Nvidia, huh!? I don't see it in Wizzard's or other members' hands. To me it's deader than dead itself.


----------



## simlariver (Mar 17, 2010)

qubit said:


> The more news tidbits that are released, the more this chip sounds like another HD 2900 disappointment.
> 
> As Wolf said: bring on the reviews.



x2

All these ongoing delays, the lack of communication from Nvidia and those rumours about an insanely high TDP are clearly pointing to a disaster release from Nvidia. If they were confident in their product, they would brag about it to no end instead of hiding it far from the rogue benchmarkers.


----------



## locoty (Mar 17, 2010)

Shot of Die A3 Stepping of Fermi

*(die-shot images no longer available)*

more HERE


----------



## pr0n Inspector (Mar 17, 2010)

Look at all the scared fanboys trying to convince themselves that these cards are full of fail. How entertaining.


----------



## OnBoard (Mar 17, 2010)

Naked pictures are always nice, be they cores or more fleshy stuff 

So at least 87 cores ready  I wonder if they leave that marker stuff under the IHS, or if those are just quality assurance samples.


----------



## Benetanegia (Mar 17, 2010)

locoty said:


>



_Stares at pic._

Oh nooooo! Fermi A3 silicon also has 2% yields. Because we can clearly see a number 2 written over that die, and as everybody knows, when a company gets few samples back from the factory they write numbers on them (and only when they get very few of them; otherwise they would never write over them, it would be stupid to do so) and always ALWAYS show the one with the highest number, in this case a 2 (in another pic, a 7 ). That number clearly means 2% yields.


----------



## Wile E (Mar 17, 2010)

The gap between 470 and 480 is now far too small. This is very bad news, imo.


----------



## xtremesv (Mar 17, 2010)

I keep saying... rumooooorrrrrs. Show me the numbers.

There's no attack on Nvidia here; it's just fair to recognize when there's a disappointment. That's how human civilization has evolved: learning from errors.

ATi, for instance, failed with their HD2xxx/3xxx series; that's why I kept my X1650 Pro a little longer. Nvidia delivered the mythical G92, and the sole reason I chose my HD4830 over the 9800GT was the price (in my country Nvidia is really overpriced).

Nvidia failed with the FX series and ATi triumphed with the 9xxx (9 a lucky number?).

There's no need to fight; even a fanboy has to admit when his/her company screwed up.

More rumors? See this:

http://www.semiaccurate.com/2010/02/20/semiaccurate-gets-some-gtx480-scores/


----------



## v12dock (Mar 17, 2010)

"One said they measured it at 70C at idle on the 2D clock."


----------



## Bjorn_Of_Iceland (Mar 17, 2010)

phanbuey said:


> ...
> All brand new architectures are ahead of their time, because no developer will spend oodles of money and time to develop a game for a hardware feature that 0.0001% of the gaming market has.  (Sometimes a token game comes out with ATI/NV sponsorship, but that changes nothing.)
> 
> It's just exciting because these cards do bring something new to the table... unlike the 4xxx or GT200 or G92 or RV670; it's been a long time since that has happened.


It really is hit or miss when it comes to deploying new architectures / hardware-level instructions. Heck, when MMX came out, we all thought it was the future of games... but then graphics accelerators squashed it. And tbh, when hardware T&L came out, very few games were using it and I thought it was just another fad... but then it became a staple of all games.


----------



## theonedub (Mar 17, 2010)

Wile E said:


> The gap between 470 and 480 is now far too small. This is very bad news, imo.



How so? Looking back at the GT200 cards, the gap between the GTX 400 cards is actually bigger. I know ATI cards are a little different, but by nVidia standards this is about right.


----------



## Marineborn (Mar 17, 2010)

pr0n Inspector said:


> Look at all the scared fanboys trying to convince themselves that these cards are full of fail. How entertaining.



IMO they are full of fail, released 6-7 months later than they should have been.


----------



## Wile E (Mar 17, 2010)

theonedub said:


> How so? Looking back at the GT200 cards, the gap between the GTX 400 cards is actually bigger. I know ATI cards are a little different, but by nVidia standards this is about right.



Even the GTX 280 had 11% more shaders than the 260 Core 216 (an even bigger difference vs. the original 192-shader 260s). If this rumor is true, the 480 only has 7% more than the 470.
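For what it's worth, the gaps being described work out as follows; this is just the arithmetic on the shader counts quoted in the thread (the 400-series numbers are still rumored), sketched in Python:

```python
# Relative shader-count gap between the top two parts of each generation,
# using the counts quoted in the thread (400-series figures are rumors).
def pct_more(bigger: int, smaller: int) -> float:
    """Percentage by which `bigger` exceeds `smaller`."""
    return (bigger - smaller) / smaller * 100

gt200_gap = pct_more(240, 216)  # GTX 280 (240 SPs) vs GTX 260 Core 216
gf100_gap = pct_more(480, 448)  # rumored GTX 480 vs GTX 470

print(round(gt200_gap, 1), round(gf100_gap, 1))  # 11.1 7.1
```

So the relative spacing between the top two parts does shrink from roughly 11% to roughly 7%, as the post argues.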


----------



## DeathByTray (Mar 17, 2010)

I can't wait to see this farce coming to an end.


----------



## theonedub (Mar 17, 2010)

Wile E said:


> Even the GTX 280 had 11% more shaders than the 260 Core 216 (an even bigger difference vs. the original 192-shader 260s). If this rumor is true, the 480 only has 7% more than the 470.



I was thinking that the GTX275 and GTX285 were more similar to the GTX470 and GTX480, in which case there is a bigger gap this go 'round. I don't know how fair comparing a 260 to a 280 is (aside from that they are both launch cards), but I see where you are going.


----------



## DirectorC (Mar 17, 2010)

450/300 sounds a lot closer to what I was hoping they would release the cards at than what everyone else was saying.  This shows nVidia isn't completely bonkers.  For 300 I will stay with the green team.


----------



## phanbuey (Mar 17, 2010)

this will be a solid card... it would have been a solid card 8 months ago, when it should have been released, but these things can happen when one tries to cram 20lbs of ***t in a 10lb bag.


----------



## Hayder_Master (Mar 17, 2010)

At this price it will be a good deal, better than ATI; and about the 512 cores, I think they're planning a GTX 490


----------



## KainXS (Mar 17, 2010)

Wile E said:


> Even the GTX 280 had 11% more shaders than the 260 Core 216 (an even bigger difference vs. the original 192-shader 260s). If this rumor is true, the 480 only has 7% more than the 470.



yeah but the GTX470 will have about 20% less raster performance than the GTX480

still I wouldn't buy a GTX480 for 500 but im just sayin(tryin to think of somethin nice to say about em)


----------



## phanbuey (Mar 18, 2010)

KainXS said:


> yeah but the GTX470 will have about 20% less raster performance than the GTX480
> 
> still I wouldn't buy a GTX480 for 500 but im just sayin(tryin to think of somethin nice to say about em)



but it sounds like the bulk of the difference will come from artificially low core/memory clocks... as soon as that bad boy is OC'd, it will be right up there with the 480.


----------



## wahdangun (Mar 18, 2010)

phanbuey said:


> but it sounds like the bulk of the difference will come from artificially low core/memory clocks... as soon as that bad boy is OC'd, it will be right up there with the 480.



IF they are OC-able, but I doubt it, because the core is already hot as hell


----------



## Super XP (Mar 18, 2010)

DanTheBanjoman said:


> Of course not, but it is not like AMD, Intel or whatever company have decent naming schemes. For some reason naming has to be as cryptic as possible.


Yes, to confuse the crap out of people so they don't know what the hell they are buying.


KainXS said:


> yeah but the GTX470 will have about 20% less raster performance than the GTX480
> 
> still I wouldn't buy a GTX480 for 500 but im just sayin(*tryin to think of somethin nice to say about em*)


Finding it difficult, eh?  Yes, you're not alone


----------



## Bjorn_Of_Iceland (Mar 18, 2010)

hayder.master said:


> At this price it will be a good deal, better than ATI; and about the 512 cores, I think they're planning a GTX 490


Yes. But ATI can lower their current cards' prices anytime, opening a huge gap in performance per dollar.


----------



## TAViX (Mar 18, 2010)

.....Yeah, but AMD/ATI has 1600 shader cores!!! More than three times as many!



xtremesv said:


> I keep saying... rumooooorrrrrs. Show me the numbers.



*Numbers are irrelevant! Benching is futile!*

Half a year is almost an eternity in a very tight, competitive market.


----------



## Bjorn_Of_Iceland (Mar 18, 2010)

1 ati shader core != 1 cuda core


----------



## bobzilla2009 (Mar 18, 2010)

Bjorn_Of_Iceland said:


> 1 ati shader core != 1 cuda core



Indeed, they are pretty much incomparable. But as a layman's 'rough' conversion, it has generally been 5 ATi cores = 1 Nvidia core for equal-performance cards for a while now. Nothing to do with efficiency or one company being better etc.; it's just architectural differences between them. It looks like it'll stay that way this time, roughly (maybe 4 = 1 this time).

Also, if those prices are correct, AMD will still win out. People have forgotten how long the HD 5xxx series has been out. OF COURSE AMD is going to go on a price-slashing spree; if Nvidia can only match the HD 5xxx series' prices from six months ago, then big whoop, they should've done so six months ago! Which is going to be awesome for everyone; I personally can't wait to see sub-£200 HD 5850s.

On a side note, Catalyst 10.3a has added pretty decent performance boosts in a lot of games. These are the drivers ATi has obviously been saving for the eventual release of Fermi (judging by how there was always talk from AMD of awesome drivers whenever a Fermi release date was close; now that it's final, they've been released).
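The "rough conversion" described above can be sketched as follows; the 5:1 ratio is the post's own rule of thumb for same-era cards, not an architectural fact, and the HD 5870 figure is its 1600-SP spec:

```python
# Hand-wavy rule of thumb from the post: roughly 5 ATI stream processors
# per 1 NVIDIA CUDA core for similarly performing cards of this era.
# NOT an architectural equivalence; just a layman's comparison aid.
ATI_TO_NV_RATIO = 5

def nv_core_equivalent(ati_sps: int, ratio: int = ATI_TO_NV_RATIO) -> float:
    """Convert an ATI stream-processor count into rough 'CUDA core equivalents'."""
    return ati_sps / ratio

print(nv_core_equivalent(1600))  # HD 5870's 1600 SPs -> 320.0
```

Under this rough yardstick, the HD 5870's 1600 SPs land near a 320-core NVIDIA part, which is why raw shader counts between the two vendors can't be compared directly.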


----------



## Super XP (Mar 18, 2010)

Bjorn_Of_Iceland said:


> 1 ati shader core != 1 cuda core


It doesn't matter even if it were 100 ATi shaders = 1 CUDA core; the two hardware designs work differently and may provide similar results.


----------



## nt300 (Mar 18, 2010)

Good point. The Cat 10.3 drivers are wonderful, but I can't wait for Cat 10.4 to come out


----------



## DirectorC (Mar 18, 2010)

http://vr-zone.com/articles/nvidia-geforce-gtx-480-final-specs--pricing-revealed/8635.html

*GeForce GTX 480 : 512 SP, 384-bit, 295W TDP, US$499

GeForce GTX 470 : 448 SP, 320-bit, 225W TDP, US$349*


----------



## nt300 (Mar 18, 2010)

DirectorC said:


> http://vr-zone.com/articles/nvidia-geforce-gtx-480-final-specs--pricing-revealed/8635.html
> 
> *GeForce GTX 480 : 512 SP, 384-bit, 295W TDP, US$499
> 
> GeForce GTX 470 : 448 SP, 320-bit, 225W TDP, US$349*


If this is true, I don't see a dual Fermi coming out anytime soon unless they make it with GTX 470 chips and lower clock speeds. Dang, that 480 is cooking up a storm. I can see that much heat output from a dual-GPU card like the HD 5970, but from a single GPU? Good luck keeping that 480 cool. I don't see much overclocking headroom right now.


----------



## DirectorC (Mar 19, 2010)

The guy at VR-Zone changed his mind and put these numbers up:

*GeForce GTX 480 : 480 SP, 700/1401/1848MHz core/shader/mem, 384-bit, 1536MB, 295W TDP, US$499

GeForce GTX 470 : 448 SP, 607/1215/1674MHz core/shader/mem, 320-bit, 1280MB, 225W TDP, US$349*


----------



## Super XP (Mar 19, 2010)

DirectorC said:


> The guy at VR-Zone changed his mind and put these numbers up:
> 
> *GeForce GTX 480 : 480 SP, 700/1401/1848MHz core/shader/mem, 384-bit, 1536MB, 295W TDP, US$499
> 
> GeForce GTX 470 : 448 SP, 607/1215/1674MHz core/shader/mem, 320-bit, 1280MB, 225W TDP, US$349*


Now that makes more sense. I knew NVIDIA had issues running with 512 SPs. The cores run in blocks of 32 (16 x 32 = 512), so they would have to disable one block. But I read the problems were scattered all over the place; I guess they disabled the worst section.


----------



## DirectorC (Mar 19, 2010)

I was kind of skeptical about the jump from 448 to 512 myself.

Power consumption of these things is still insane.  Now we will get to put our way overpowered PSUs to work!


----------



## Super XP (Mar 19, 2010)

Do you believe they'll be releasing OC'ed versions of the GTX 480?
http://www.fudzilla.com/content/view/18154/34/


----------



## the54thvoid (Mar 19, 2010)

Fudzilla is to ATI as Charlie Demerjian is to NV.  Well, maybe not quite like that, but it's generally very NV-optimistic.

Besides, the article has next to zero content (unlike most of the very recent GF100 info), and it's a picture of a box with no other writing apart from 'Zotac' and 'GTX480'.  Oh, it has a crown on it.  That must mean it's made of gold or something.

Avoid (Fud)zilla.  Stick to TPU, BSN and AnandTech IMO.


----------



## Kenshai (Mar 19, 2010)

the54thvoid said:


> Fudzilla is to ATI as Charlie Demerjian is to NV.  Well, maybe not quite like that, but it's generally very NV-optimistic.
> 
> Besides, the article has next to zero content (unlike most of the very recent GF100 info), and it's a picture of a box with no other writing apart from 'Zotac' and 'GTX480'.  Oh, it has a crown on it.  That must mean it's made of gold or something.
> 
> Avoid (Fud)zilla.  Stick to TPU, BSN and AnandTech IMO.



Did you click the source link in the article that is linked from TPU?


----------



## nt300 (Mar 19, 2010)

Fudzilla gets info quite fast, mostly before anybody else. It's based on speculation, the same as SemiAccurate and The Inquirer. Nothing wrong with that, because most of the time they are right on the money, and before anybody else.


----------



## the54thvoid (Mar 20, 2010)

Kenshai said:


> Did you click the source link in the article that is linked from TPU?



oops, my bad.

But... just clicked it there.  What am I seeing?  The same GTX 480 - no clock details, so why speculate about its overclockiness?

and this...

_launch a special limited edition packaging and a limited number will be incorporated into a number of media is currently being carried out in order first to be sold._

....means jack all.  Except perhaps to imply there will be an added incentive given away with the card (such as the Razer mice).

Now, if it is indeed fact that Razer mice are being given away with the cards, this indicates Nvidia's massive investment in marketing this card.  If it were superb on its own, it wouldn't require quite so much *ahem* pomp and ceremony, whoring it out with other peripherals.

Or is it, Buy a Razer Deathadder and get a free GTX 470?


----------



## Kenshai (Mar 20, 2010)

the54thvoid said:


> oops, my bad.
> 
> But... just clicked it there.  What am I seeing?  The same GTX 480 - no clock details, so why speculate about its overclockiness?
> 
> ...



A lot of card manufacturers give away games and stuff; I understand a mouse doesn't fall under the same category, but I believe it's down to the vendor rather than Nvidia.


----------



## eidairaman1 (Mar 20, 2010)

Kenshai said:


> A lot of card manufacturers give away games and stuff; I understand a mouse doesn't fall under the same category, but I believe it's down to the vendor rather than Nvidia.



For one, most good mice cost more than a DVD. Also, including a mouse adds weight to the container or makes it bigger, which in turn means more wasted material and more money spent producing the packaging when it really isn't needed. Plus, if the mouse included in the package fails, you would have to get an RMA for it, which means more wasted time and money for several companies.


----------



## Wile E (Mar 20, 2010)

nt300 said:


> Fudzilla gets info quite fast, mostly before anybody else. It's based on speculation, the same as SemiAccurate and The Inquirer. *Nothing wrong with that, because most of the time they are right on the money, and before anybody else.*



Ummmm, no they aren't. There's a reason they are no longer allowed as a source in TPU's news section.


----------



## Kenshai (Mar 20, 2010)

eidairaman1 said:


> For one, most good mice cost more than a DVD. Also, including a mouse adds weight to the container or makes it bigger, which in turn means more wasted material and more money spent producing the packaging when it really isn't needed. Plus, if the mouse included in the package fails, you would have to get an RMA for it, which means more wasted time and money for several companies.



Wasn't arguing this point, just stating that Nvidia doesn't usually decide what incentives will go with a card.


----------

