Thursday, October 21st 2010

GeForce GTX 580 Expected to be 20% Faster than GeForce GTX 480
NVIDIA's next enthusiast-grade graphics processor, the GeForce GTX 580, based on the new GF110 silicon, is poised for at least a paper launch by the end of November or early December 2010. Sources in the video card industry told DigiTimes that the GTX 580 is expected to be 20% faster than the existing GeForce GTX 480. The new GPU is built on the existing 40 nm process; NVIDIA's 28 nm GPUs based on the Kepler architecture are expected to take shape only towards the end of 2011. Later this week, AMD is launching the Radeon HD 6800 series performance graphics cards, and will market-launch its next high-end GPU, codenamed "Cayman", in November.
Source: DigiTimes
98 Comments on GeForce GTX 580 Expected to be 20% Faster than GeForce GTX 480
...
fail?
Bye bye NVidia...
If you want to know more exactly how enabling SPs/ROPs truly affects power consumption on real cards, then look at the GTX 470 vs. the GTX 465, because they both have the same clocks. And how much is that then again?
GTX465 = 199 W
GTX470 = 232 W
Let's experiment with that:
GTX465 = 352 SP ; GTX470 = 448 SP
GTX465 = 4 ROP partitions ; GTX470 = 5 ROP partitions
Fictional GTX470 (scaled by SPs) = (199 W x 448) / 352 ≈ 253 W hmm
Second fictional GTX470 (scaled by ROP partitions) = (199 W x 5) / 4 ≈ 249 W hmm
Wow, so the actual power consumption of the real GTX 470 is lower than my math suggests. I knew that from the beginning; my assumptions above are a worst-case scenario.
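For anyone who wants to play with that math, here is a quick Python sketch of the worst-case linear scaling used above, plugging in the board powers and unit counts quoted in this post. It is only a back-of-the-envelope model: real cards scale better because a chunk of board power (memory, VRM, fan) does not grow with the shader count.

```python
# Worst-case estimate: scale the GTX 465's board power linearly by unit
# count, exactly as in the post above. Real cards do better than this,
# because part of the board power does not scale with enabled units.

GTX465_POWER_W = 199          # quoted board power
GTX470_POWER_W = 232          # quoted board power, for comparison

GTX465_SPS, GTX470_SPS = 352, 448
GTX465_ROP_PARTS, GTX470_ROP_PARTS = 4, 5

def linear_scale(base_power_w, base_units, target_units):
    """Worst-case assumption: power grows in direct proportion to unit count."""
    return base_power_w * target_units / base_units

by_sps = linear_scale(GTX465_POWER_W, GTX465_SPS, GTX470_SPS)                # ~253 W
by_rops = linear_scale(GTX465_POWER_W, GTX465_ROP_PARTS, GTX470_ROP_PARTS)   # ~249 W

print(f"Fictional GTX 470, scaled by SPs:            {by_sps:.0f} W")
print(f"Fictional GTX 470, scaled by ROP partitions: {by_rops:.0f} W")
print(f"Actual GTX 470:                              {GTX470_POWER_W} W (below both)")
```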
GF100 consumed a lot, everybody knows that. Everybody should also know by now that it was a problem with the fabric (the interconnect layer between the different units within a chip) and that the problem has already been fixed. It was mostly fixed for GF104, as can be seen by its power consumption, and is probably even better for future releases.
And for the people who are obsessing over the potentially high heat and power consumption: just wait and see, it's not worth getting all worked up about it now. Maybe NVIDIA has new strategies to overcome this. Better reference coolers, perhaps.
Hopefully they can up the performance when it actually comes out; it's still WAY too early to tell anything.
It's really sad tbh, and sadder every day. Every day that passes it gets more difficult to talk about tech.
If only the GTX 460 had never been released, then maybe, maybe, the comments wouldn't be so baseless. But the GTX 460 was released, and it's faster than the GTX 465 while consuming 50 W less... Is it possible for Nvidia to repeat that achievement with GF110, or maybe even... take a seat, AMD fanboys... further improve the efficiency a little bit?
- No, because I'm so biased I cannot even see what's in front of my eyes.
- No, because AMD has apparently improved power efficiency further with NI, but there's no way on Earth or otherwise for Nvidia to catch up. Impossible! I mean, Nvidia never had better power efficiency than AMD/ATI. Never, never, never... hmm, well... okay, "only" before Evergreen/Fermi. :rolleyes:
I'm no fanboy for either side, as they both suck in their own special ways and are both awesome in their own ways as well, and although stupidity annoys me, I can ignore the stupidity in the fanboy comments for both AMD and Nvidia... how about you two join me in laughing at the stupidity? :roll::laugh::roll:
Back on topic: I'm really excited to see what Nvidia will be bringing out. I just hope it's very close to the 6970's release date, as I'm sick of waiting for my GPU upgrade, and I'm starting to think I want something more powerful than 1 GB 460 SLI, so hopefully either AMD or Nvidia can give me something that fits the bill before the end of the year.
But it should at least mean both companies have cards with similar performance, somewhat unlike now. Wow, that review shows the HD 5450 as the fastest card O.O
Did you change that yourself?
I have a feeling this will just be GF104 plus 50% of it again, making it 576 SPs, back to 48 ROPs, and still 384-bit GDDR5. They just need to improve the memory controller and make sure they hit a good power consumption target.
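Just to put that speculation in numbers, here is a tiny sketch that scales the fully enabled GF104 die (384 SPs, 32 ROPs, 256-bit bus) by 1.5x. The result is only what the guess above implies, not a confirmed GF110 spec.

```python
# Running the "GF104 + 50%" guess through the numbers. Base figures are the
# fully enabled GF104 die; the output is speculation, not a confirmed spec.

FULL_GF104 = {"sps": 384, "rops": 32, "bus_width_bits": 256}
SCALE = 1.5

speculative_gf110 = {name: int(value * SCALE) for name, value in FULL_GF104.items()}
print(speculative_gf110)
# {'sps': 576, 'rops': 48, 'bus_width_bits': 384}
```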
They need a killer product. 20% over the GTX 480 doesn't cut it... they need 33% or so... then they'd have a real chance.
Mind you, the current rumour about price drops is definitely gonna work in their favor.
No one's denying Fermi's issues. But as you even said, both sides have their issues.
Both sides kinda have issues, and so really, I blame TSMC.
Bear Jesus, while I can afford to just get rid of stuff and start over again, to me that'd make all the time I've spent trying to get things working a waste. As far as I am concerned, either AMD fixes the issues I have, or they fail. I can't truly say that unless I see this out to the end.
But because my usage (3 monitors) dictates I need a certain level of performance, I'm just plain out of options at this point. The 69xx series is my last hope, or maybe this GTX 580 can pull up nV's socks and I'll switch over.
I am not "sticking it out" because I'm a fanboy... I need 60+ FPS at 5760x1080. A little bit of AA would be nice too. The first company that can do this gets my cash.
For anyone else... they don't need a GTX 580. Seriously... a 480 is more than enough power for anyone with a single monitor.
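A quick back-of-the-envelope comparison of the pixel counts involved, assuming three 1920x1080 panels in surround and that the load scales roughly with pixels, shows why a card that's plenty for one monitor can fall short across three:

```python
# How much more work triple-monitor surround is than a single 1080p screen,
# assuming three 1920x1080 panels and that the load scales with pixel count.

single_1080p = 1920 * 1080        # ~2.07 million pixels
surround = 5760 * 1080            # three 1920x1080 panels side by side, ~6.22 million

print(f"Surround pushes {surround / single_1080p:.1f}x the pixels of one monitor")
# -> 3.0x, which is why a card that is "more than enough" on one screen
#    can fall well short of 60 FPS across three.
```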
Bottom line: 20% over the GTX 480 is not enough. Considering Cayman XT will be at the very least 30-40% faster than Cypress XT, that would make it neck and neck (equal performance) with the 'GTX 580', and as everyone knows, because AMD's chips are cheaper to produce, AMD can lower prices more and still make some profit.
Fact is that a smaller chip costs less than a big chip when everything else is equal.
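Here is a rough sketch of the dies-per-wafer math behind that. The wafer price and die areas below are placeholder assumptions for illustration only (not actual TSMC pricing or exact Cypress/GF100 sizes), and yield is ignored entirely:

```python
import math

# Rough illustration of why a smaller die is cheaper per chip, all else equal.
# Wafer price and die areas are placeholders for this sketch, not real pricing.

WAFER_DIAMETER_MM = 300
WAFER_PRICE_USD = 5000            # assumed 40 nm wafer price, illustration only

def dies_per_wafer(die_area_mm2, diameter_mm=WAFER_DIAMETER_MM):
    """Common approximation: wafer area / die area, minus edge losses."""
    radius = diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

for label, area in [("smaller die, ~330 mm^2", 330), ("bigger die, ~530 mm^2", 530)]:
    dies = dies_per_wafer(area)
    print(f"{label}: {dies} dies per wafer, ~${WAFER_PRICE_USD / dies:.0f} per die before yield")
```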
But that's not the end of it. There are far too many things we don't know that have just as much of an effect on profitability:
- How much does AMD pay per wafer and how much does Nvidia, considering Nvidia buys twice the amount of wafers and is not going to flee to GloFo as soon as GloFo is ready?
- Chip price is only a fraction of what a card costs. How much does it cost Nvidia to make the cards and how much AMD?
- AMD's ability to make smaller chips is based on much more money put into that R&D department than Nvidia puts in. How much?
- How much does it cost Nvidia and AMD to operate as a company?