Tuesday, March 30th 2010
XFX Abandons GeForce GTX 400 Series
XFX is getting cozier with AMD by the day, which is a sore sight for NVIDIA. Amid the launch of the GeForce GTX 400 series, XFX did what would have been unimaginable a few months ago: abandon NVIDIA's high-end GPU launch. That's right, XFX has decided against making and selling GeForce GTX 480 and GeForce GTX 470 graphics cards, saying that it favours high-end GPUs from AMD instead. This comes even though XFX appeared to have had its own product art ready. Apart from making new non-reference design SKUs for pretty much every Radeon HD 5000 series GPU, the company is working on even more premium graphics cards targeted at NVIDIA's high-end GPUs.
The rift between XFX and NVIDIA became quite apparent when XFX outright bashed NVIDIA's high-end lineup in a recent press communication about a new high-end Radeon-based graphics card it's designing. "XFX have always developed the most powerful, versatile Gaming weapons in the world - and have just stepped up to the gaming plate and launched something spectacular that may well literally blow the current NVIDIA offerings clean away," adding "GTX480 and GTX470 are upon us, but perhaps the time has come to Ferm up who really has the big Guns." The move may come as a disappointment to some potential buyers of the GTX 400 series, as XFX's popular Double Lifetime Warranty scheme will be missed. XFX, however, maintains that it may choose to work on lower-end Fermi derivatives.
Source:
HardwareCanucks
199 Comments on XFX Abandons GeForce GTX 400 Series
I'm guessing you also missed the point I was making entirely.
I'll explain it again, Fermi is the first single GPU that we have seen released that actually tops the previous generation's dual GPUs. That is a huge feat, one that hasn't been done ever before. In any other situation that alone would have made Fermi get praised as a wonderful GPU. And in the case of the HD4870x2, Fermi actually does it with less power and less heat, making it even more amazing.
I'm not ignoring the dual GPU cards, obviously they exist and provide amazing performance, that wasn't my point at all. Ignoring the complications that they add, they definitely are the top dogs. However, that again was not the point. The accomplishment of a single GPU topping the previous generation's dual-GPU cards has never been done, and Fermi doing it while using less power and producing less heat shows that it is an utterly amazing GPU. However, RV870 is even better for other reasons. If it wasn't for RV870 having the perfect balance of power usage/heat output/price and performance, Fermi would probably be praised right now instead of bashed.
And the HD5870 doesn't beat the HD4870x2 in almost everything; in fact, overall the HD5870 is about 5% behind the HD4870x2, which means the HD4870x2 beats the HD5870 in more things...:slap: That is how it is not in the same boat as Fermi. Beating the HD4870x2 in a few things but losing overall is still a loss. Fermi wins overall compared to the HD4870x2.
The thinking in each of these cases is that it's probably due to the card continuing to draw more and more power the longer it's under load. One review I found showed two 480s (the reviewer also tested SLI, but in a separate review), each of which drew 340 watts and hit 101°C after being left running 3DMark, FurMark, or Heaven for an extended time. It wasn't hours, but the guy did leave it running a good while to simulate a real gaming session playing a stressful game in a common case (I think it was an Antec 900 or something like that).
The fact is, as the TPU review shows, the card can and does pull more power than NVIDIA wants to admit, it runs hot even at idle, and it really doesn't perform that great for its specs, if you get right down to it.
1. An overclocked 5870 will be faster than a 480, even an overclocked one (they don't overclock well).
2. Even overclocked, the 5870 couldn't draw as much power or create as much heat as the 480.
3. At idle the 5k cards use VERY LITTLE POWER and run VERY COOL.
NVIDIA should have TESTED before they went to mass production. I know AMD/ATI do; after the 2900 they went back to testing before sending cards/chips to mass production, to ensure they could keep them within a reasonable power/heat threshold. Had nV done proper testing before they started mass production, they could have avoided:
1. having such a hot card that's well below the planned specs.
2. having such a high fail rate on cores (make something hard to produce and it's going to have higher fail rates).
3. being 6 months late to market trying to work around heat and production problems.
Yes, the 5k cards had problems too, BUT notice they got worked out, and yields are well above 7.1%.
nV screwed up. I would guess they know they screwed up and are working hard to get out either a refresh OR a redesigned product that won't be so damn hot.
I wonder if even a water cooler like the older Toxic cards used could keep these things' heat in check... I have a feeling it would take a 120 mm rad, or even a dual 120 mm rad, to do the job...
The 480 is not better all around than the 295, is it?
Yes, I've already gone over that: compared to the HD5870, the GTX480 isn't as good power- and heat-wise; even the GTX470 isn't. But again, that is because the HD5870 set an amazingly high bar. Really, compared to past cards, the GTX480 isn't great heat- and power-wise, or even really good, but it isn't terrible either.
And I find it funny that you talk about ATi never releasing another hot, problematic card after the HD2900 series, because I seem to remember the HD4850's issues with heat, and FurMark killing the cards... but yeah, ATi tests everything really well before they release it... Don't post BS; ATi isn't perfect like you seem to think they are.
One resolution doesn't matter, overall is what matters, next you'll be telling us we should pick whichever benchmark ATi did best in, and use that as the final word on which card is better...
Next you'll be saying the GTX480 is best because green is a pretty color.
All around means: considering all aspects. Looking at one resolution isn't considering all aspects, now is it?
And you are the one trying to pick and choose stats here; when I use overall performance, I'M USING ALL THE STATS! Guess what picking one resolution is... I'll tell you... it is picking and choosing your stats.
But, please, continue with the defense of Nvidia. It's cute, now that the tides have turned. :laugh:
Come on, Nvidia can't always win! :)
This time, they fucked up, not ATI... they did the same thing back in the old HD2900 days, for exactly the same reason: trying to keep up with the fastest cards available (which were the G80 series at that time ;)) without thinking first.
And I'm willing to bet that the higher-resolution issues will be worked out with drivers; there are pretty obviously still driver issues involved with the initial driver release.