# XFX Abandons GeForce GTX 400 Series



## btarunr (Mar 30, 2010)

XFX is getting cozier with AMD by the day, which is an eyesore for NVIDIA. Amidst the launch of the GeForce GTX 400 series, XFX did what would have been unimaginable a few months ago: abandon NVIDIA's high-end GPU launch. That's right, XFX has decided against making and selling GeForce GTX 480 and GeForce GTX 470 graphics cards, saying that it favours high-end GPUs from AMD instead. This comes even as XFX seemed to have been ready with its own product art. Apart from making new non-reference design SKUs for pretty much every Radeon HD 5000 series GPU, the company is working on even more premium graphics cards targeted at NVIDIA's high-end GPUs. 

The rift between XFX and NVIDIA became quite apparent when XFX outright bashed NVIDIA's high-end lineup in a recent press communication about a new high-end Radeon-based graphics card it's designing. "XFX have always developed the most powerful, versatile Gaming weapons in the world - and have just stepped up to the gaming plate and launched something spectacular that may well literally blow the current NVIDIA offerings clean away," adding "GTX480 and GTX470 are upon us, but perhaps the time has come to Ferm up who really has the big Guns." The move may disappoint some potential buyers of the GTX 400 series, as XFX's popular Double Lifetime Warranty scheme would be missed. XFX, however, maintains that it may choose to work on lower-end Fermi derivatives.





*View at TechPowerUp Main Site*


----------



## Paintface (Mar 30, 2010)

I guess XFX wants to stick with producing video cards instead of fancy-looking room heaters.


----------



## aCid888* (Mar 30, 2010)

It's good it's on the front page, but this was posted yesterday and has been pretty active. 

http://forums.techpowerup.com/showthread.php?t=118801


----------



## Fourstaff (Mar 30, 2010)

How many will follow suit?


----------



## afw (Mar 30, 2010)

Fourstaff said:


> How many will follow suit?



This article says two, including XFX ---> http://www.hardwarecanucks.com/news/video/xfx-jumps-gtx-480-gtx-470-ship/

I think it's because of warranty issues .... :shadedshu


----------



## DigitalUK (Mar 30, 2010)

Maybe XFX knows something we don't, or maybe not; BFG and EVGA have no problems bringing the heat.


----------



## gumpty (Mar 30, 2010)

Wouldn't it be interesting to know the actual reasons behind this? Is it a calculated business move which ties in with their strategic business plan? Or is it something subtler, like seeing flaws in the Fermi chip itself (maybe heat?) and not wanting to get involved with a chip that was likely (pure speculation) to cause them lots of RMA trouble in the future?


----------



## TheOnlyHero (Mar 30, 2010)

I don't care about XFX; it's their choice. There are a lot of companies like XFX (Asus, Inno3D, Galaxy...) who make Nvidia video cards, so it isn't a big shock for me. XFX goes against mother nature.


----------



## Fourstaff (Mar 30, 2010)

Yeah, I think the warranty is going to be a problem for them, since they offer generous warranty service.


----------



## gumpty (Mar 30, 2010)

TheOnlyHero said:


> I don't care about XFX; it's their choice. There are a lot of companies like XFX (Asus, Inno3D, Galaxy...) who make Nvidia video cards, so it isn't a big shock for me. XFX goes against mother nature.



I think the 'shocking' thing about this is that ... iirc ... XFX were an Nvidia-only company as little as 12 months ago. To wholly switch teams so quickly, within one new generation of GPUs, is the surprising part. For me at least.


----------



## DigitalUK (Mar 30, 2010)

Warranty is everything, especially when you are paying that sort of money for a card. A year later you could have a dead card with nothing you can do about it.

A card that can run at 100°C won't be lasting years. If most of us saw those temps we would be crying, rebuilding or RMA'ing.


----------



## Velvet Wafer (Mar 30, 2010)

Someone with big influence at XFX knew something Nvidia doesn't want regular customers to know, I believe.
Anyway, EVGA is Nvidia-exclusive to this day, and that really pisses me off.


----------



## qwerty_lesh (Mar 30, 2010)

gumpty said:


> I think the 'shocking' thing about this is that ... iirc ... XFX were an Nvidia-only company as little as 12 months ago. To wholly switch teams so quickly, within one new generation of GPUs, is the surprising part. For me at least.



Yeah, that's extremely common for consumer buying trends (following whichever brand offers the best value/performance at the time), but not for manufacturers or manufacturing partners.


----------



## gumpty (Mar 30, 2010)

Fourstaff said:


> Yeah, I think the warranty is going to be a problem for them, since they offer generous warranty service.



Yeah, which would be worse PR for Nvidia's Fermi brand: XFX not making the cards at all, or XFX only offering, say, a 1- or 2-year warranty when they offer a lifetime warranty on everything else? It would make everyone wonder about the safety of buying such a GPU.

Bearing that in mind, I wonder if Nvidia politely asked XFX not to make any at all once it was clear they would only do a limited warranty, knowing that pulling out totally, and just keeping mum about the reasons, would be the better choice for Fermi's reputation.

This is all wild speculation though.


----------



## LiveOrDie (Mar 30, 2010)

They'll end up shipping both in time, so who cares.


----------



## Yukikaze (Mar 30, 2010)

That's a sort of a: , but the colors on the smileys need to be reversed...


----------



## ArkanHell (Mar 30, 2010)

XFX is just afraid of the low sales it could have if it puts out the 480, and you know why.


----------



## Hayder_Master (Mar 30, 2010)

I think it's Nvidia who loses by losing XFX. Maybe people now say the best brand is EVGA, and that's right, but Nvidia became known for durable and powerful cards because of XFX.


And I think it's a problem with Nvidia's new GPUs: high heat means the GPU draws more power, and that points to the new Nvidia GPUs having bad quality.


----------



## DigitalUK (Mar 30, 2010)

XFX has always released Nvidia cards in all flavours, so there has to be a very good reason to drop NV now like a bad smell.
At the end of the day we need Nvidia to come through or we will be paying through the nose for cards; I've noticed over the last few days since the release that the ATI 58xx cards seem to be going up in price, not down.


----------



## gumpty (Mar 30, 2010)

DigitalUK said:


> XFX has always released Nvidia cards in all flavours, so there has to be a very good reason to drop NV now like a bad smell.
> At the end of the day *we need Nvidia to come through or we will be paying through the nose for cards*; I've noticed over the last few days since the release that the ATI 58xx cards seem to be going up in price, not down.



QFT


----------



## OnBoard (Mar 30, 2010)

Hope no one buys them and they are soon 50% off. I wouldn't mind playing with one if it was cheap 

Not that it's going to happen if there's already a shortage of them :/
http://www.fudzilla.com/content/view/18277/1/


----------



## crow1001 (Mar 30, 2010)

Guess those 5000 cards XFX are selling are doing really well for them; they have basically given the finger to Nvidia, which looks really bad. ATI fanbois are gonna love this.


----------



## gumpty (Mar 30, 2010)

OnBoard said:


> Not that it's going to happen if there's already a shortage of them :/
> http://www.fudzilla.com/content/view/18277/1/



This could also be a likely reason why XFX pulled out: they couldn't get Nvidia to guarantee they'd be supplied with enough chips to make the whole manufacturing/branding/marketing effort worthwhile, so they just said feck it, don't bother.


----------



## HalfAHertz (Mar 30, 2010)

gumpty said:


> This could also be a likely reason why XFX pulled out: they couldn't get Nvidia to guarantee they'd be supplied with enough chips to make the whole manufacturing/branding/marketing effort worthwhile, so they just said feck it, don't bother.



Sounds like the most logical reason


----------



## Semi-Lobster (Mar 30, 2010)

Fudzilla is saying that it was Nvidia that broke up with XFX, and not the other way around

http://www.fudzilla.com/content/view/18278/1/

This is pretty much the exact opposite of what every other site is saying, and they seem to be the only ones claiming it


----------



## DigitalUK (Mar 30, 2010)

And the plot thickens. If Nvidia is really behaving that way (very unprofessional), could we see more vendors dropping Nvidia?
It would be nice for big companies these days to just drop the BS and say it like it is. If it's a shortage of supply or internal differences, say so; I could respect that.


----------



## crow1001 (Mar 30, 2010)

FUD are fubar.

XFX know the score: they realize a loss is in the making selling the 400 cards, so they'll stick with ATI for the high-performance GPU market and enjoy life.


----------



## jagd (Mar 30, 2010)

Funny, I don't care about XFX (Asus is king for warranty in my country), but they are popular in the US because of their warranty and because you don't lose your warranty when you install an aftermarket cooler.

Fudzilla is Nvidia's green knight; take what FUD says with a truckload of salt. Don't forget Nvidia forcing companies to buy old crap (210/220/240 etc.) if they want to buy GTX 480/470 cards.


TheOnlyHero said:


> I don't care about XFX; it's their choice. There are a lot of companies like XFX (Asus, Inno3D, Galaxy...) who make Nvidia video cards, so it isn't a big shock for me. XFX goes against mother nature.


----------



## rpsgc (Mar 30, 2010)

Semi-Lobster said:


> Fudzilla is saying that it was Nvidia that broke up with XFX, and not the other way around



Yeah... I don't think NVIDIA can afford to do that right now.


----------



## mtosev (Mar 30, 2010)

Bad news for nVidia. They are slowly slipping away; if nVidia continues on this path, they will end up as 3dfx did.


----------



## DigitalUK (Mar 30, 2010)

mtosev said:


> Bad news for nVidia. They are slowly slipping away; if nVidia continues on this path, they will end up as 3dfx did.



Amen to that. I used to love Voodoo, and Glide back then was amazing.


----------



## Velvet Wafer (Mar 30, 2010)

The question is, if this happens, who will buy them?


----------



## Fourstaff (Mar 30, 2010)

Velvet Wafer said:


> The question is, if this happens, who will buy them?



Nah, Nvidia will not sink. It has a lot of other things, like Tegra, and CUDA seems to be a hit with scientists who cannot afford a proper supercomputer.


----------



## mtosev (Mar 30, 2010)

Velvet Wafer said:


> The question is, if this happens, who will buy them?


Intel, who else is there?


----------



## Velvet Wafer (Mar 30, 2010)

There are masses of hardware producers... maybe Samsung suddenly has the itch to sell VGAs? They own so much more; it would not be too difficult for them to pay.

Regarding CUDA etc.: surely these technologies have advantages... if you have the proper hardware to execute them.


----------



## filip007 (Mar 30, 2010)

You can't sell two good competing products under one roof; that's the problem. And XFX is selling only the best: 4870X2, 5850, 5870, 5970...


----------



## WhiteLotus (Mar 30, 2010)

This type of thing pleases me greatly.

One: Nvidia will have to work damn hard to get them back.
Two: a nice little f' you to Nvidia to bring them back down to earth.


----------



## mtosev (Mar 30, 2010)

Velvet Wafer said:


> There are masses of hardware producers... maybe Samsung suddenly has the itch to sell VGAs? They own so much more; it would not be too difficult for them to pay.
> 
> Regarding CUDA etc.: surely these technologies have advantages... if you have the proper hardware to execute them.



Intel wanted to buy ATi but didn't have a chance, so the only logical buyer is Intel. Plus, Intel has a LOT of money.


----------



## DigitalUK (Mar 30, 2010)

I would buy a Samsung graphics card; the high-gloss curvy finish their TVs have would be nice.


----------



## Velvet Wafer (Mar 30, 2010)

mtosev said:


> Intel wanted to buy ATi but didn't have a chance, so the only logical buyer is Intel. Plus, Intel has a LOT of money.


Intel wants technology. New technology! Nvidia has had its problems with that for a few years now.
Surely it would be possible, but I somehow don't believe they will buy Nvidia, even if Nvidia really fails.
I'm not sure on that though; it's just a gut feeling.



DigitalUK said:


> I would buy a Samsung graphics card; the high-gloss curvy finish their TVs have would be nice.


If they produced VGAs the way they produce marvellous flat screens, ATI/AMD would have a serious business opponent, one with even more gigantic resources than Intel.


----------



## mdm-adph (Mar 30, 2010)

Ouch -- just a short time ago XFX made nothing _but_ Nvidia cards, right?

The GTX 400 series is looking like the FX 5000 series would have, had that series actually had the performance to back up its heat and power requirements.  

The GTX 400 series isn't a flop, but it's very, very close.


----------



## qwerty_lesh (Mar 30, 2010)

Regarding the talk about Intel buying out nVidia: I doubt such a thing would ever happen. Prior to AMD buying out ATi, nVidia and Intel weren't on good terms, and they still aren't. 

In its most simple form, AMD & ATi are the underdog friends (sorry if it sounds fanboyish), while Intel and nVidia are the big (and oddly more seemingly evil) rivals on the other side of the make-believe fence. That made sense, right?


----------



## DarthCyclonis (Mar 30, 2010)

I could see Intel buying Nvidia. Let's be honest: everything Nvidia has is leaps and bounds ahead of Intel and its GPU development. Nvidia does have good engineers; it's the management that is killing the company with its anti-consumer policies.

Also, Nvidia's short-sighted management was banking on GPGPU computing being bigger than it is when they started the Fermi project. GPGPU could very well be big for them, just not for a couple of GPU generations from now. But they need cash today.


----------



## newtekie1 (Mar 30, 2010)

It isn't surprising; a lot of these companies owe their fame solely to nVidia's run of good GPUs that outclassed ATi's. They rode the nVidia dominance wave, and it isn't surprising that some of them are switching now that it is ATi's turn to dominate.



gumpty said:


> I think the 'shocking' thing about this is that ... iirc ... XFX were an Nvidia-only company as little as 12 months ago. To wholly switch teams so quickly, within one new generation of GPUs, is the surprising part. For me at least.



XFX started selling ATi cards with the introduction of the HD4800 series, so it's been about 18 months, and two generations.


----------



## FreedomEclipse (Mar 30, 2010)

DigitalUK said:


> there has to be a very good reason to drop NV now like a bad smell.



They've seen the pitiful 470/480 benchmarks and decided it was an event worthy of many epic facepalms and facedesks, so they said this shit won't fly!

The 480 is the top of the range, but compared to the 5970 it hasn't got shit on it; the 5970 just thrashes the 480 in 99% of benchmarks.

Any noob looking to buy a new GPU who does a little research would know to go with the 5970; compared to the 480 it's the more efficient GPU overall.

A very bold move by XFX; they probably knew they might have trouble selling the cards, since Nvidia's top end isn't really top end anymore.


----------



## arnoo1 (Mar 30, 2010)

There goes one of the best.
Damn, I don't like that.
I've always had XFX, and I never had one crappy card from them.

It will be EVGA from now on 

Damn, those Fermi cards are expensive: 340 euros. I will not buy one unless they're around 250 euros.

The GTX 275 will stay for now.


----------



## Delta6326 (Mar 30, 2010)

XFX - We are now Nvidia GTX 400 clean; we finally got rid of the Nvidia drug thanks to ATI's "Quit Nvidia" helpers club.

ATI - We are so pleased that we were able to help XFX, and we hope to help other companies that are addicted to Nvidia. In the meantime, try out our sweet gaming cards! That's the first step.

 I'm so happy!


----------



## pantherx12 (Mar 30, 2010)

Whilst funny, this is also not good news. Nvidia NEEDS partners in order to compete with ATI; no competition = high prices.


----------



## leonard_222003 (Mar 30, 2010)

I can believe Fudzilla's scenario that Nvidia kicked XFX because they sided with ATI. If you look at Nvidia's history of dealing with the world, they have been not exactly diplomatic: at war with Intel and provoking them constantly; at war with Rambus over unpaid licenses; at war with AMD/ATI, though that's understandable because it's the competition; and at war with integrators, blaming them for not cutting production costs when AMD was killing them with low-cost cards, and forcing integrators to take the loss when things didn't work out (the time when the price war started between AMD and Nvidia, and Nvidia had the expensive-to-produce G92 and GT200 cards).
Actually, that was when brand-name integrators got tired of Nvidia and started producing ATI cards. The other integrators had their turn to speak now, with Fermi launching, and probably said: WTF, we're loyal and others aren't, so screw them and give us priority; we suffered with you, XFX didn't, so we get Fermi and they don't, PERIOD. What can Nvidia say to that? It's Nvidia's fault, because they didn't take the losses when they had to; they treat integrators very badly.


----------



## Deleted member 24505 (Mar 30, 2010)

Who cares if EVGA make the best Nvidia cards; who would actually want to pay for one of the heaters/GTX 480/470 cards anyway?


----------



## phanbuey (Mar 30, 2010)

Yeah, but Nvidia does have plenty of "partners"... The fewer partners Nvidia has, the better it is for the remaining partners, as there is less competition in the space. There will be a market for Nvidia for a while.

This is just one product out of generations of many successful products, with many more to come. Bashing it and potentially ruining the relationship is not smart. I'm inclined to believe FUD more than this; XFX by itself wouldn't cut off a relationship that has made them SO much money over the last few years.

And the G92 was cheap as **** to produce. Board manufacturers were making a killing off those cards.


----------



## Edito (Mar 30, 2010)

Bad for Nvidia, but I really don't care about XFX; they still have EVGA, ASUS, Zotac, BFG.


----------



## Delta6326 (Mar 30, 2010)

They probably left because they didn't want to be one of those guys standing at the end It's funny


----------



## KainXS (Mar 30, 2010)

Sadly, this will make other Nvidia/AMD partners re-evaluate whether Fermi is worth the backlash. Seeing as Nvidia just lost one of their best partners here, if they took another hit from, I don't know, ASUS or MSI, that would pretty much be a red flag not to buy the GTX 4xx.

But they will be selling the midrange 4xx cards; they just don't want to take the risk with the top two.

LOL
http://www.youtube.com/watch?v=If0Bkfnifi4


----------



## Edito (Mar 30, 2010)

New cards must bring evolution in the tech department, not just raw power like most ATI cards. I'm buying Fermi because of CUDA, PhysX, C++, OpenCL and driver support; that matters to me, and I believe to many other gamers too. The price battle is good, but let's not forget Nvidia's cards are steps ahead of AMD in terms of tech...


----------



## jagd (Mar 30, 2010)

XFX said that the decision not to carry this series of GF100 graphics cards was their own. 
http://www.legitreviews.com/news/7707/


leonard_222003 said:


> I can believe Fudzilla's scenario that Nvidia kicked XFX because they sided with ATI.


----------



## newtekie1 (Mar 30, 2010)

jagd said:


> XFX said that the decision not to carry this series of GF100 graphics cards was their own.
> http://www.legitreviews.com/news/7707/



Have you ever seen the two sides of a couple that broke up? Both sides say they broke up with the other person.

It doesn't really matter who did it; the important part is that it was done.



tigger said:


> Who cares if EVGA make the best Nvidia cards; who would actually want to pay for one of the heaters/GTX 480/470 cards anyway?



Really, if the GTX470 actually does perform the same as the HD5870 like the reviews seem to suggest (I'm waiting for a W1z review, though), and it is $50 less than the HD5870 as the MSRP suggests, I'd buy one. Performance for the buck is all I really care about; the 30 W of extra power consumption doesn't really bother me. If performance per watt were a concern, I would have never bought an HD4890.


----------



## KainXS (Mar 30, 2010)

IMO, the 470 is a good card... AMD seems to be increasing prices.


----------



## gumpty (Mar 30, 2010)

newtekie1 said:


> XFX started selling ATi cards with the introduction of the HD4800 series, so it's been about 18 months, and two generations.



Yeah, but only one generation of Nvidia GPUs (GT200). It's splitting hairs a bit though; it doesn't seem like long ago that XFX was Nvidia-only.



newtekie1 said:


> Really, if the GTX470 actually does perform the same as the HD5870 like the reviews seem to suggest (I'm waiting for a W1z review, though), and it is $50 less than the HD5870 as the MSRP suggests, I'd buy one. Performance for the buck is all I really care about; the 30 W of extra power consumption doesn't really bother me. If performance per watt were a concern, I would have never bought an HD4890.



If it's performance per dollar, then the 5850 wins (I think) out of the four competitors. But then you can keep playing that bang-for-buck game right down the product line. In reality we all set a minimum-performance threshold and play bang-for-buck above that.

Wattage is something I consider, but it's unlikely to be a deal-breaker. Unless the difference is huge, as it is with the 5870 vs the 480, in which case it gains more weight in the decision-making process (as the electricity cost over the lifetime of the product may become significant). Noise is a factor I also consider important.


----------



## leonard_222003 (Mar 30, 2010)

Fermi is not to be written off so quickly; who knows what it can do for applications. I would gladly pay 500 euros for a card that can speed up my video editing software considerably. Look at this:
http://blogs.adobe.com/genesisproject/2009/11/technology_sneek_peek_adobe_me.html , holy sh.it this is fast; you can't have this with any CPU on the market. I want this.
Also, that is with a Quadro FX 4800, a GT200 part with 192 shaders; imagine the GTX480 in Adobe Premiere CS5.
Other than speeding up applications I don't care for Nvidia much; in games I prefer the cheaper ATI versions.


----------



## bpgt64 (Mar 30, 2010)

I think this correlates to XFX's value proposition to consumers. The warranty they offer on products is somewhat unique compared to other vendors, and I think they're saving themselves cost, and future Fermi returns, down the line.


----------



## kid41212003 (Mar 30, 2010)

I believe you need to get the workstation card, which is over $1k or something, lol.

They have different BIOSes and drivers. Well, I guess you can change them, but over $1k USD is their official price.


----------



## mechtech (Mar 30, 2010)

wow, just wow.

You'd think they could still make a profit off the Fermi line, so why aren't they?!


----------



## leonard_222003 (Mar 30, 2010)

kid41212003 said:


> I believe you need to get the workstation card, which is over 1k or something, lol.
> 
> They have different bios, and drivers, well I guess you can change them, but over 1k usd is their official price.



Not really:
http://blogs.adobe.com/genesisproject/2009/11/technology_sneek_peek_adobe_me.html


> The list of approved GPU cards will be limited initially to ensure that we have a consistent experience for our customers.   Obviously this begs the question of which cards are you going to support?  I think the answer here could be a moving target but two cards that I've been told will be supported are the GeForce GTX 285, the Quadro FX 4800, 5800 and the Quadro CX.  I'm currently doing my testing with the Quadro 4800.  The GeForce GTX 285 card should be one to really look at as it's street price is only about $300.00 and provides a real value to users that are looking to get the maximum bang for the buck.


Maybe Quadro cards have that price for people who work with 3D; I don't need that.


----------



## AsRock (Mar 30, 2010)

ArkanHell said:


> XFX is just afraid of the low sales it could have if it puts out the 480, and you know why.



More like they don't want to deal with the returns on the 400 series. I think there are going to be high returns, and they just don't want to deal with it due to the high costs.

I will not buy a 4xx series card due to the heat; it's asking for issues for sure. If anything, this makes me like XFX even more.


----------



## kaosII (Mar 30, 2010)

Something is going on at Nvidia, and it's no secret.
I called my broker at Smith Barney yesterday for some advice on my continuously shrinking portfolio. 
I was advised against the short sell, but was also told that Nvidia's chairmen have already sold large percentages of their own shares. 
Hsun is one of three names he mentioned, having sold more than 70,000 shares of late.
I inquired about purchasing more shares for myself since the stock is down so low.

He replied: "As to the future of Nvidia I can't say, I'm only a stock broker. But when the top brass starts jumping ship, it is wise to sit back and see if the ship is actually sinking or not. 
These are often calculated maneuvers, or exactly what they seem." 

This is not the first time Nvidia has been late for the dance. I personally do not want to believe they are in that deep; first-quarter reports will be out soon.


----------



## xtremesv (Mar 30, 2010)

Fourstaff said:


> Nah, Nvidia will not sink. It has a lot of other things, like Tegra, and CUDA seems to be a hit with scientists who cannot afford a proper supercomputer.



I'm a gamer not a quantum physicist. If someone wants to do some heavy folding with a GTX 480, please start thinking about the real cost-benefit between the folding itself and electricity consumption (more global warming besides your room warming).


----------



## Cleorina (Mar 30, 2010)

xtremesv said:


> I'm a gamer not a quantum physicist. If someone wants to do some heavy folding with a GTX 480, please start thinking about the real cost-benefit between the folding itself and electricity consumption (more global warming besides your room warming).



U right dude....


----------



## kaosII (Mar 30, 2010)

xtremesv said:


> I'm a gamer not a quantum physicist. If someone wants to do some heavy folding with a GTX 480, please start thinking about the real cost-benefit between the folding itself and electricity consumption (more global warming besides your room warming).



Did you really just print that on this forum?

Really?????


----------



## pantherx12 (Mar 30, 2010)

kaosII said:


> Did you really just print that on this forum?
> 
> Really?????




It's true though.


Folding doesn't necessarily have a benefit, for starters; secondly, people see it as being "free" without factoring in the bills etc.

The number of users I see on this forum with their own farms is crazy.

Surely it would make more sense to save the money you would be spending on folding and send it directly to the institutions running these sorts of folding programs.

They can buy efficient server farms with no latency etc.

Or hell, send the money directly to cancer research facilities.



Thought I'd show the other side of the coin XD




Off topic though, so reply to me via PM or make a new thread for discussing folding in this way.

I'll see it, trust me


----------



## newconroer (Mar 30, 2010)

Paintface said:


> I guess XFX wants to stick with producing video cards instead of fancy-looking room heaters.



Which is ironic, since a good majority of these vendors make their own unnecessary modifications to reference cards and then stick a premium price on them, for something you could have done yourself.

F*** 'em is what I say; why should I pay more money for nothing?
And why do we need the middleman? For a "step-up program"? That's just another carnival scheme that's not worth your time.

These vendors don't make the cards or the technology; ATi and Nvidia do, and those are the products I'm buying.


----------



## alexsubri (Mar 30, 2010)

Thank God I bought my XFX XXX 5850 in time! I heard it's now sold out again on Newegg. I had an XFX 7950 GT nVidia card in my old rig; I'm glad I'm running CrossFire with this bad boy! Another blow to nVidia fans.

BTW - Here is my spoiler; got this back in early February from Newegg when I was building my system!


----------



## newtekie1 (Mar 30, 2010)

gumpty said:


> Yeah, but only one generation of Nvidia GPU (GT200). It's splitting hairs a bit though - it doesn't seem like long ago that XFX was Nvidia-only.
> 
> 
> 
> ...



I agree, I was just talking about that performance level.  Usually the bang for the buck improves as performance decreases, and generally I don't pay more than $300 for a graphics card; this generation won't be an exception.  I'm also not an early adopter anymore; I'll likely stick with what I have for a good long while until prices are reasonable, likely near the end of the generation.

I'm personally very disappointed with the GTX480, but the GTX470 seems to be turning into a decent card.  Yes, power consumption is still higher than the HD5870's, but performance is right about the same, and the price is shaping up to be the same also.  The GTX480 is turning into another 8800GTX/Ultra: insanely priced, insanely hot, but the single-GPU performance leader, while the real winner is the next step down (8800GTS/GTX470).

You know, up until a few months ago, I wouldn't have considered noise as a buying factor.  However, putting the HD4890 in my main rig really changed my mind!  It is so much more annoying than most of the previous cards I've had in this computer; it's really the first card that actually bothers me during gaming, and with the GTX480 being even louder, screw that.


----------



## gumpty (Mar 30, 2010)

newtekie1 said:


> You know, up until a few months ago, I wouldn't have considered noise as a buying factor.  However, putting the HD4890 in my main rig really changed my mind!  It is so much more annoying than most of the previous cards I've had in this computer; it's really the first card that actually bothers me during gaming, and with the GTX480 being even louder, screw that.



My computer is in our bedroom in our flat, so noise has always been an issue. My old 4870x2 was noisy as hell too, but that problem got fixed when it broke three times and I decided to 'trade down' to the 285.


----------



## EastCoasthandle (Mar 30, 2010)

I am really starting to wonder whether the 400 series will be in limited supply now.


----------



## vnl7 (Mar 30, 2010)

I'd prefer to use three 5770s over ONE hotter and louder GTX480.


----------



## theubersmurf (Mar 30, 2010)

So much speculation... I honestly don't think nVidia is going down in flames here. For one reason or another XFX decided not to produce the GF100 cards; could be yields, could be the relatively poor performance and heat, whatever. I doubt Fud's claim that Nvidia kicked them to the curb; that seems ludicrous to me. They've been a successful partner in the past, and it's not in Nvidia's best interest to get rid of a successful board partner.

I almost don't even care about this stuff right now. With the release of the GTX 480/470, the ATI fans are cheering publicly and the Nvidia fans are mumbling quietly off in the corner. I'm sort of glad Nvidia is taking a bit of a bashing atm; one GPU maker in decided control of the market is bad for us all. But the two camps going at it like this is just exhausting. I'm honestly pretty sick of the GPU wars and their various camps of devotees. I've had good products from both companies, and the bashing is just too juvenile. It's splitting the PC gaming community apart (and has for a while) and creates situations like Batman: AA and Crysis, where support is thrown wholly behind one GPU manufacturer and not the other, which does a great disservice to all PC gamers by narrowing our choices... I guess that was sort of a rant, sorry.


----------



## PaulieG (Mar 30, 2010)

I just have to say this, just in case people start going into fanboy rants....

No fan of enthusiast hardware should EVER root for the complete demise of competition. That is the absolute worst thing that can happen to any consumer. So, despite the difficulties at Nvidia, and the fact that I chose to go with ATI this generation, I really hope that they pull out of this strong.


----------



## HalfAHertz (Mar 30, 2010)

I'm a bit shocked at some of the comments here. First of all, none of the sources I read stated that they will not sell the GTX400 cards, period; only that they won't at launch. It doesn't mean that they won't start a bit later on... My guess is that they did not want to disappoint their loyal customers with poor availability.

Secondly, about Nvidia being a sinking ship: seriously? We're talking about the second or third largest hardware company. Their IP portfolio has more pages than the Bible. We cannot even begin to compare it to 3dfx because the current situation is so much different: 1) The graphics market is huge and so are the players involved in it. 2) Nvidia has spread out into many different fields, like a modern-day Lernaean hydra (GeForce, computing, Ion, Optimus, Tegra, etc.). 3) They sit on a huge stockpile of cash. 4) Even though I don't like their last 2 generations, we need a second player on the market to keep ATi's greediness in line.


----------



## lism (Mar 30, 2010)

The actual reason is in the starting post:


> XFX have always developed the most powerful, versatile Gaming weapons in the world - and have just stepped up to the gaming plate and launched something spectacular that may well literally blow the current NVIDIA offerings clean away,



Basically the GTX 480 is a power-hungry chip which barely beats AMD's high-end flagship. So you're paying 500 euros for a high-end card that uses about 10% more power and delivers only marginal improvements compared to an ATi card.

I can't say they are wrong; I wouldn't want a card in my computer that crunches at 95°C by default.


----------



## BazookaJoe (Mar 30, 2010)

(Yes, I'm very late to the thread)

Haw-Haw!


----------



## dumo (Mar 30, 2010)

It happened before with the FX5800 (dustbuster and flame thrower), and then NVDA released the GF6800, which was a winner. So, imo the GTX480 will be a stepping stone for a more refined GTX line in the near future.


----------



## MKmods (Mar 30, 2010)

What's interesting is: if XFX pulled out, why didn't any of the others?



Semi-Lobster said:


> Fudzilla is saying that it was Nvidia who's breaking up with XFX and not the other way around
> 
> http://www.fudzilla.com/content/view/18278/1/
> 
> This is sort of the exact opposite of what every other site is saying and they seem to be the only ones claiming this


If that's the case, maybe XFX seriously lucked out..

Since the currently available coolers can't really cool the existing GPUs nicely, and the new Nvidia ones are even hotter, it's not gonna be a big seller, and it's a warranty nightmare.


----------



## newtekie1 (Mar 30, 2010)

Paulieg said:


> I just have to say this, just in case people start going into fanboy rants....
> 
> No fan of enthusiast hardware should EVER root for the complete demise of competition. That is the absolute worst thing that can happen to any consumer. So, despite the difficulties at Nvidia, and the fact that I chose to go with ATI this generation, I really hope that they pull out of this strong.



100% agreed.  And what everyone seems to fail to realize is that the industry goes back and forth, in every sector, but particularly in the graphics area.  ATi will have some real winner products for a while, and nVidia will have some real winner products for a while.

R300 vs. NV30 - ATi had the lead
R350 vs. NV35 - ATi increases the lead
R423 vs. NV45 - nVidia decreases ATi's lead
R580 vs. G70 - nVidia pulls even with ATi
R580+ vs. G71 - nVidia takes the lead
R600 vs. G80 - nVidia increases the lead and we have the reverse of what we have today (R600 comes 6 months late, and is a power-hungry, expensive disappointment)
RV670 vs. G92 - nVidia's lead stays pretty steady
RV770 vs. GT200 - ATi decreases nVidia's lead
RV870 vs. GF100 - ATi takes the lead with a similar result to R600 vs. G80 but in reverse

It is a cycle that we've seen a few times in the past, and will likely continue to see in the future.  NVidia will work on their GPUs to come back, then ATi will do the same, and so on.


----------



## dumo (Mar 30, 2010)

We will soon see how the GTX480 performs. 

It's probably better for the likes of EVGA and BFG to bundle the GTX 480 with an up-to-spec PSU to make sure it works without problems.


----------



## Yukikaze (Mar 30, 2010)

Paulieg said:


> I just have to say this, just in case people start going into fanboy rants....
> 
> No fan of enthusiast hardware should EVER root for the complete demise of competition. That is the absolute worst thing that can happen to any consumer. So, despite the difficulties at Nvidia, and the fact that I chose to go with ATI this generation, I really hope that they pull out of this strong.



This is the healthiest view of the situation I've seen for a while.


----------



## theubersmurf (Mar 30, 2010)

newtekie1 said:


> 100% agreed.  And what everyone seems to fail to realize is that the industry goes back and forth, in every sector, but particularly in the graphics area.  ATi will have some real winner products for a while, and nVidia will have some real winner products for a while.
> 
> R300 vs. NV30 - ATi had the lead
> R350 vs. NV35 - ATi increases the lead
> ...


You know, this sums it up beautifully in a way; I'm just worried about the market shares a bit. There are so many devoted to nVidia at this point that regardless of ATI's current product portfolio, people will just keep buying nVidia, and ATI will sink. I can't see them going completely out of business, but if they end up doing integrated solutions only, or something like that, I think we'd all suffer.


----------



## DrPepper (Mar 30, 2010)

Why is everyone saying this is because of warranty issues? The card has been out 4 days and you guys somehow conjure up that it has a high failure rate?


----------



## laszlo (Mar 30, 2010)

Warranty and supply shortage...

But has anyone wondered about the price of it? This monolithic chip is not cheap... all combined... XFX says: where is my profit?


----------



## Andy77 (Mar 30, 2010)

DrPepper said:


> Why is everyone saying this is because of warranty issues? The card has been out 4 days and you guys somehow conjure up that it has a high failure rate?



In a statement to partners Nvidia said it will reduce warranty period from 24 to 12 months on the Fermi line up.

@newtekie1, yeah, that can go back and forth as long as partners can endure nvidia's treatment... and by the looks of it, XFX didn't want to put up with it anymore. FWIW, progress also comes from gathering a series of partners around you that help in the development of products. So far, Nvidia has only swallowed what it considered useful and marginalized what it didn't, like XFX. That's not how a healthy business should be run if you care about longevity. Time will tell, but from the looks of it, they only managed to stay active this long because of those around them who supported them.

I wonder how many partners preordered Fermi cards in Nov / Dec / whenever last year, i.e. filled Nvidia's pockets with money for non-existent products that helped them stay this long in the charts? Does anyone think partners will still be around if this happens the next time?


----------



## xtremesv (Mar 30, 2010)

DrPepper said:


> Why is everyone saying this is because of warranty issues? The card has been out 4 days and you guys somehow conjure up that it has a high failure rate?



The card has been "out" 4 days as far as the public is concerned, but manufacturers like XFX got the GPUs weeks ago, enough time for extreme lab testing.

However, I'm not sure about the true cause that led XFX to skip the GTX 480/470; if I had to guess I'd say it's a matter of yields. On the other hand, I wouldn't be surprised if XFX backtracks in the coming days.

I've owned Nvidia's and ATi's and I've been very satisfied with both. I think everyone on this forum agrees that competition is the best for all of us, so be it, I'd love to see a closer 50/50 market share between greens & reds.


----------



## Kantastic (Mar 30, 2010)

I read somewhere that Nvidia is trying to get rid of their GT2xx chips, so in order to buy Fermi you have to buy a crapload of older chips, which haven't been selling very well.

10 GTX480's + 10 GTX470's must be purchased along with 20 GT220's and 20 GT240's.


----------



## Andy77 (Mar 30, 2010)

Kantastic said:


> I read somewhere that Nvidia is trying to get rid of their GT2xx chips, so in order to buy Fermi you have to buy a crapload of older chips, which haven't been selling very well.
> 
> 10 GTX480's + 10 GTX470's must be purchased along with 20 GT220's and 20 GT240's.



Along with 20 G210s, and 20 GTS250s.

Guess XFX had ATI's for that market segment.


----------



## Kantastic (Mar 30, 2010)

Andy77 said:


> Along with 20 G210s, and 20 GTS250s.
> 
> Guess XFX had ATI's for that market segment.



Right, I knew I was forgetting something, about 40 other cards LOL.

If I knew the forum post I read was based on Semi-Accurate I probably would have kept my mouth shut.


----------



## pdxer1 (Mar 30, 2010)

If I were a scientist I would be all about building a supercomputer with 4 or 8 470/480's and an air conditioning unit. Depending on the country you're in, you could get the tax refund for business purposes... I'm assuming, since science can be a business!

For me, the 450 looks to perform way better than my EVGA 275 896MB. But I'll actually hold that speculation until I see some real-world testing and retail cost!

EDIT: I wouldn't be at all surprised if XFX contracts for revision 470/480's. 
Something in my business gut says that XFX did not want to take on high RMA returns, for multiple reasons. XFX may also have wanted more chips than NV could part with, and figured that producing a few PCBs would cost more per unit than producing a large batch. That cost then rises for the consumer. If the consumer cannot afford the product, XFX loses returns, interest and the ability to pay the bills. From a business standpoint, the HD5000 series has been very successful on sales. The rumored coming of the HD6000 series, while speculative, may make investors leave NV if... Well, there are many ifs; speculation is cheap! 

Product quality, service and warranty, price point and driver functionality are the main driving forces behind the majority of consumer consumption.

However, a lack of availability on demand is also key to what will drive consumers to another market. NV should have learned this with the 200 series!

_"Green with NV"_


----------



## OneCool (Mar 30, 2010)

dumo said:


> It happened before with the FX5800 (dustbuster and flame thrower), and then NVDA released the GF6800, which was a winner. So, imo the GTX480 will be a stepping stone for a more refined GTX line in the near future.




That's exactly what I was thinking. XFX isn't leaving nvidia, they're just skipping the 2 high-end cards for now, because they know that nvidia is going to be releasing a revamp of the GF100 (GF110...maybe?) soon, just like the NV30 (5800) to the NV35 (5900). XFX knows this and is waiting for the updated reference design.

What was the time scale between the NV30 and the NV35, like 3 months?


----------



## newtekie1 (Mar 30, 2010)

theubersmurf said:


> You know, this sums it up beautifully in a way; I'm just worried about the market shares a bit. There are so many devoted to nVidia at this point that regardless of ATI's current product portfolio, people will just keep buying nVidia, and ATI will sink. I can't see them going completely out of business, but if they end up doing integrated solutions only, or something like that, I think we'd all suffer.



I know that is the case with AMD vs. Intel: when Intel was behind, brand loyalty kept people buying Intel.  However, I don't think the graphics card industry suffers from that as much.  The bulk of buyers are average consumers, and I don't think the average consumer has been brainwashed into thinking nVidia is the only player in town the way they have been with Intel.  ATi has had a fair bit of success, and if they can keep their current success going for a decent amount of time, I can see them regaining a lot of market share.



Andy77 said:


> @newtekie1, yeah, that can go back and forth as long as partners can endure nvidia's treatment... and by the looks of it, XFX didn't want to put up with it anymore. FWIW, progress also comes from gathering a series of partners around you that help in the development of products. So far, Nvidia has only swallowed what it considered useful and marginalized what it didn't, like XFX. That's not how a healthy business should be run if you care about longevity. Time will tell, but from the looks of it, they only managed to stay active this long because of those around them who supported them.



From what I've seen, nVidia has actually been pretty good to its partners.  The only bad thing they've done is let their products get stale.  They left G92 out on the market for too long; despite it being a more than capable product, the partners like new products to release press statements about every 6 months.

You also seem to have the roles reversed; a lot of these partners wouldn't even exist today if it wasn't for nVidia and their successful products.  Where were BFG, eVGA, and XFX before G70?  All three jumped on the nVidia wave and rode it until this point; before G70 I don't think anyone would have even known who any of them were.  XFX has just jumped from one wave to the other, and if they can ride the ATi wave to even greater success I'm glad for them.  Really, it makes sense too.  XFX is probably the best ATi partner right now, with little competition.  No other ATi partner offers lifetime warranties, no other partner offers as good bundles; XFX has a leg up on the ATi partners, and they would have a tougher time in the nVidia market.


----------



## popswala (Mar 30, 2010)

*what!!!*

This sucks. Out of all the 470 & 480 designs so far, I was liking theirs. I'm a big fan of XFX. I guess there's always evga to fall back on. 

I wonder if the price of their past cards will drop now that there are no more, or if it will go up for limited quantity and owning a piece of them to hold on to. Just wonderin.


----------



## MrMilli (Mar 30, 2010)

HalfAHertz said:


> I'm a bit shocked at some of the comments here. First of all, none of the sources I read stated that they will not sell the GTX400 cards, period; only that they won't at launch. It doesn't mean that they won't start a bit later on... My guess is that they did not want to disappoint their loyal customers with poor availability.
> 
> Secondly, about Nvidia being a sinking ship: seriously? We're talking about the second or third largest hardware company. Their IP portfolio has more pages than the Bible. We cannot even begin to compare it to 3dfx because the current situation is so much different: 1) The graphics market is huge and so are the players involved in it. 2) Nvidia has spread out into many different fields, like a modern-day Lernaean hydra (GeForce, computing, Ion, Optimus, Tegra, etc.). 3) They sit on a huge stockpile of cash. 4) Even though I don't like their last 2 generations, we need a second player on the market to keep ATi's greediness in line.



http://en.wikipedia.org/wiki/Semiconductor_sales_leaders_by_year#Ranking_for_year_2009
Not only is nVidia not the second or third largest 'hardware company' (as you call it), they're not even in the top 20.

On topic:
Well, like some of you have mentioned already, warranty may be an issue.
At work (computer repair), I see a bigger failure rate for nVidia cards than for ATi cards. While nVidia does hold a bigger market share, it seems to me that a bigger percentage of their cards die. 6000, 7000 & 8000 series cards seem to be very sensitive to heat. When a card gets clogged with dust, they just die, while ATi cards don't die when clogged but shut themselves off in time. Dust is a big problem, but at least ATi cards save themselves.
We sold some computers with Asus 9800GX2's (reference design)... I can tell you that every single one of them died (within Asus' warranty of three years).


----------



## pantherx12 (Mar 30, 2010)

Paulieg said:


> I just have to say this, just in case people start going into fanboy rants....
> 
> No fan of enthusiast hardware should EVER root for the complete demise of competition. That is the absolute worst thing that can happen to any consumer. So, despite the difficulties at Nvidia, and the fact that I chose to go with ATI this generation, I really hope that they pull out of this strong.




Too true, 'twas what I was trying to say on the 2nd page.

Seems forward thinking is in seriously short supply these days


----------



## DigitalUK (Mar 30, 2010)

dumo said:


> It happened before with the FX5800 (dustbuster and flame thrower), and then NVDA released the GF6800, which was a winner. So, imo the GTX480 will be a stepping stone for a more refined GTX line in the near future.



rotfl when I read that, still giggling now. So true. Thanks


----------



## cdawall (Mar 30, 2010)

newtekie1 said:


> You also seem to have the roles reversed; a lot of these partners wouldn't even exist today if it wasn't for nVidia and their successful products.  Where were BFG, eVGA, and XFX before G70?  All three jumped on the nVidia wave and rode it until this point; before G70 I don't think anyone would have even known who any of them were.  XFX has just jumped from one wave to the other, and if they can ride the ATi wave to even greater success I'm glad for them.  Really, it makes sense too.  XFX is probably the best ATi partner right now, with little competition.  No other ATi partner offers lifetime warranties, no other partner offers as good bundles; XFX has a leg up on the ATi partners, and they would have a tougher time in the nVidia market.



Well, in all honesty Sapphire has a leg up on the partners: it is ATi's PCB partner. Who else gets cards like the 4850X2? Also, Asus has some very good bundles and a warranty to back them up. Visiontek also carries a lifetime warranty, and on top-end cards, some very nice bundles. 

XFX jumping ship should represent the air going out of NV for this series. Imo GF100 is a placeholder, and 6 months from now they will release a good, powerful card. Think of it like a 2900: hot, powerful, and sucks power like crazy. Another NV card to compare it to would be the 5800 Ultra. It was outperformed by the cheaper ATi card, and thanks to it we had high VGA card prices. 

Now, my view on all this mess: NV needs a good card so we can have competition. Who liked the 4850 for 250 almost immediately after release? Who liked the 9800GTX for cheap? All of that is gone if Fermi continues to flop; the 5850 is way overpriced right now and will stay there until NV releases a card worth two shits.

Oh, and whoever is first to call me an ATI lover may as well skip it: I have owned NV cards and still use NV cards, I just walked into my current cards for cheap.


----------



## newtekie1 (Mar 30, 2010)

cdawall said:


> Well, in all honesty Sapphire has a leg up on the partners: it is ATi's PCB partner. Who else gets cards like the 4850X2? Also, Asus has some very good bundles and a warranty to back them up. Visiontek also carries a lifetime warranty, and on top-end cards, some very nice bundles.
> 
> XFX jumping ship should represent the air going out of NV for this series. Imo GF100 is a placeholder, and 6 months from now they will release a good, powerful card. Think of it like a 2900: hot, powerful, and sucks power like crazy. Another NV card to compare it to would be the 5800 Ultra. It was outperformed by the cheaper ATi card, and thanks to it we had high VGA card prices.
> 
> ...



Sapphire didn't "get" the HD4850X2, they were just the only ones to develop it.  ATi left the design to the partners, and Sapphire was the only one that wanted to make one, mainly because most of the partners were afraid the HD4850X2's sales would kill the HD4870X2.

ASUS certainly doesn't have the warranty to back up their cards; they only have a 3-year one. VisionTek does; I wasn't aware of this, as they used to have a 3-year warranty also.  I'm guessing VisionTek moving to a lifetime warranty is a reaction to XFX entering the ATi market, which is certainly a good thing.  XFX, IMO, really is showing the other ATi partners how to do things.

As for competition, I'm not worried about it, as I've already pointed out.  The GTX480 being competitive doesn't worry me, because it is priced out ahead and won't be competitive at all.  What worries me is the GTX470 being competitive.  As long as the GTX470 manages to at least be competitive with the HD5870, I'll be happy, and nVidia can release a lower card to compete with the HD5850.


----------



## cdawall (Mar 30, 2010)

newtekie1 said:


> Sapphire didn't "get" the HD4850X2, they were just the only ones to develop it.  ATi left the design to the partners, and Sapphire was the only one that wanted to make one, mainly because most of the partners were afraid the HD4850X2's sales would kill the HD4870X2.
> 
> ASUS certainly doesn't have the warranty to back up their cards; they only have a 3-year one. VisionTek does; I wasn't aware of this, as they used to have a 3-year warranty also.  I'm guessing VisionTek moving to a lifetime warranty is a reaction to XFX entering the ATi market, which is certainly a good thing.  XFX, IMO, really is showing the other ATi partners how to do things.
> 
> As for competition, I'm not worried about it, as I've already pointed out.  The GTX480 being competitive doesn't worry me, because it is priced out ahead and won't be competitive at all.  What worries me is the GTX470 being competitive.  As long as the GTX470 manages to at least be competitive with the HD5870, I'll be happy, and nVidia can release a lower card to compete with the HD5850.



Did you get inside knowledge from Asus, VT and other companies about their development of the 4850X2 that I didn't? 

And the way the Asus warranty works is you have 3 years for the original card to die, and the replacement receives another 3-year contract, as they are issued by serial number and the warranty works off the manufacture date, not the sales date.

Also, what competition? The 480 doesn't outperform the 5970; hell, there are benchmarks the GTX295, 275 and 4870X2 beat it in. The card is a flop just like the FX series was. Yes, NV had a 5950 Ultra, but it sure as hell got beat by the 9800 Pro. Same goes here: 480 < 5970.


----------



## newtekie1 (Mar 30, 2010)

cdawall said:


> Did you get inside knowledge from Asus, VT and other companies about their development of the 4850X2 that I didn't?
> 
> And the way the Asus warranty works is you have 3 years for the original card to die, and the replacement receives another 3-year contract, as they are issued by serial number and the warranty works off the manufacture date, not the sales date.
> 
> Also, what competition? The 480 doesn't outperform the 5970; hell, there are benchmarks the GTX295, 275 and 4870X2 beat it in. The card is a flop just like the FX series was. Yes, NV had a 5950 Ultra, but it sure as hell got beat by the 9800 Pro. Same goes here: 480 < 5970.



Go back and read the news articles about the HD4850X2; there were several discussing it.

In every ASUS warranty claim I've done, the replacement part continues the original warranty from the part it replaced; the warranty period does not extend when the part is replaced.  A lot of companies go by manufacture date, but brand new replacement parts don't reset it; they all have systems set up to adjust for this.

The GTX480 isn't $750 either... What I find odd is that people are saying that just because the GTX480 doesn't outperform the HD5970, it is a flop.  You know where it does outperform the HD5970?  Price vs. performance.  Yep, the GTX480 is actually better than the HD5970 in price vs. performance! Why does the GTX480 have to outperform the HD5970 anyway?  Did the GTX285 outperform the HD4870X2?  No.  Was the GTX285 a damn good card? Yes.  The GTX480 isn't a great card, and I certainly wouldn't buy it. However, the reviews still put it out in front of the HD5870 by a decent amount, and of course that comes at a price.  However, I'm more interested in the GTX470, which the reviews seem to put at just about even with the HD5870.  That is the competition that I'm concerned about.  Because right now, if I had to pick between a $400 HD5870 and a $349 GTX470, I'd go with the equally performing GTX470 for less money.  Hopefully we'll see something like a GTX460 that competes with the HD5850 and drives those prices down also.
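The price-vs-performance argument above boils down to simple division. As a rough illustration (all prices and relative-performance figures below are assumptions for illustration, not benchmark data):

```python
# Rough perf-per-dollar comparison. All prices and relative-performance
# figures here are illustrative assumptions, not measured benchmark data.
cards = {
    "GTX480": {"price": 499, "relative_perf": 1.00},  # baseline
    "HD5870": {"price": 399, "relative_perf": 0.90},
    "HD5970": {"price": 699, "relative_perf": 1.25},
}

def perf_per_dollar(card):
    """Relative performance delivered per dollar of purchase price."""
    return card["relative_perf"] / card["price"]

# Rank the cards from best to worst value.
for name, card in sorted(cards.items(), key=lambda kv: -perf_per_dollar(kv[1])):
    print(f"{name}: {perf_per_dollar(card) * 1000:.2f} perf per $1000")
```

With these assumed numbers the single-GPU cards come out ahead of the dual-GPU HD5970 on value, which is the shape of the argument being made; plug in real street prices and review numbers to check it properly.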


----------



## cdawall (Mar 30, 2010)

newtekie1 said:


> Go back and read the news articles about the HD4850X2; there were several discussing it.
> 
> In every ASUS warranty claim I've done, the replacement part continues the original warranty from the part it replaced; the warranty period does not extend when the part is replaced.  A lot of companies go by manufacture date, but brand new replacement parts don't reset it; they all have systems set up to adjust for this.
> 
> Why does the GTX480 have to outperform the HD5970 anyway?  Did the GTX285 outperform the HD4870X2?  No.  Was the GTX285 a damn good card? Yes.  The GTX480 isn't a great card, and I certainly wouldn't buy it. However, the reviews still put it out in front of the HD5870 by a decent amount, and of course that comes at a price.  However, I'm more interested in the GTX470, which the reviews seem to put at just about even with the HD5870.  That is the competition that I'm concerned about.  Because right now, if I had to pick between a $400 HD5870 and a $349 GTX470, I'd go with the equally performing GTX470 for less money.  Hopefully we'll see something like a GTX460 that competes with the HD5850 and drives those prices down also.




Ok, I see your point. However, I remember when the 2900's came out you were the first to point out issues with them running hot or this or that; right now you are staring at the equivalent nvidia card, except this one won't even OC all that well. The 5870 is a better card: it pulls less juice to accomplish the same performance, it puts out less heat too, and to top it all off you can xfire 3 of them on a 1200W PSU without starting a fire. So it has a better multi-card upgrade path; you don't need a new PSU just to run one of them.

In all honesty I think you are a wee bit of an NV fanboi and blinded to the ATI side of things. I have run both sets of cards and moved where the performance was. I had a Ti4200, 7800, G92, all those NV cards, as well as a 3850 and 4800X2's. 

The ATI card makes more logical sense right now; the GTX already has reports of high returns and killing high-end PSUs. It's not worth it: you save your $50, and then you have to buy a $100 more expensive PSU so your GTX runs, and while you're at it move your CPU to water, because the VGA puts out more heat than the case can handle. G92 was a milestone for NV; these cards are a turd.


----------



## kaosII (Mar 30, 2010)

I have not gone back to XFX since my repeated RMA problems with the FC chokes, or lack thereof, on the initial 280GTX cards. 
I can still hear the screeching sound in my head.
Did they ever have problems like this on AMD/ATI cards??? 
I never did any follow-up on this issue.


----------



## [I.R.A]_FBi (Mar 30, 2010)

DigitalUK said:


> rotfl when i read that, still gigling now. so true. thanks




bad man nuh giggle


----------



## Zubasa (Mar 30, 2010)

kaosII said:


> I have not gone back to XFX since my repeated RMA problems with the FC chokes, or lack thereof, on the initial 280GTX cards.
> I can still hear the screeching sound in my head.
> Did they ever have problems like this on AMD/ATI cards???
> I never did any follow-up on this issue.


As far as I know the GTX280 has the highest failure rate of any card except for the 4870X2.


----------



## kaosII (Mar 30, 2010)

Zubasa said:


> As far as I know the GTX280 has the highest failure rate of any card except for the 4870X2.



All 280's or just XFX?  ..........and thank you.


----------



## Zubasa (Mar 30, 2010)

kaosII said:


> All 280's or just XFX?  ..........and thank you.


All of them in general.
One thing these cards have in common is that they run really warm.

The thing about XFX is that they aren't known for having the best coolers on their cards, so I guess it is good for them to stay away from the GF100.


----------



## [I.R.A]_FBi (Mar 30, 2010)

Weren't they reference-cooled?


----------



## Zubasa (Mar 30, 2010)

[I.R.A]_FBi said:


> werent they reference cooled?


The GTX 280s are mostly so, but the full GT200, which is 65nm, runs quite hot.
As for the GF100: knowing what XFX did with the 4890s, would you still want to buy their non-reference cards?
Most importantly, XFX offers lifetime warranties in the USA, so I guess it isn't smart for them to produce cards with potentially high failure rates.


----------



## suraswami (Mar 30, 2010)

Don't know if this is a blow to NV, XFX or customers.


----------



## eidairaman1 (Mar 30, 2010)

Zubasa said:


> The GTX 280s are mostly so, but the full GT200, which is 65nm, runs quite hot.
> As for the GF100: knowing what XFX did with the 4890s, would you still want to buy their non-reference cards?



It took NV until the 285 to actually make an impact at the high end, but seriously, NV is pushing the thermal barrier and power supply requirements so far that the generation after this series will have to be a total 180, or they will actually fall.


----------



## FreedomEclipse (Mar 30, 2010)

eidairaman1 said:


> It took NV until the 285 to actually make an impact at the high end, but seriously, NV is pushing the thermal barrier and power supply requirements so far that the generation after this series will have to be a total 180, or they will actually fall.



Yeah, I think the initial shock was that some people might actually need to upgrade their PSUs to run a single or dual 480. A 750-850W PSU would be perfectly fine for 2 5870's; now you need at least 1kW on the PSU, lol.

It'll be interesting to see what temps the cards get under water.


----------



## Deleted member 24505 (Mar 30, 2010)

Or what effect they have on the average water temp.


----------



## TheMailMan78 (Mar 31, 2010)

newtekie1 said:


> Sapphire didn't "get" the HD4850x2, they were just the only ones to develope it.  ATi left the design to the partners, and Sapphire was the only one that wanted to make one, mainly because most of the partners were afraid the HD4850x2 sales would kill the HD4870x2.
> 
> ASUS' certainly doesn't have the warranty to back up their cards, they only have a 3 year, VisionTek does, I wasn't aware of this as they used to have a 3 year also.  I'm guess Visiontek moving to a lifetime warranty is reaction to XFX entering the ATi market, which is certainly a good thing.  XFX, IMO, really is showing the other ATi partners how to do things.
> 
> As for competition, I'm not worried about it, as I've already pointed out.  GTX480 being competitive doesn't worry me, because it is out ahead and won't be competitive at all.  What worries me is the GTX470 being competitive.  As long as GTX470 managed to at least be competitive with the HD5870, I'll be happy, and nVidia can release a lower card to compete with the HD5850.



Small correction. Visiontek had a lifetime warranty years before XFX came over to the red side.


----------



## eidairaman1 (Mar 31, 2010)

FreedomEclipse said:


> Yeah, I think the initial shock was that some people might actually need to upgrade their PSUs to run a single or dual 480. A 750-850W PSU would be perfectly fine for 2 5870's; now you need at least 1kW on the PSU, lol.
> 
> It'll be interesting to see what temps the cards get under water.



Well, in a single instance, if the water were stationary it would reach the boiling point very quickly, since 100°C is 212°F, given the small amount of water that passes through a block.
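As a rough sanity check on the heat-capacity claim above (a back-of-the-envelope sketch; the 300 W card power and the 1 L of loop water are assumed numbers, not figures from any review):

```python
# Rough estimate: time for a loop with NO radiator to heat its water from
# room temperature to boiling. The card power and water volume below are
# assumptions for illustration, not measured figures.
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

def time_to_boil(power_w, water_kg, start_c=25.0, boil_c=100.0):
    """Seconds until the loop water boils, assuming every watt the card
    dumps goes into the water and none escapes (worst case)."""
    joules_needed = water_kg * SPECIFIC_HEAT_WATER * (boil_c - start_c)
    return joules_needed / power_w

# A ~300 W card heating 1 kg (about 1 L) of stationary water:
print(round(time_to_boil(300, 1.0) / 60, 1))  # ~17.4 minutes
```

Even with no radiator at all, a litre of water soaks up heat for a good while; in a real loop the radiator removes heat continuously, so the water never gets near boiling.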


----------



## imperialreign (Mar 31, 2010)

TheMailMan78 said:


> Small correction. Visiontek had a lifetime warranty years before XFX came over to the red side.



Yep - VT also used to handle nearly all mid-tier brand warranties, too (even the ATI branded cards).

Their cards aren't anything overly "special" or "blingy," but they're solid as hell (never had one fail on me).


----------



## newtekie1 (Mar 31, 2010)

cdawall said:


> Ok, I see your point. However, I remember when the 2900's came out you were the first to point out issues with them running hot or this or that; right now you are staring at the equivalent nVidia card, except this one won't even OC all that well. The 5870 is a better card: it pulls less juice to accomplish the same performance, it puts out less heat too, and to top it all off you can Crossfire three of them on a 1200W PSU without starting a fire. So it has a better multi-card upgrade path; you don't need a new PSU just to run one of them.



I don't remember worrying much about the HD2900's running hot, I believe my issues with them were that they were overpriced.  I've always gone with the best bang for the buck.  I'll be the first one to point out the problems with the GTX480 also, in fact I did later on down in the post.  However, again, my buying decision almost always goes with bang for the buck, and that is pretty much what I am always concerned with.



cdawall said:


> In all honesty I think you are a wee bit of an NV fanboi and blind to the ATI side of things. I have run both sets of cards and moved where the performance was. I had a Ti4200, a 7800, a G92, all those NV cards, as well as a 3850 and 4800x2's.



For someone who just a few posts back went on about people not calling you an ATi fanboy because you've owned nVidia cards too... you sure are quick to do the exact same thing to others...

Did you happen to look at what card is currently in my main rig? An HD4890.  Guess what card was in Rig4 before the 8800GTS...an x800xl (bought here from Xazax), and before that an x1950Pro (sold to Crashnburnxp).  I've also had an x1900GT (bought from Blacktruckryder), which was replaced with an HD3850 (bought from Xazax), which was replaced by an HD4670 (sold to 3dsage), which has now been replaced by the HD4890.  I just purchased an HD4870x2 off miahallen.  About the only series I haven't personally owned a card from is the HD2000 series, which I skipped; to be fair, I skipped the G80 series also. I wasn't interested in either, as neither offered enough gains over the cards I had at the time to justify the price.  The only reason I have a G80 card now is because I got a good deal on it, and I wanted it to replace the x800xl so I could use the machine to fold.



cdawall said:


> The ATI card makes more logical sense right now; the GTX already has reports of high returns and killing high-end PSUs. It's not worth it: you save your $50, then spend $100 more on the PSU so your GTX runs, and while you're at it move your CPU to water because the VGA puts out more heat than the case can handle. The G92 was a milestone for NV; these cards are a turd.



No, the GTX480 doesn't make sense.  However, as I already said, the GTX470 is actually looking promising.  Granted, I won't totally believe that until we see a W1z review on it, but from the other reviews it is looking promising.  It doesn't use an extreme amount of power like the GTX480 (so no worry about killing power supplies), and while it does still get hot, it uses a much weaker heatsink and fan than the GTX480.  The heat output is only about 60W more than the HD5870, which does an amazing job in heat and power usage.  The interesting thing is that the GTX470's heat output and power usage are very much in line with my current HD4890 and GTX285, and actually very similar to the GTX280.  I don't need to worry about water-cooling my CPU with those cards, so I'm not worried about it with the GTX470.  I think most, including you, seem to be caught up on the GTX480 and are applying its problems to the GTX470.  However, it is pretty obvious that nVidia really pushed that card to make it one hell of a beast so it would beat the HD5870 hands down, while the scaled-back GTX470 is actually a reasonable card. The HD5870 is better in heat and power, but the GTX470 certainly isn't unreasonable; it compares rather nicely to the high-end cards of the last generation. The HD5870 has just set an extremely high bar.


----------



## mastrdrver (Mar 31, 2010)

If margins are supposed to be as tight (or non-existent) as is rumored for the GTX 4xx cards, then to me it's no surprise that they dropped them. Why not take that extra money and sell those crazy overpriced 5970 Blacks that will all sell out quickly? There are going to be huge margins on them compared to the 480/470.


----------



## eidairaman1 (Mar 31, 2010)

imperialreign said:


> Yep - VT also used to handle nearly all mid-tier brand warranties, too (even the ATI branded cards).
> 
> Their cards aren't anything overly "special" or "blingy," but they're solid as hell (never had one fail on me).



I couldn't care less about the bling, but wanna bet they become a modder's toy? Different cooling, volt mods, etc.


----------



## SUPERREDDEVIL (Mar 31, 2010)

This is obvious. Who's gonna give a long warranty on a product that's been killed by its own heat? Nice move from XFX; they finally realized that AMD has the leadership now.


----------



## AzureOfTheSky (Mar 31, 2010)

fermi= http://techreport.com/articles.x/4966/2 redux.......

anybody who thinks nVidia didn't lose this round is either an insane fanboi or mentally defective


----------



## cdawall (Mar 31, 2010)

AzureOfTheSky said:


> fermi= http://techreport.com/articles.x/4966/2 redux.......
> 
> anybody who thinks nVidia didn't lose this round is either an insane fanboi or mentally defective



While I agree from the tech perspective, they may not be fanbois; maybe they just need a winter heater. You know, Fermi: the new Prescott.


----------



## xtremesv (Mar 31, 2010)

AzureOfTheSky said:


> fermi= http://techreport.com/articles.x/4966/2 redux.......
> 
> anybody who thinks nVidia didn't lose this round is either an insane fanboi or mentally defective



History repeats itself for nVidia.

FX 5800 / GTX 480
- Reduced availability.
- Not quite a performance star.
- Heat and noise problems.
- Late.

The difference is that now nVidia buyers have pieces of paper moving around in games (PhysX), 3D glasses (if you can afford the whole kit), and can do some serious C programming and video encoding (CUDA). Nice features, though. Ah, I was forgetting the "The Way It's Meant to Be Played" extra value.


----------



## Zubasa (Mar 31, 2010)

xtremesv said:


> History repeats itself for nVidia.
> 
> FX 5800 / GTX 480
> - Reduced availability.
> ...


I wouldn't say TWIMTBP is "extra value"; in the end the money comes from those who pay for these cards. :shadedshu
We both know that "The Way It's Meant to Be Paid" won't save the GF100 from the Hemlock.


----------



## theubersmurf (Mar 31, 2010)

xtremesv said:


> History repeats itself for nVidia.
> 
> FX 5800 / GTX 480
> - Reduced availability.
> ...


They got the performance crown (within the framework of single gpu video cards) but that's all they got. And not by a huge margin. By any other metric the card is a disaster. It seems like they sacrificed everything just to get the performance crown, as if that's the only thing people wanted to hear...


----------



## idx (Mar 31, 2010)

maybe ATI did something to them like paying them for that.... just saying..


----------



## Binge (Mar 31, 2010)

idx said:


> maybe ATI did something to them like paying them for that.... just saying..



Thank you conspiracy theorist for your doubts.  Reality just doesn't seem real, right?


----------



## xtremesv (Mar 31, 2010)

Zubasa said:


> I wouldn't say TWIMTBP is "extra value"; in the end the money comes from those who pay for these cards. :shadedshu
> We both know that "The Way It's Meant to Be Paid" won't save the GF100 from the Hemlock.



I've met people who thought that just because X game had the nVidia "It's Meant to Be Played" logo, their nVidia cards were superior to ATi in that particular game. Of course, I'm talking about people who still think that having more video memory is all that matters.


----------



## Wile E (Mar 31, 2010)

newtekie1 said:


> I don't remember worrying much about the HD2900's running hot, I believe my issues with them were that they were overpriced.  I've always gone with the best bang for the buck.  I'll be the first one to point out the problems with the GTX480 also, in fact I did later on down in the post.  However, again, my buying decision almost always goes with bang for the buck, and that is pretty much what I am always concerned with.
> 
> 
> 
> ...



This is a point I've been trying to make repeatedly. I've pointed out numerous times that the 4870x2 actually draws MORE power in everything (except BD playback) than the GTX480.

ATI just hit a homerun with the 5k series. That doesn't make Fermi a bad card. Not a great card, but not a bad card.


----------



## AzureOfTheSky (Mar 31, 2010)

especially if you need a loud heater for your computer room/house on those cold winter nights... oh wait... it's almost summer...


----------



## mastrdrver (Mar 31, 2010)

Right, let's get this straight.....

2, 55nm, 1.3v, 4870 gpus run a little warmer and draw a little more power than

1, 40nm, .99v, GTX 480 gpu

Somehow, I don't see anything there that makes the GTX 480 a decent card, let alone a good card. Not to mention that in performance a 4870x2 is close to or equal to a 5870. That just makes the GTX 480 look worse imo.

I guess you don't have to worry about SLI profiles with one gpu. I guess that's a plus for Fermi.........right?


----------



## Wile E (Mar 31, 2010)

AzureOfTheSky said:


> especially if you need a loud heater for your computer room/house on those cold winter nights... oh wait... it's almost summer...



I have whole house AC, and it would be cooler than my X2 anyway.



mastrdrver said:


> Right, let's get this straight.....
> 
> 2, 55nm, 1.3v, 4870 gpus run a little warmer and draw a little more power than
> 
> ...



Fermi outperforms both the 5870 and the 4870x2. Voltage specs don't matter, only the end results. In this case, the end result is a whole lot of wattage and heat. lol.

I don't see anything that makes Fermi a terrible card. I likewise don't really see anything that makes it a great card.


----------



## TAViX (Mar 31, 2010)

Imagine: if they cannot offer a warranty for even 1 year, what does that say about the quality of those boards? XFX is known for their 5-year-plus warranties, even lifetime ones, so it is very understandable why they chose not to market the new cards from Nvidia.

Also, a card that "might" only last 2 or 3 years is not worth buying anyway.


----------



## mastrdrver (Mar 31, 2010)

Wile E said:


> Fermi outperforms both the 5870 and the 4870x2. Voltage specs don't matter, only the end results. In this case, the end result is a whole lot of wattage and heat. lol.
> 
> I don't see anything that makes Fermi a terrible card. I likewise don't really see anything that makes it a great card.



Outperforms.....slightly.

Maybe it does a lot more in benches, but since I play games I didn't really look at the "Vantage" parts of all the reviews I read.

BTW, I read somewhere that a couple reviewers got bum GTX 480s. Still trying to find them though.

Fermi isn't terrible.....but it came close.


----------



## segalaw19800 (Mar 31, 2010)

mtosev said:


> Bad news for nVidia. They are slowly slipping away. If nVidia continues on this path they will end up as 3DFX did.



Nvidia bought out 3Dfx.


----------



## HalfAHertz (Mar 31, 2010)

mastrdrver said:


> Right, let's get this straight.....
> 
> 2, 55nm, 1.3v, 4870 gpus run a little warmer and draw a little more power than
> 
> ...



My 300nm Pentium 1 ran on 5V (or was it 2.5V?) and it was passively cooled. What's your point?
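The point being that voltage alone says little: dynamic switching power scales roughly as P ≈ C·V²·f, so switched capacitance (die size, transistor count) and clock matter just as much as volts. A minimal sketch, with all numbers invented purely to illustrate the scaling:

```python
# Dynamic switching power scales roughly as P = C * V^2 * f, so a chip's
# voltage by itself tells you almost nothing about its total draw; the
# switched capacitance (die size / transistor count) and clock matter
# just as much. All values below are made up to show the scaling only.

def dynamic_power(c_farads, volts, freq_hz):
    return c_farads * volts ** 2 * freq_hz

# Same voltage and clock, 10x the switched capacitance -> 10x the power:
small_chip = dynamic_power(1e-9, 1.0, 700e6)
big_chip = dynamic_power(10e-9, 1.0, 700e6)
print(round(big_chip / small_chip, 6))  # 10.0
```

Which is why a huge 40nm GPU at 0.99V can still out-draw two smaller chips at 1.3V.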


----------



## imperialreign (Mar 31, 2010)

Wile E said:


> Fermi outperforms both the 5870 and the 4870x2. Voltage specs don't matter, only the end results. In this case, the end result is a whole lot of wattage and heat. lol.
> 
> I don't see anything that makes Fermi a terrible card. I likewise don't really see anything that makes it a great card.



It doesn't outperform them by all that much - especially considering how hyped the card was . . . and the slight margin it has over the other two seems to dwindle quickly as the resolution goes up.  I mean, considering just how much nVidia was hyping this card, I was expecting performance close to (if not better than) the 5970.

TBH, I can't say it's a _bad_ card, either . . . but I don't really see where it's a better deal over a 5870.


----------



## cdawall (Mar 31, 2010)

Wile E said:


> This is a point I've been trying to make repeatedly. I've pointed out numerous times that the 4870x2 actually draws MORE power in everything (except BD playback) than the GTX480.
> 
> ATI just hit a homerun with the 5k series. That doesn't make Fermi a bad card. Not a great card, but not a bad card.



I think we are going to see that change. Something is wrong with how these cards got measured. I have already seen reports of three 1000W+ PSUs getting killed by Fermi upgrades: one TT Toughpower 1200W, one FSP Bluetop 1000W, and one FSP 1200W. Something isn't right here, and it needs to get figured out now, because even the 4870x2 wasn't blowing PSUs.


----------



## phanbuey (Mar 31, 2010)

cdawall said:


> I think we are going to see that change. Something is wrong with how these cards got measured. I have already seen reports of three 1000W+ PSUs getting killed by Fermi upgrades: one TT Toughpower 1200W, one FSP Bluetop 1000W, and one FSP 1200W. Something isn't right here, and it needs to get figured out now, because even the 4870x2 wasn't blowing PSUs.



HAH... wow.  That is insane.


----------



## cdawall (Mar 31, 2010)

phanbuey said:


> HAH... wow.  Thats is insane.



One card popped running 3DMark looped, which is odd. Maybe as they produce more heat they become less efficient, pulling more than the ~300W reviews have been putting them at. That's what I think is happening: the cards are overdrawing multi-rail PSUs, since most multi-rail designs set the per-rail trip point higher than the PSU can actually handle if multiple rails max out. This has led to overloaded high-end PSUs, but only the multi-rail ones.
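That multi-rail theory can be put into a toy model (every wattage below is invented for illustration; real OCP trip points and rail layouts vary by unit):

```python
# Toy model of the multi-rail theory above: each 12 V rail has its own
# over-current protection (OCP) trip point, and the SUM of those trip
# points can exceed what the PSU can actually deliver combined. A load
# split across rails can then overload the unit without ever tripping
# any single rail's OCP. All wattages here are invented for illustration.

def psu_ok(rail_loads_w, rail_ocp_w, total_capacity_w):
    """True if no rail exceeds its OCP limit AND the combined load stays
    within the PSU's real combined 12 V capacity."""
    per_rail_ok = all(load <= ocp for load, ocp in zip(rail_loads_w, rail_ocp_w))
    total_ok = sum(rail_loads_w) <= total_capacity_w
    return per_rail_ok and total_ok

ocp = [300, 300, 300, 300]  # four rails, 300 W OCP each (1200 W on paper)
print(psu_ok([250, 250, 250, 100], ocp, 900))  # True: 850 W total, within spec
print(psu_ok([290, 290, 290, 150], ocp, 900))  # False: no rail trips OCP,
                                               # but 1020 W > 900 W combined
```

In that second case the OCP never fires, so the unit just cooks, which would fit the "killed PSU without a shutdown" reports.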


----------



## DarthCyclonis (Mar 31, 2010)

So when are we going to see the GTX480 Co-op Edition with a G92 for PhysX! lmao.


----------



## phanbuey (Mar 31, 2010)

cdawall said:


> One card popped running 3DMark looped, which is odd. Maybe as they produce more heat they become less efficient, pulling more than the ~300W reviews have been putting them at. That's what I think is happening: the cards are overdrawing multi-rail PSUs, since most multi-rail designs set the per-rail trip point higher than the PSU can actually handle if multiple rails max out. This has led to overloaded high-end PSUs, but only the multi-rail ones.



Yeah, I always use single rails, but still... 

They're definitely pushing the line on the manufacturing process. For sure they sent cherry-picked samples to reviewers.  Not all cards draw the same amount of juice, am I right?  Some chips might be leakier, run hotter, or burn more watts than others. 

Sounds like it might be a QC issue.  Was it all the same brand of card?


----------



## cdawall (Mar 31, 2010)

phanbuey said:


> Yeah I always use single rails, but still...
> 
> They're definitely pushing the line on the manufacturing process. For sure they sent cherry-picked samples to reviewers.  Not all cards draw the same amount of juice, am I right?  Some chips might be leakier, run hotter, or burn more watts than others.
> 
> Sounds like it might be a QC issue.  Was it all the same brand of card?



I'll ask the shop they got returned to, but I do not believe they were. Some of the reviewers show higher loads as well; I have seen anywhere from 300 to 340 and up.


----------



## [I.R.A]_FBi (Mar 31, 2010)

300-340, teh variance is great


----------



## cdawall (Mar 31, 2010)

[I.R.A]_FBi said:


> 300-340, teh variance is great



That's a good difference haha, that's like an 8400GS extra in the system


----------



## phanbuey (Mar 31, 2010)

cdawall said:


> That's a good difference haha, that's like an 8400GS extra in the system



Does the speed/way in which a card demands power also make a difference?... i.e. if I plug something into my PSU that instantly demands 700W, then the next nanosecond 100W, and then 850W again a few ms later, repeatedly, does that put any more strain on the PSU than just demanding 700W continuously?


----------



## newtekie1 (Mar 31, 2010)

mastrdrver said:


> Right, let's get this straight.....
> 
> 2, 55nm, 1.3v, 4870 gpus run a little warmer and draw a little more power than
> 
> ...



I don't see why that matters exactly.  The end result is, for less heat and power, Fermi provides more performance than ATi's last-generation high-end card.  In any other situation that would have been praised as amazing.  It is only in the shadow of RV870 that Fermi doesn't look great.

Here is an interesting little tidbit of information: Not once has ATi been able to release a single GPU card that actually outperformed every card from the previous generation.  This includes the HD5870.  However, nVidia has with Fermi.

You know, it kind of makes me wonder what the power and heat of RV870 would be like if they did push it to that level of performance...


----------



## cdawall (Mar 31, 2010)

phanbuey said:


> Does the speed/way in which a card demands power also make a difference?... i.e. if I plug something into my PSU that instantly demands 700W, then the next nanosecond 100W, and then 850W again a few ms later, repeatedly, does that put any more strain on the PSU than just demanding 700W continuously?




No idea to be honest



newtekie1 said:


> I don't see why that matters exactly.  The end result is, for less heat and power, Fermi provides more performance than ATi's last-generation high-end card.  In any other situation that would have been praised as amazing.  It is only in the shadow of RV870 that Fermi doesn't look great.
> 
> Here is an interesting little tidbit of information: Not once has ATi been able to release a single GPU card that actually outperformed every card from the previous generation.  This includes the HD5870.  However, nVidia has with Fermi.
> 
> You know, it kind of makes me wonder what the power and heat of RV870 would be like if they did push it to that level of performance...



Hey, what last-gen card outdoes the 5970? And the generation before that, what outdid the 4870x2?


----------



## newtekie1 (Mar 31, 2010)

cdawall said:


> No idea to be honest
> 
> 
> 
> Hey, what last-gen card outdoes the 5970? And the generation before that, what outdid the 4870x2?



I guess you don't know what Single GPU means...

I'm guessing you also missed the point I was making entirely.


----------



## cdawall (Mar 31, 2010)

newtekie1 said:


> I guess you don't know what Single GPU means...
> 
> I'm guessing you also missed the point I was making entirely.



Maybe you missed that ATi has this thing called CrossFire, and they use it to make cards like the 4870x2 and 5970, which use two GPUs to render. You know, it's kinda like a dual-core CPU. Oh wait, we shouldn't have those; that's a mark of progress and should be shunned.


----------



## crow1001 (Mar 31, 2010)

Wile E said:


> I have whole house AC, and it would be cooler than my X2 anyway.
> 
> 
> 
> ...



LMAO, you don't see anything that makes Fermi "480" a terrible card? How about on-par or slightly better performance than a card released six months ago? Yeah, you get the odd TWIMTBP title where it's higher, but who gives a crap; the thing is hotter than the sun, sounds like a tornado, and consumes more watts than the Large Hadron Collider. Overclock the six-month-old 5870 and it will leave the flawed 480 in its wake. Yeah, nothing bad... muhhahaha.


----------



## newtekie1 (Mar 31, 2010)

cdawall said:


> Maybe you missed that ATi has this thing called CrossFire, and they use it to make cards like the 4870x2 and 5970, which use two GPUs to render. You know, it's kinda like a dual-core CPU. Oh wait, we shouldn't have those; that's a mark of progress and should be shunned.



Yeah yeah, and nVidia uses SLI to do the same thing, that wasn't the point at all.

I'll explain it again, Fermi is the first single GPU that we have seen released that actually tops the previous generation's dual GPUs.  That is a huge feat, one that hasn't been done ever before.  In any other situation that alone would have made Fermi get praised as a wonderful GPU.  And in the case of the HD4870x2, Fermi actually does it with less power and less heat, making it even more amazing.

I'm not ignoring the dual GPU cards, obviously they exist and provide amazing performance, that wasn't my point at all.  Ignoring the complications that they add, they definitely are the top dogs.  However, that again was not the point.  The accomplishment of a single GPU topping the previous generation's dual-GPU cards has never been done, and Fermi doing it while using less power and producing less heat shows that it is an utterly amazing GPU.  However, RV870 is even better for other reasons.  If it wasn't for RV870 having the perfect balance of power usage/heat output/price and performance, Fermi would probably be praised right now instead of bashed.


----------



## cdawall (Mar 31, 2010)

newtekie1 said:


> Yeah yeah, and nVidia uses SLI to do the same thing, that wasn't the point at all.
> 
> I'll explain it again, Fermi is the first single GPU that we have seen released that actually tops the previous generation's dual GPUs.  That is a huge feat, one that hasn't been done ever before.  In any other situation that alone would have made Fermi get praised as a wonderful GPU.  And in the case of the HD4870x2, Fermi actually does it with less power and less heat, making it even more amazing.
> 
> I'm not ignoring the dual GPU cards, obviously they exist and provide amazing performance, that wasn't my point at all.  Ignoring the complications that they add, they definitely are the top dogs.  However, that again was not the point.  The accomplishment of a single GPU topping the previous generation's dual-GPU cards has never been done, and Fermi doing it while using less power and producing less heat shows that it is an utterly amazing GPU.  However, RV870 is even better for other reasons.  If it wasn't for RV870 having the perfect balance of power usage/heat output/price and performance, Fermi would probably be praised right now instead of bashed.



Only complaint I have is that the 480 doesn't put out less heat or use less juice from what I have seen; it's smoking PSUs, and we never saw that with the 4870x2... The 5870 is also 6 months old and beat the 4870x2 in almost everything, so how is it not in the same boat as Fermi?


----------



## HalfAHertz (Mar 31, 2010)

@newtekie1: I think they misunderstood what you're trying to say. If you'll allow me to clear it up: GTX480 > GTX295 > 4870x2. He's not referring to it as doubling the performance of the GTX280/285, but only as matching the dual solutions from last gen.


----------



## newtekie1 (Mar 31, 2010)

cdawall said:


> Only complaint I have is that the 480 doesn't put out less heat or use less juice from what I have seen; it's smoking PSUs, and we never saw that with the 4870x2... The 5870 is also 6 months old and beat the 4870x2 in almost everything, so how is it not in the same boat as Fermi?



I don't care what you've seen, every competent review shows less heat and less power usage than the HD4870x2.

And the HD5870 doesn't beat the HD4870x2 in almost everything; in fact, overall the HD5870 is about 5% behind the HD4870x2, which means the HD4870x2 beats the HD5870 in more things...  That is how it is not in the same boat as Fermi.  Beating the HD4870x2 in a few things but losing overall is still a loss.  Fermi wins overall compared to the HD4870x2.


----------



## AzureOfTheSky (Mar 31, 2010)

So newtekie1, the reports floating around the net of people with 1kW+ PSUs (quality units like PCP&C, Fortron, Silverstone, and TT Toughpower) blowing after they left them looping 3DMark/Furmark/Heaven while they, say, went to take a shower don't matter?

The thinking in each of these cases is that it's probably due to the card continuing to draw more and more power the longer it's under load. One review I found showed two 480's (the reviewer also tested SLI, but in a separate review) each drawing 340 watts and hitting 101°C after being left running 3DMark or Furmark or Heaven for an extended time. It wasn't hours, but the guy did leave it running a good while to simulate a real gaming session playing a stressful game in a common case (think it was an Antec 900 or something like that).

The fact is, as the TPU review shows, the card can and does pull more power than nVidia wants to admit, it runs hot even at idle, and it really doesn't perform that great for its specs.

1. An overclocked 5870 will be faster than a 480, even an overclocked one (they don't overclock well).
2. Even overclocked, the 5870 couldn't draw as much power or create as much heat as the 480.
3. At idle the 5k cards use VERY LITTLE POWER and run VERY COOL.

nVidia should have TESTED before they went to mass production. I know AMD/ATI do; after the 2900 they went back to testing before sending cards/chips to mass production, to ensure they could keep them within a reasonable power/heat threshold. Had nV done proper testing before they started mass production they could have avoided:
1. having such a hot card that's well below the planned specs.
2. having such a high fail rate on cores (make something hard to produce and it's going to have higher fail rates).
3. being 6 months late to market trying to work around heat and production problems.

Yes, the 5k cards had problems too, BUT notice they got worked out and yields are well above 7.1%.

nV screwed up. I would guess they know they screwed up and are working hard to get out either a refresh OR a re-designed product that won't be so damn hot.

I wonder if even a water cooler like the older Toxic cards used could keep these things' heat in check... I have a feeling it would take a 120mm rad, or even a dual 120mm rad, to do the job...


----------



## mdm-adph (Mar 31, 2010)

newtekie1 said:


> I'll explain it again, Fermi is the first single GPU that we have seen released that actually tops the previous generation's dual GPUs.  That is a huge feat, one that hasn't been done ever before.



Newtekie, I've said it before, but you deserve every single cent Nvidia is paying you.  You are able to effectively polish a turd like Fermi into something resembling a diamond better than anyone I've seen on the net, and statements like that just prove it.


----------



## digibucc (Mar 31, 2010)

newtekie1 said:


> Fermi wins overall compared to the HD4870x2.



what about a comparable card from nVidia's lineup last generation? idk, would that be a 295?

the 480 is not better all around than the 295, is it?


----------



## mdm-adph (Mar 31, 2010)

digibucc said:


> what about a comparable card from nVidia's lineup last generation? idk, would that be a 295?
> 
> the 480 is not better all around than the 295, is it?



Shhh -- he's in the zone.


----------



## newtekie1 (Mar 31, 2010)

digibucc said:


> what about a comparable card from nVidia's lineup last generation? idk, would that be a 295?
> 
> the 480 is not better all around than the 295, is it?



Actually, yes it is, performance-wise.  However, the GTX295 has better power draw and heat output than the GTX480.



AzureOfTheSky said:


> So newtekie1, the reports floating around the net of people with 1kW+ PSUs (quality units like PCP&C, Fortron, Silverstone, and TT Toughpower) blowing after they left them looping 3DMark/Furmark/Heaven while they, say, went to take a shower don't matter?
> 
> The thinking in each of these cases is that it's probably due to the card continuing to draw more and more power the longer it's under load. One review I found showed two 480's (the reviewer also tested SLI, but in a separate review) each drawing 340 watts and hitting 101°C after being left running 3DMark or Furmark or Heaven for an extended time. It wasn't hours, but the guy did leave it running a good while to simulate a real gaming session playing a stressful game in a common case (think it was an Antec 900 or something like that).
> 
> ...



I won't believe the power supply stories until I see some real sources backing it up.  I find it hard to believe that we've seen so many reviews doing the exact same thing, including W1z's, and not a single reviewer had an issue.  Not to mention we've seen cards in the past that have drawn even more power than the GTX480.

Yes, I've already gone over that compared to the HD5870 the GTX480 isn't as good power and heat wise, even the GTX470 isn't.  But again, that is because the HD5870 set an amazingly high bar.  Really, compared to past cards, the GTX480 isn't terrible, it isn't great or even really good, but it isn't terrible heat and power wise.

And I find it funny that you talk about ATi never releasing another hot and problematic card after the HD2900 series, because I seem to remember the HD4850's issues with heat, and Furmark killing the cards... but yeah, ATi tests everything really well before they release it...  Don't post BS; ATi isn't perfect like you seem to think they are.


----------



## mdm-adph (Mar 31, 2010)

newtekie1 said:


> Actually, yes it is performance wise.  However, the GTX295 has better power and heat output than the GTX480.



Oops!  Nope, not at the highest resolutions demanded by the most uber-elite enthusiasts among us, no.  I find this interesting, too, since doesn't the GTX295 technically have a smaller frame buffer?  (1792MB/2)


----------



## digibucc (Mar 31, 2010)

100-101 lol


----------



## mdm-adph (Mar 31, 2010)

digibucc said:


> 100-101 lol



Still not "all around better."


----------



## newtekie1 (Mar 31, 2010)

mdm-adph said:


> Oops!  Nope, not at the highest resolutions demanded by the most uber-elite enthusiasts among us, no.  I find this interesting, too, since doesn't the GTX295 technically have a smaller frame buffer?  (1792MB/2)
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/perfrel_2560.gif



Oops!  Yep, overall it is.






One resolution doesn't matter, overall is what matters. Next you'll be telling us we should pick whichever benchmark ATi did best in and use that as the final word on which card is better...


----------



## mdm-adph (Mar 31, 2010)

newtekie1 said:


> Oops!  Yep, overall it is.
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/perfrel.gif
> 
> One resolution doesn't matter, overall is what matters, next you'll be telling us we should pick whichever benchmark ATi did best in, and use that as the final word on which card is better...



Hey -- you said "all-around."  Obviously, if I'm gaming at 2560x1600, the GTX480 is *not* better than the previous generation.  You can't pick and choose your stats.

Next you'll be saying the GTX480 is best because green is a pretty color.


----------



## newtekie1 (Mar 31, 2010)

mdm-adph said:


> Hey -- you said "all-around."  Obviously, if I'm gaming at 2560x1600, the GTX480 is *not* better than the previous generation.  You can't pick and choose your stats.
> 
> Next you'll be saying the GTX480 is best because green is a pretty color.



You don't know what all-around better means do you?

All around means: considering all aspects.  Looking at one resolution isn't considering all aspects now, is it?

And you are the one trying to pick and choose stats here, when I use overall performance, I'M USING ALL THE STATS!  Guess what picking one resolution is...I'll tell you...it is picking and choosing your stats.
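The "all the stats" point can be made concrete with a toy average. The percentages below are invented for illustration, not review data; `overall` just averages relative performance across resolutions the way a summary chart does.

```python
# Toy illustration of why an "overall" ranking and a single-resolution
# ranking can disagree. All percentages are invented, not review data.

# Relative performance (GTX 480 = 100) at each tested resolution.
results = {
    "1280x1024": {"GTX 480": 100, "GTX 295": 90},
    "1920x1200": {"GTX 480": 100, "GTX 295": 95},
    "2560x1600": {"GTX 480": 100, "GTX 295": 104},
}

def overall(card):
    """Average a card's relative score across every resolution."""
    scores = [res[card] for res in results.values()]
    return sum(scores) / len(scores)

# GTX 295 leads at 2560x1600 yet still trails on the overall average.
print(round(overall("GTX 480"), 1), round(overall("GTX 295"), 1))
```

Which summary is "right" depends entirely on which resolutions you weight, which is exactly the disagreement in this thread.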


----------



## mdm-adph (Mar 31, 2010)

newtekie1 said:


> You don't know what all-around better means do you?
> 
> All around means: considering all aspects.  Looking at one resolution isn't considering all aspects now, is it?
> 
> And you are the one trying to pick and choose stats here, when I use overall performance, I'M USING ALL THE STATS!  Guess what picking one resolution is...I'll tell you...it is picking and choosing your stats.



Funny -- "all-around better," to me, means "better in every way."  And obviously it's not, not to mention considering the heat it'll probably be an RMA in a few months, anyway.

But, please, continue with the defense of Nvidia.  It's cute, now that the tides have turned.


----------



## Velvet Wafer (Mar 31, 2010)

newtekie1 said:


> You don't know what all-around better means do you?
> 
> All around means: considering all aspects.  Looking at one resolution isn't considering all aspects now, is it?


really newtekie... then we would also have to consider temperatures and wattage, and that wouldn't be all too good for Fermi.
Come on, Nvidia can't always win!
This time they fucked up, not ATI... ATI did that back in the old HD2900 days, for exactly the same reason: trying to keep up with the fastest cards available (which were the G80 series at the time) without thinking first.


----------



## cdawall (Mar 31, 2010)

newtekie1 said:


> You don't know what all-around better means do you?
> 
> All around means: considering all aspects.  Looking at one resolution isn't considering all aspects now it is?
> 
> And you are the one trying to pick and choose stats here, when I use overall performance, I'M USING ALL THE STATS!  Guess what picking one resolution is...I'll tell you...it is picking and choosing your stats.



Why not average the resolutions people actually use? Overall includes 1024x768 and 1280x1024, and neither of those is used by someone who has a GTX 480. Take out the useless benchmarks and the 480 starts to fall back. It should shine at high res, but it doesn't; hell, it can't even render better than the 4870X2, and it pulls more power and pushes more heat than a dualie card.


----------



## newtekie1 (Mar 31, 2010)

mdm-adph said:


> Funny -- "all-around better," to me, means "better in every way."  And obviously it's not, not to mention considering the heat it'll probably be an RMA in a few months, anyway.
> 
> But, please, continue with the defense of Nvidia.  It's cute, now that the tides have turned.



Re-read some of my posts, I'm hardly defending nVidia. In the shadow of RV870, Fermi doesn't look good at all.  I'm just putting it in perspective a little, because really I think Fermi is taking more flak than it deserves.



Velvet Wafer said:


> really newtekie... then we would also have to consider temperatures and wattage, and that wouldn't be all too good for Fermi.
> Come on, Nvidia can't always win!
> This time they fucked up, not ATI... ATI did that back in the old HD2900 days, for exactly the same reason: trying to keep up with the fastest cards available (which were the G80 series at the time) without thinking first.



We were discussing pure performance.  Yes, to decide what is the better card, every aspect needs to be addressed.  RV870 without a doubt is the better GPU when considering every aspect.



cdawall said:


> Why not average the resolutions people actually use? Overall includes 1024x768 and 1280x1024, and neither of those is used by someone who has a GTX 480. Take out the useless benchmarks and the 480 starts to fall back. It should shine at high res, but it doesn't; hell, it can't even render better than the 4870X2, and it pulls more power and pushes more heat than a dualie card.



You certainly can rule out some resolutions, but then that wouldn't be overall performance.  When buying a card, I'd personally look directly at the resolution I was using, and nothing else.  However, in a discussion, I'm going to use overall performance because people use the different resolutions.

And I'm willing to bet that the higher resolution issues will be worked out with drivers, and there are pretty obviously driver issues still involved with the initial driver release.


----------



## cdawall (Mar 31, 2010)

newtekie1 said:


> You certainly can rule out some resolutions, but then that wouldn't be overall performance.  When buying a card, I'd personally look directly at the resolution I was using, and nothing else.  However, in a discussion, I'm going to use overall performance because people use the different resolutions.
> 
> And I'm willing to bet that the higher resolution issues will be worked out with drivers, and there are pretty obviously driver issues still involved with the initial driver release.



And there are plenty of pretty obvious design issues with Fermi that will probably get worked out in a new card, but that's beside the point. This is what we have now, and it looks shitty in performance per watt and heat output, both of which hugely affect overall rig performance. An extra 300-400 watts of heat has to go somewhere in the case, and heat rises; that just so happens to be CPU land in most cases. Can your CPU cooler handle losing a good chunk of its cooling capacity, breathing hot air instead of cool air?
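For a rough sense of the numbers in this exchange, the sensible-heat relation ΔT = P / (ρ · V̇ · c_p) estimates how much a given wattage warms a given airflow. The wattages and CFM figures below are illustrative assumptions, not measurements of any card.

```python
# Rough estimate of how much waste heat warms case airflow, using the
# sensible-heat relation dT = P / (rho * Vdot * cp).
# Wattage and airflow values are illustrative assumptions only.

RHO = 1.2                  # air density, kg/m^3 (room temperature)
CP = 1005.0                # specific heat of air, J/(kg*K)
CFM_TO_M3S = 0.000471947   # 1 CFM in m^3/s

def exhaust_delta_t(watts, cfm):
    """Temperature rise (deg C) of air carrying `watts` of heat at `cfm` airflow."""
    return watts / (RHO * cfm * CFM_TO_M3S * CP)

for watts in (250, 320):      # hypothetical card heat outputs
    for cfm in (50, 100):     # hypothetical case airflow
        print(f"{watts} W into {cfm} CFM -> about +{exhaust_delta_t(watts, cfm):.1f} C")
```

Whether that warmed air is exhausted out the back or recirculates past the CPU cooler is exactly what's being argued here.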


----------



## eidairaman1 (Mar 31, 2010)

Velvet Wafer said:


> really newtekie... then we would also have to consider temperatures and wattage, and that wouldn't be all too good for Fermi.
> Come on, Nvidia can't always win!
> This time they fucked up, not ATI... ATI did that back in the old HD2900 days, for exactly the same reason: trying to keep up with the fastest cards available (which were the G80 series at the time) without thinking first.



And at that time ATI was going through a merger with AMD. What Nvidia is going through is what I'd like to know.


----------



## newtekie1 (Mar 31, 2010)

cdawall said:


> And there are plenty of pretty obvious design issues with Fermi that will probably get worked out in a new card, but that's beside the point. This is what we have now, and it looks shitty in performance per watt and heat output, both of which hugely affect overall rig performance. An extra 300-400 watts of heat has to go somewhere in the case, and heat rises; that just so happens to be CPU land in most cases. Can your CPU cooler handle losing a good chunk of its cooling capacity, breathing hot air instead of cool air?



Where are you getting that all the heat is dumped in the case?  We had this discussion in another thread, I don't have to worry about the heat from my HD4870x2 raising case temps all that much, so I'm not worried about this card either.  Case temps are specifically why I only buy graphics cards that vent the hot air out the back of the case.  That 300-400w of heat does go somewhere, out the back of the case...


----------



## Velvet Wafer (Mar 31, 2010)

eidairaman1 said:


> And at that time ATI was going through a merger with AMD. What Nvidia is going through is what I'd like to know.



IDK really... maybe severe money addiction?

Rename/republish the same product 3 times, and then not even get punished for it? Priceless. ;-)


----------



## cdawall (Mar 31, 2010)

newtekie1 said:


> Where are you getting that all the heat is dumped in the case?  We had this discussion in another thread, I don't have to worry about the heat from my HD4870x2 raising case temps all that much, so I'm not worried about this card either.  Case temps are specifically why I only buy graphics cards that vent the hot air out the back of the case.  That 300-400w of heat does go somewhere, out the back of the case...




I'll be the first to say case temps went up with that X2: 3-4C on CPU idle temps with the GPU loaded, and 4-6C with both on load. It doesn't all go out the back; a lot stays. That was using both stock coolers; with my V10 the difference was a bit less, 1-2C load and idle.


----------



## newtekie1 (Mar 31, 2010)

cdawall said:


> I'll be the first to say case temps went up with that X2: 3-4C on CPU idle temps with the GPU loaded, and 4-6C with both on load. It doesn't all go out the back; a lot stays. That was using both stock coolers; with my V10 the difference was a bit less, 1-2C load and idle.



Indeed, some of it stays in the case, but not the huge amount you tried to make it seem like.  1-2c load is hardly worth making a big deal about.


----------



## cdawall (Mar 31, 2010)

newtekie1 said:


> Indeed, some of it stays in the case, but not the huge amount you tried to make it seem like.  1-2c load is hardly worth making a big deal about.



You're right, 1-2C is not, but the 4-6C I got on the stock cooler at stock clocks using a 550BE@quad was a bit warm for me. Load was around 55C with the 4870X2, vs. an 8800GTS 512 or a 4650 GDDR3 LP.


----------



## imperialreign (Mar 31, 2010)

newtekie1 said:


> I'll explain it again, Fermi is the first single GPU that we have seen released that actually tops the previous generation's dual GPUs.




Well, considering we're already 6 months past the release of the HD5000 series, those are already *technically* the previous generation.  The HD6000 series has already been floating around in the rumor mill, and based on ATI's recent release schedule, we'll probably see it before the end of the year (or even before Q4).

Thing is, I don't see where Fermi's performance is really that respectable for a card that's 6 months late to the game and has been over-hyped.  I half expected Fermi to be more on par with Hemlock... or at least, that's what nVidia wanted everyone to believe.  As it stands, especially in the larger display-resolution arena, the card is really not much better than Cypress, even more so if we factor in the price these cards are going to enter at; I'm 100% certain that either right before or right after the 480's release, ATI will drop their prices by nearly 25% (also one of their current tactics).


----------



## Wile E (Apr 1, 2010)

cdawall said:


> Only complaint I have is that the 480 doesn't put out less heat or use less juice. From what I have seen it's smoking PSUs; we never saw that with the 4870X2... the 5870 is also 6 months old and beat the 4870X2 in almost everything, so how is it not in the same boat as Fermi?



No, 5870 did not beat 4870x2. Look at the reviews again. http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html 4870X2 is still slightly ahead in most things.

Why do you think I haven't bought a 5870 yet? I would only gain lower power usage (low on my considerations list), and DX11 (higher on my considerations list, but not high enough to essentially cross-grade performance wise).

And I take w1z's word on it over consumption, vs some random guys on forums making claims that ZOMG, FERMI KILLED MY PSU!!!!!!!!!

Don't get me wrong, I think the 5k series is absolutely amazing, it just doesn't fit my need for more performance in my price range. I would love to grab 5970 tho, but it is simply out of my price range. I think I'm gonna have to sit this one out until the next gen.


----------



## newtekie1 (Apr 1, 2010)

Wile E said:


> No, 5870 did not beat 4870x2. Look at the reviews again. http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html 4870X2 is still slightly ahead in most things.
> 
> Why do you think I haven't bought a 5870 yet? I would only gain lower power usage (low on my considerations list), and DX11 (higher on my considerations list, but not high enough to essentially cross-grade performance wise).
> 
> ...



Careful, talk like that and disagreeing with cdawall will likely lead to him labelling you an nVidia fanboy, and mdm-adph saying you are on nVidia's payroll...

Just like with G80/R600, I too will be sitting this round out, as it isn't enough of a gain to drop what I have now and spend several hundred dollars for next to no noticeable performance improvement.


----------



## TheMailMan78 (Apr 1, 2010)

Meh people call me a fanboy all the time. My wife especially.


----------



## cdawall (Apr 1, 2010)

Wile E said:


> No, 5870 did not beat 4870x2. Look at the reviews again. http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html 4870X2 is still slightly ahead in most things.
> 
> Why do you think I haven't bought a 5870 yet? I would only gain lower power usage (low on my considerations list), and DX11 (higher on my considerations list, but not high enough to essentially cross-grade performance wise).
> 
> ...



The 5870 clocks better, which should put it slightly ahead, but yes, you're right, it's a low-power cross-grade. I too have a 4870X2 and will sit this one out....



And newtekie, you're known to be pretty NV-flavored in your opinions, just like I'm AMD-flavored in mine.


----------



## mastrdrver (Apr 1, 2010)

I went from a 4870x2 to a 5870......greatest upgrade ever. Though, I spend a lot of time playing games, don't know about you guys.

Reasons I consider it an upgrade (just a few): it runs cooler; the fan only needs to run at 35% max during games, compared to ~60%, to keep temperatures down; no Crossfire profiles to be concerned about; better options for an aftermarket cooler if I ever decide to go that route.

Thanks to the 5870, my setup is pretty much silent even when playing games. Not something you think about much until you're playing BC2 and trying to hear the guy near you who just shot at you.


----------



## Wile E (Apr 1, 2010)

mastrdrver said:


> I went from a 4870x2 to a 5870......greatest upgrade ever. Though, I spend a lot of time playing games, don't know about you guys.
> 
> Reasons I consider it an upgrade (just a few): it runs cooler; the fan only needs to run at 35% max during games, compared to ~60%, to keep temperatures down; no Crossfire profiles to be concerned about; better options for an aftermarket cooler if I ever decide to go that route.
> 
> Thanks to the 5870, my setup is pretty much silent even when playing games. Not something you think about much until you're playing BC2 and trying to hear the guy near you who just shot at you.



My 4870x2 is watercooled, and even quieter than the 5870. And none of my titles so far suffer from Crossfire bugs. The 5870 just is not a worthy upgrade for someone like me.

Biggest thing on the list of considerations for me is performance. I will not upgrade unless my upgrade gives me more performance. 5870 does not accomplish that. Fermi does, but has too many negatives to worry about right now. Thus the reason many of us like minded people have decided to sit this one out. The only exception would be stumbling across a really killer deal.


----------



## TAViX (Apr 1, 2010)

I'm also thinking that the Nvidia cards are a BIG NO-NO for multi-display gaming. I mean, you cannot run more than 2 displays per card anyway, and that's at maximum power consumption and heat output, and you need SLI for 3 monitors, with the same power/heat problems.

On the other hand, ATI has everything from a cheap 5770 up to the 5970, plus Eyefinity 6 support for 6 monitors at default desktop clocks/temps. (Launched yesterday...)


----------



## imperialreign (Apr 1, 2010)

Wile E said:


> My 4870x2 is watercooled. Even more silent than the 5870. And none of my titles so far suffer from crossfire bugs. 5870 just is not a worthy upgrade for someone like me.
> 
> Biggest thing on the list of considerations for me is performance. I will not upgrade unless my upgrade gives me more performance. 5870 does not accomplish that. Fermi does, but has too many negatives to worry about right now. Thus the reason many of us like minded people have decided to sit this one out. The only exception would be stumbling across a really killer deal.





Semi-agreed... although I wouldn't be interested in Fermi even if it were the reincarnation of the Voodoo3 - I'm too ATI-loyal, I've been burned by the green camp a couple too many times in the past, and I'll not spend my money with them...

...although I'm not opposed to receiving free green-camp hardware, I just won't spend _my_ money on it



Regarding the 4870x2, 100% agreed.  Even though I could get by with one, I'm a stickler for consistent smooth frame rates, hence my doubling up.  My next upgrade will be to a form of hardware that 100% outperforms just one of my current cards . . . and 5970 is the only one capable of that ATM . . .

. . . and supply is still non-existent.


----------



## mdm-adph (Apr 1, 2010)

newtekie1 said:


> Careful, talk like that and disagreeing with cdawall will likely lead to him labelling you an nVidia fanboy, and mdm-adph saying you are on nVidia's payroll...



Keep that up, and I'll get MIT to run all your comments over the past few years through some sort of sophisticated algorithm and _prove_ you're on their payroll.


----------



## 20mmrain (Jun 2, 2010)

Wile E said:


> No, 5870 did not beat 4870x2. Look at the reviews again. http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html 4870X2 is still slightly ahead in most things.
> 
> Why do you think I haven't bought a 5870 yet? I would only gain lower power usage (low on my considerations list), and DX11 (higher on my considerations list, but not high enough to essentially cross-grade performance wise).
> 
> ...



On one of my rigs I actually upgraded from a 4870X2 to a 5870, and besides the cooler temps and DX11 it's much more overclockable.
I also got rid of any Crossfire micro-stutter created by the 4870X2 being a Crossfire setup on one card.

Plus, with that said, in real-world performance they are a lot closer than you think. As far as loss of FPS goes, in the real world I bet I lost about 5%, if that, which a 5870 can easily make up with an overclock.

Now that I am getting rid of my 5870 and going to 5850 Crossfire, I really have gained a lot.

For someone with a 4870X2, I can definitely see why it's not a priority to upgrade to a 5870 setup. But at the same time, if you did it, it wouldn't be a waste of time or money at all.
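The micro-stutter being described is a frame-pacing effect: an AFR dual-GPU setup can report a healthy average FPS while alternating short and long frame times. A minimal sketch, with invented frame times rather than captured data:

```python
# Toy illustration of AFR micro-stutter: average FPS looks fine,
# but alternating frame times make motion feel uneven.
# Frame times below are invented for illustration.

smooth   = [16.7] * 8        # steady ~60 FPS pacing
stuttery = [8.0, 25.4] * 4   # same average frame time, alternating

def avg_fps(frame_times_ms):
    """FPS implied by the mean frame time."""
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def worst_frame_fps(frame_times_ms):
    """FPS implied by the slowest frame -- what the hitches feel like."""
    return 1000.0 / max(frame_times_ms)

# Both sequences average the same FPS, but the stuttery one's slow
# frames sit well below it.
print(round(avg_fps(smooth), 1), round(worst_frame_fps(smooth), 1))
print(round(avg_fps(stuttery), 1), round(worst_frame_fps(stuttery), 1))
```

This is why moving from a dual-GPU card to a single GPU can feel smoother even when the benchmark average barely changes.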


----------



## Wile E (Jun 2, 2010)

20mmrain said:


> On one of my rigs I actually upgraded from a 4870X2 to a 5870, and besides the cooler temps and DX11 it's much more overclockable.
> I also got rid of any Crossfire micro-stutter created by the 4870X2 being a Crossfire setup on one card.
> 
> Plus, with that said, in real-world performance they are a lot closer than you think. As far as loss of FPS goes, in the real world I bet I lost about 5%, if that, which a 5870 can easily make up with an overclock.
> ...



I don't get the micro stutter issues on my rig.

Paying $400 for little to no performance gains is a waste of time and money by my standards.

Besides, if I did decide to go with something from this gen, it would probably be nV. If anything, it's time for a change for me.


----------



## DrPepper (Jun 2, 2010)

Wile E said:


> If anything, it's time for a change for me.



Never thought I'd see you obamafied.


----------



## 20mmrain (Jun 2, 2010)

Wile E said:


> I don't get the micro stutter issues on my rig.
> 
> Paying $400 for little to no performance gains is a waste of time and money by my standards.
> 
> Besides, if I did decide to go with something from this gen, it would probably be nV. If anything, it's time for a change for me.



LOL, I thought I was the only one considering doing that. <--- considering trading 2 x 5850's and a Corsair 750TX PSU for 2 x GTX 470's, just for a change of pace.

The only thing stopping me, though, is worrying that because these cards run so hot, they might only last a year at most if I need to keep them for a long time.


----------



## Wile E (Jun 2, 2010)

20mmrain said:


> LOL, I thought I was the only one considering doing that. <--- considering trading 2 x 5850's and a Corsair 750TX PSU for 2 x GTX 470's, just for a change of pace.
> 
> The only thing stopping me, though, is worrying that because these cards run so hot, they might only last a year at most if I need to keep them for a long time.



They don't run that hot. The heat issues were fixed in the retail cards. It was a fan issue. The 470 was never really as hot as the 480 anyway tho, so you should be in even better shape.


----------



## 20mmrain (Jun 2, 2010)

Wile E said:


> They don't run that hot. The heat issues were fixed in the retail cards. It was a fan issue. The 470 was never really as hot as the 480 anyway tho, so you should be in even better shape.



I was just reading some very recent reviews from this month showing a GTX 470 running around 80C (which isn't that bad) to 94C (which is horrible).

Coming from a 5870 that hit a max temp of 70C on auto fan (while playing Crysis and other high-end games), those temps are pretty high.

Considering that a 5870 is more powerful than a GTX 470, you would think the GTX 470 would be the cooler one.

Not to mention my 5850's never run that hot.

But no pain, no gain... and this problem still hasn't deterred me from thinking about the decision. Ahhh, choices, choices.....

I have also been reading up a lot on PhysX lately, and I didn't realize I could be missing a fair amount in games without it. That's the other factor making me wonder if I should go for it.

But if you say they are not running that hot anymore... maybe I will try to find some reviews talking about that aspect. I'll check it out.... If they got those temps down to around 70C on auto fan, I would definitely check out a GTX 470 SLI setup.


----------



## lyndonguitar (Jun 2, 2010)

I heard some manufacturer also left Nvidia a few weeks ago, went over to ATI, and built a COOL card to get "revenge". May I ask which one it is?


----------



## phanbuey (Jun 2, 2010)

20mmrain said:


> I was just reading some very recent reviews from this month showing a GTX 470 running around 80C (which isn't that bad) to 94C (which is horrible).
> 
> Coming from a 5870 that hit a max temp of 70C on auto fan (while playing Crysis and other high-end games), those temps are pretty high.
> 
> ...




A fix of the thermal paste on the 470 would prolly bring those more in line.


----------



## 20mmrain (Jun 2, 2010)

phanbuey said:


> A fix of the thermal paste on the 470 would prolly bring those more in line.



Probably a great point


----------

