# Jen-Hsun Huang (NVIDIA): ''We Underestimated RV770''



## btarunr (Aug 15, 2008)

NVIDIA suffered its first red quarter in five years. There are several contributors, chief among them a write-off of up to US $200 million to cover the cost of recalling and repairing faulty mobile graphics processors.

Another factor is a replenished product lineup from competitor AMD/ATI that takes on NVIDIA across the mid-range, high-end, and enthusiast segments. In essence, ATI now has a product to counter NVIDIA at every possible segment, with more on the way.

Seeking Alpha spoke with CEO Jen-Hsun Huang, who was quoted as saying:



> We underestimated the price performance of our competitor's most recent GPU, which led us to mis-position our fall lineup. The first step of our response was to reset our price to reflect competitive realities. Our action put us again in a strong competitive position but we took hard hits with respect to our overall GPU ASPs and ultimately to our gross margins. The price action was particularly difficult since we are just ramping 55-nanometer and the weak market resulted in taking longer than expected to work through our 65-nanometer inventory.



Huang says that with their transition to the 55 nm silicon fabrication process, they hope to do better.

*View at TechPowerUp Main Site*


----------



## wolf2009 (Aug 15, 2008)

Now they will come back stronger; I hope they learned their lesson.

But they have fallen behind in the technology race: ATI was first to 55 nm and GDDR5.

By the time NVIDIA gets there, ATI will have moved on to 40 nm, leaving NVIDIA with power-hungry 55 nm parts against ATI's more power-efficient 40 nm designs.


----------



## btarunr (Aug 15, 2008)

Two ways: rush in products slated for much later, shrink everything there is to 55 nm, and raise clock speeds. The 9800 GTX+ met stiff resistance from non-reference HD 4850s that came in by the dozen, strategically timed.


----------



## candle_86 (Aug 15, 2008)

Nvidia will have a response; they always do. And GDDR5 is pure marketing: a 4850 with its core running at the same speed as the 4870 is freakishly close to it in scores. At that point, GDDR5 has too much latency to be useful. GDDR3 is still more efficient until they fix GDDR5's high latency. I have a good idea, though: Nvidia will launch an answer to the HD 4870 X2, most likely a dual-GPU G200B-based card.


----------



## johnnyfiive (Aug 15, 2008)

NVidia WILL come back strong like they always do. ATi played the market perfectly.


----------



## WarEagleAU (Aug 15, 2008)

Yes, it's about time they took some kind of blow to wake them up. I'm glad to see ATI do so well; I even underestimated how good the HD 2xxx, 3xxx, and 4xxx series were. I hope ATI continues to turn the screws on price/performance.


----------



## selway89 (Aug 15, 2008)

WarEagleAU said:


> Yes, it's about time they took some kind of blow to wake them up. I'm glad to see ATI do so well; I even underestimated how good the HD 2xxx, 3xxx, and 4xxx series were. I hope ATI continues to turn the screws on price/performance.



I agree, and I am still very happy with my 2900 XT; even though it got slated, it's still a good card. I wonder if Intel's new GPU will kick Nvidia where it hurts just as ATI has?!


----------



## [I.R.A]_FBi (Aug 15, 2008)

candle_86 said:


> *Nvidia will have a response; they always do. And GDDR5 is pure marketing*: a 4850 with its core running at the same speed as the 4870 is freakishly close to it in scores. At that point, GDDR5 has too much latency to be useful. GDDR3 is still more efficient until they fix GDDR5's high latency. I have a good idea, though: Nvidia will launch an answer to the HD 4870 X2, most likely a dual-GPU G200B-based card.



Do you honestly believe what you typed?


----------



## HousERaT (Aug 15, 2008)

[I.R.A]_FBi said:


> Do you honestly believe what you typed?


You know what they say: "fantasy can be fun."

Most tests show at least an 8% difference between the 4850 and 4870 clock for clock. I don't know where that latency issue came from.


----------



## mlupple (Aug 15, 2008)

It takes some balls to say that.  I actually like NVidia now for not being pussies about it. ATI vs NVid = FTW!


----------



## INSTG8R (Aug 15, 2008)

Well, being a long-time "Red Team" man (I bought an FX 5200, owned it for about half an hour, and never looked back), in the past year I was firmly convinced my next upgrade would be from the other side. The 4xxx series restored my faith, and I hope it continues to do so. I'm not for either side; my laptop has an 8400M in it and I'm very happy with it.
It's just nice to see ATI finally pull it out and get back to being competitive.


----------



## CDdude55 (Aug 15, 2008)

Nvidia will come back with something. This always happens: ATI comes out with something powerful and people say ''OMG!! Teh Nvidia is teh died!!1 Ati fuor teh win!11'', and then Nvidia comes out with something great, and vice versa with the "OMG, *insert ATI or Nvidia here* is dead."

At least they admitted that they underestimated ATI, and they're working on their prices and their GPUs to bring some good competition to the market.


----------



## R_1 (Aug 15, 2008)

HousERaT said:


> ... I don't know where that latency issue came from.


Some NVIDIA marketing guy came up with the idea.
I'm wondering whether anyone has managed to clock GDDR5 so high that it stops yielding additional gaming performance. That may be above 5000 MHz effective.


----------



## Megasty (Aug 15, 2008)

Bah, so he finally admits the obvious. I think they should have stuck to working together. Cheers to fast-ass cards @ great prices.

Just remember that hype is stupidity and results are everything.


----------



## newconroer (Aug 16, 2008)

Meh, Nvidia isn't in as much trouble as people think, and ATi isn't doing as well as people think.

The problem is that we'd forgotten for a while what some competition felt like, that's all.

Nvidia could be blamed for resting on their laurels, but what good would it have served to put out newer-architecture GPUs while their competitor was struggling? Also note what Huang said: "weak market."

We should also never forget that ATi ditched their whole 'power/heat/efficiency' forte in order to catch up to Nvidia in performance, and when they did, Nvidia ironically was the one with the better 'power/heat/efficiency' ratings while still maintaining performance.


Personally, I find the X2 just a publicity stunt; for all its horsepower it's not impressive, and it looks designed to rope people into being ATi customers.

However, the 4850 and 4870 are respectable cards, if nothing else for how much more effective they are than their most recent predecessors.


Either way, both Nvidia and ATi have hit a bit of a roadblock, putting out more and more cards that are little more than 'souped-up' versions of the previous ones, with no real architectural changes.

Looking at the 'rumored' specs of each camp's next offering, it doesn't seem like that will change either.



I'm not too sure why Huang even bothered with the statement, but it did raise one question in my mind.. when was the last time ATi made a similar public statement?


----------



## Tatty_One (Aug 16, 2008)

wolf2009 said:


> Now they will come back stronger; I hope they learned their lesson.
> 
> But they have fallen behind in the technology race: ATI was first to 55 nm and GDDR5.
> 
> By the time NVIDIA gets there, ATI will have moved on to 40 nm, leaving NVIDIA with power-hungry 55 nm parts against ATI's more power-efficient 40 nm designs.



I see where you are coming from, but I'm not so sure about that... even at 65 nm versus 55 nm they still have the fastest single-GPU solution, one that surprisingly produces less heat than its 55 nm competitor. So TBH, in my opinion, production costs aside, performance rather than fabrication process will remain the overriding factor; even if ATi hits the shelves first with a 40 or 45 nm GPU, it won't matter much if the 55 nm opposition is quicker.


----------



## Weer (Aug 16, 2008)

[I.R.A]_FBi said:


> Do you honestly believe what you typed?



Technically speaking, GDDR5 is a waste of money when you can get the same bandwidth with a higher bus width.


----------



## cdawall (Aug 16, 2008)

Weer said:


> Technically speaking, GDDR5 is a waste of money when you can get the same bandwidth with a higher bus width.



Wrong, it's not a waste:

for one, with the same bus width, GDDR5 > GDDR3/4 in bandwidth;
for two, in power usage, GDDR5 < GDDR3/4.

The only place GDDR3 leads is latency, which stops mattering once you jump up the clock speed, which GDDR5 does. If you wanted the lowest latencies, you'd get DDR1!
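The bandwidth arithmetic behind this argument can be sketched in a few lines. The figures below are approximations, not vendor specs: a 256-bit bus with GDDR5 at ~900 MHz base clock (roughly the HD 4870's published number) versus a hypothetical GDDR3 part at ~1000 MHz on the same bus. Peak bandwidth is just bus width in bytes times the effective transfer rate:

```python
# Peak theoretical memory bandwidth = (bus width in bytes) * (transfers per second).
# GDDR3 is double data rate (2 transfers/clock); GDDR5 is quad data rate (4/clock).

def bandwidth_gbs(bus_bits: int, base_hz: float, transfers_per_clock: int) -> float:
    """Peak bandwidth in GB/s for a given bus width and memory base clock."""
    return (bus_bits / 8) * base_hz * transfers_per_clock / 1e9

# 256-bit bus, GDDR5 at ~900 MHz base clock (roughly the HD 4870's figure).
gddr5 = bandwidth_gbs(256, 900e6, 4)    # -> 115.2 GB/s

# Same 256-bit bus with hypothetical GDDR3 at ~1000 MHz.
gddr3 = bandwidth_gbs(256, 1000e6, 2)   # -> 64.0 GB/s

# Bus width GDDR3 would need to match the GDDR5 part: ~461 bits,
# i.e. a 512-bit bus in practice -- the expensive route.
needed_bits = 256 * gddr5 / gddr3

print(f"GDDR5 on 256-bit: {gddr5:.1f} GB/s")
print(f"GDDR3 on 256-bit: {gddr3:.1f} GB/s")
print(f"GDDR3 bus width needed to match: ~{needed_bits:.0f} bits")
```

Whether the wider bus or the faster memory is cheaper is a board-design question, but the raw numbers show why the two camps' choices (a 512-bit GDDR3 bus versus a 256-bit GDDR5 bus) land in the same bandwidth ballpark.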


----------



## $ReaPeR$ (Aug 16, 2008)

All I know is that Nvidia was robbing us for years. Does anyone here like being robbed? The point is that Nvidia wins the performance war because most games are built for their drivers and GPUs, isn't that right? I say that everything is PR, and Nvidia has been on top for a long time without doing any real work. Taking my previous point into account, ATI GPUs must be twice as fast to deliver the same real-world performance as Nvidia's, all because of the games' software. So this war isn't about hardware but software. Stop being such fanboys; we are the goddamn customers, WE PAY, so we should worry about the best product at the best price and not about their goddamn war, unless you are some rich boys with nothing better to do all day than compare who has the biggest GPU. I DO RESPECT THIS FORUM BECAUSE IT IS ONE OF THE BEST I HAVE EVER SEEN, but SOME people should try to think like everyday people and not like tech maniacs and spoilt brats. I don't mean to offend anyone. Thank you for being able to post here.


----------



## REVHEAD (Aug 16, 2008)

> I'm not too sure why Huang even bothered with the statement, but it did raise one question in my mind.. when was the last time ATi made a similar public statement?




For the sake of their shareholders, that's why.

This is only going to get worse for them, what with them finally giving up on their crappy motherboard business we all knew was a fail: overpriced, underperforming boards that you couldn't OC without data corruption. Their driver support is some of the worst in the business, using drivers to push the consumer toward newer products that didn't work correctly in the first place.

I don't like Nvidia, but we need them, because as consumers we are the winners.

Nvidia can suck my balls!!


----------



## tkpenalty (Aug 16, 2008)

candle_86 said:


> Nvidia will have a response; they always do. And GDDR5 is pure marketing: a 4850 with its core running at the same speed as the 4870 is freakishly close to it in scores. At that point, GDDR5 has too much latency to be useful. GDDR3 is still more efficient until they fix GDDR5's high latency. I have a good idea, though: Nvidia will launch an answer to the HD 4870 X2, most likely a dual-GPU G200B-based card.



A 20% performance difference is marketing? Excuse me, cut the biased views already.

Nvidia, even though they still have more money, is seeing their profits go downhill. It's not hard to see why, when their graphics business isn't faring well against AMD's lineup and their chipsets are rather unpopular with both OEMs and consumers.


----------



## candle_86 (Aug 16, 2008)

I am not a fanboi; I am making a statement which is true. Clock for clock, the HD 4850 and 4870 are close; the 4870 is slightly faster, but not by much, 8% at best as noted, and that's above 1680x1050. Below that they are rather close, because of the higher latency associated with GDDR5. This is not my statement; it was made in a news post here on TPU a month or so back. Call me what you want, but I'm stating facts. The only reason the 4870 is faster is that its core and memory are clocked higher than the 4850's.


----------



## Siman0 (Aug 16, 2008)

ROFL, OK, come on guys, Nvidia was riding on the 8000 series; hell, the 9000 series is a slightly modified 8000 core. Also, honestly, who didn't see this coming when DAAMIT released the 3000 series, or even the 2000 series for that matter? A few hints for the guys who say GDDR5 is not as good as GDDR3. First hint: why is the GDDR3 4850 not as good as the GDDR5 4870? Hmm, maybe the memory bandwidth had something to do with it; I don't know, could just be me. Second hint: if it's a marketing ploy, why is Nvidia frantically trying to throw it onto their next graphics core, the 300 series? Third hint: yes, the bus width paired with GDDR3 is much higher than that of the GDDR5 cards, but then why is the 200 series still getting its rear end handed to it by 256-bit GDDR5? Fourth and final hint: DDR3 is now coming along for CPUs and DDR5 is coming to GPUs, and the speed of DDR3 RAM modules is much greater than that of DDR2. Times change, MOVE ON. OK, with that said: yes, you can get the same bandwidth if you widen the bus, but the wider the bus, the more it costs, and unless you want to pay $800 for one card... GDDR5 is a cheaper solution than constantly making the bus width bigger.


----------



## candle_86 (Aug 16, 2008)

The 4870 is slower than the 280; the 4870's competition is the 260. It took the 4870 X2 for AMD to take the lead.


----------



## PCpraiser100 (Aug 16, 2008)

HA! Another way of persuading customers to come back to the now-crippled video card company! I love it! Good job, Nvidia, but it's not worth it. BTW, I hate Crysis anyway, so ciao...

BTW Siman0, GDDR5 has so much more bandwidth that Nvidia is planning to put this memory in their GTX 300 series. And since new technologies carry steep prices at first, ATI will have GDDR5-powered cards in the $100-$200 price range in no time!


----------



## Nyte (Aug 16, 2008)

candle_86 said:


> The 4870 is slower than the 280; the 4870's competition is the 260. It took the 4870 X2 for AMD to take the lead.



The 4870 is slower than the 280 in only *50%* of the tested games.

I could reverse that logic and say the 280 is slower than the 4870, then, OK?


----------



## Nyte (Aug 16, 2008)

Weer said:


> Technically speaking, GDDR5 is a waste of money when you can get the same bandwidth with a higher bus width.



A higher bus width is considerably more complex in area, cost, and design than just buying GDDR5 chips, which cost almost the same as GDDR3 chips.

Trust me.


----------



## Scrizz (Aug 16, 2008)

GDDR5 makes a huge difference; wish I had it.


----------



## KainXS (Aug 16, 2008)

I have to agree with tk, the difference between the GTX280 and 4870 is 20% tops

maybe not even that high


----------



## Megasty (Aug 16, 2008)

KainXS said:


> I have to agree with tk, the difference between the GTX280 and 4870 is 20% tops
> 
> maybe not even that high



Even in Crysis it's really close... This comes from my good buddy, who managed to blow one of his 3 GTX 280s.

http://www.youtube.com/watch?v=DYnXxI1UjxE

I'd say he did a great job editing the 4 videos together.


----------



## captainskyhawk (Aug 16, 2008)

candle_86 said:


> The 4870 is slower than the 280; the 4870's competition is the 260. It took the 4870 X2 for AMD to take the lead.



I'd say that considering the 4870 actually _bests_ the 280 in 50% of games up to 1920, it's hardly slower. I'd say they are pretty much neck and neck, with the 280 having some benefit at even higher resolutions.


----------



## btarunr (Aug 16, 2008)

candle_86 said:


> I am not a fanboi



The Sun rises in Sunnyvale, sets in Tokyo.


----------



## knowledge123 (Aug 16, 2008)

I would hope that this puts them in a more humble and reflective mood; after their 'we're going to open a can of whoop-ass' comments and smack talk, I am glad that ATi have lived up to and beyond what nVidia thought they were capable of.
Let us hope that they won't be as stuck-up about themselves. nVidia are a good company, but I feel that in recent times they have grown too big for their boots, and I'm very happy that ATi have put them back in line.


----------



## Wile E (Aug 16, 2008)

candle_86 said:


> I am not a fanboi; I am making a statement which is true. Clock for clock, the HD 4850 and 4870 are close; the 4870 is slightly faster, but not by much, 8% at best as noted, and that's above 1680x1050. Below that they are rather close, because of the higher latency associated with GDDR5. This is not my statement; it was made in a news post here on TPU a month or so back. Call me what you want, but I'm stating facts. The only reason the 4870 is faster is that its core and memory are clocked higher than the 4850's.


You also realize that the 4870's native RAM clock is actually 100 MHz lower than the 50's, right? The 70s check in at 900 MHz, the 50s check in at 999 MHz. Bump that GDDR5 to 999 MHz and see what performance differences you come up with.
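The apparent paradox in those numbers (a lower base clock on the 4870, yet far more bandwidth) comes down to transfers per clock: GDDR3 moves data twice per memory clock, GDDR5 four times. A quick sketch using the approximate clock figures quoted above:

```python
# Effective transfer rate = base clock * transfers per clock.
# GDDR3: double data rate (2x); GDDR5: quad data rate (4x).

def effective_mts(base_mhz: float, transfers_per_clock: int) -> float:
    """Effective memory transfer rate in MT/s."""
    return base_mhz * transfers_per_clock

hd4870_gddr5 = effective_mts(900, 4)  # 900 MHz base -> 3600 MT/s effective
hd4850_gddr3 = effective_mts(999, 2)  # 999 MHz base -> 1998 MT/s effective

# Despite the ~100 MHz lower base clock, the GDDR5 card moves ~1.8x the data
# per pin -- which is why comparing raw clock numbers is misleading.
print(hd4870_gddr5, hd4850_gddr3, round(hd4870_gddr5 / hd4850_gddr3, 2))
```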


Weer said:


> Technically speaking, GDDR5 is a waste of money when you can get the same bandwidth with a higher bus width.


Do you realize that it is more expensive to increase the bit width of the bus than it is to use GDDR5?


----------



## blueskynis (Aug 16, 2008)

candle_86 said:


> Nvidia will have a response; they always do. And GDDR5 is pure marketing: a 4850 with its core running at the same speed as the 4870 is freakishly close to it in scores. At that point, GDDR5 has too much latency to be useful. GDDR3 is still more efficient until they fix GDDR5's high latency. I have a good idea, though: Nvidia will launch an answer to the HD 4870 X2, most likely a dual-GPU G200B-based card.



High latency does not impact GPU performance the way it impacts CPUs.


----------



## candle_86 (Aug 16, 2008)

knowledge123 said:


> I would hope that this puts them in a more humble and reflective mood; after their 'we're going to open a can of whoop-ass' comments and smack talk, I am glad that ATi have lived up to and beyond what nVidia thought they were capable of.
> Let us hope that they won't be as stuck-up about themselves. nVidia are a good company, but I feel that in recent times they have grown too big for their boots, and I'm very happy that ATi have put them back in line.



If you remember anything, you'd know Nvidia keeps ATI in check, not the other way around. This would be the second time ATI has had a decent card out, and the last time that happened, Nvidia fought back hard.

The FX 5800 was a joke, but the FX 5900 Ultra was actually faster in the games of its day than the 9800 Pro; it wasn't until the next-gen GPUs arrived that the FX 5900 Ultra lost its place. The 6800 Ultra was tied with the X800 XT, and the 6800 GT soundly trumped the X800 Pro. When ATI got cocky with the X850 XT, a few months later came the 7800 GTX; ATI got cocky again, so the 7800 GTX 512; ATI tried again, 7900 GTX; and ATI tried yet again, so we got the 7950 GX2. ATI offered little to no threat to Nvidia through 2007 and half of 2008. Nvidia doesn't like to lose; all ATI did was wake a sleeping beast.


----------



## Wile E (Aug 16, 2008)

candle_86 said:


> If you remember anything, you'd know Nvidia keeps ATI in check, not the other way around. This would be the second time ATI has had a decent card out, and the last time that happened, Nvidia fought back hard.
> 
> The FX 5800 was a joke, but the FX 5900 Ultra was actually faster in the games of its day than the 9800 Pro; it wasn't until the next-gen GPUs arrived that the FX 5900 Ultra lost its place. The 6800 Ultra was tied with the X800 XT, and the 6800 GT soundly trumped the X800 Pro. When ATI got cocky with the X850 XT, a few months later came the 7800 GTX; ATI got cocky again, so the 7800 GTX 512; ATI tried again, 7900 GTX; and ATI tried yet again, so we got the 7950 GX2. ATI offered little to no threat to Nvidia through 2007 and half of 2008. Nvidia doesn't like to lose; all ATI did was wake a sleeping beast.



ATI didn't get cocky. How in hell did you come to that conclusion? First, the 5900 Ultra didn't beat the 9800 Pro, and even if it did happen to match it in some games, ATI still had the 9800 XT.

The 7800 GTX might have been the answer to the X850, but ATI answered right back with the X1800. Then nVidia released the 7900, and ATI threw the X1900 in their face. I can't remember whether the 7950 or the X1950 came out first, but it doesn't matter; ATI either matched or beat nVidia in every price segment at that time. They just never put out an answer to the GX2. They didn't have to: the driver support for it was so terrible, it died before it could ever take off.

Your fanboyism has blinded you to what ACTUALLY happened. NV didn't take a solid lead until they released the 8800, which ATI left unanswered for much too long.


----------



## newconroer (Aug 16, 2008)

PCpraiser100 said:


> HA! Another way of persuading customers to come back to the now-crippled video card company! I love it! Good job, Nvidia, but it's not worth it. BTW, I hate Crysis anyway, so ciao...
> 
> BTW Siman0, GDDR5 has so much more bandwidth that Nvidia is planning to put this memory in their GTX 300 series. And since new technologies carry steep prices at first, ATI will have GDDR5-powered cards in the $100-$200 price range in no time!




If you 'hate' Crysis, then you should be laughing at the absurdity that is the 4870 X2 (or a GTX 280, for that matter).





We don't have any proof of Nvidia using GDDR5, even if they said they might down the road. PROOF is when it's in your system and WORKING.



blueskynis said:


> High latency does not impact the performance of GPUs like it does impact CPUs.



Lower latency should, in theory, do wonders for system performance, but unfortunately it doesn't, and it probably matters less than we'd like to think with GPUs too; still, I find it's usually the other way around.
I would continue to take lower-latency memory over higher frequency, definitely on a graphics card. Random-access seek time is far more important to me than bandwidth, especially if the bus is twice the size as well.

GDDR3 has proven itself to be very good memory for GPUs; GDDR5 has proven itself to be acceptable memory for GPUs. Those are two entirely different things.


----------



## [I.R.A]_FBi (Aug 16, 2008)

Candle, if GDDR5 is just marketing, how much are you willing to bet that Nvidia will move off GDDR3 soon?


----------



## qwerty_lesh (Aug 16, 2008)

Although I agree with most of what's said here, I don't feel that ATI's 2xxx series was all that good. From what I've seen, they really didn't make much of a comeback until they released the 3xxx series; where I work, Nvidia was still popular for price and performance until the 3xxx series got released. And yeah, the 4xxx series rocks: the cards perform very well and are cheaper than their current Nvidia counterparts.


----------



## Tatty_One (Aug 16, 2008)

I really don't know why some get so antagonised about this graphics card war. It's pretty simple really: one side comes out on top sometimes, the other side other times... that's got to be good, right? Good for us consumers at least. So now we find ourselves in a position where ATi has the fastest single-card solution and NVidia the second fastest, ATi just about the third with NVidia just about the fourth, so what... but to say that just because the "other side" to your preference brings a technology to the table that pushes the boundaries of performance (AKA GDDR5), it's pointless, is, TBH, fanboyism of the highest order... and that's coming from a fanboi!

Enjoy the technology, enjoy the breakthroughs, and enjoy the competitive pricing that brings, because no doubt, down the line the "other side" will edge in front... and when they do, things get even cheaper!


----------



## TheMailMan78 (Aug 16, 2008)

Ya know, you guys are arguing over nothing. He didn't say ATI anywhere! I think he's talking about Intel's IGP.

Anyway, ATI has won this match; there's no arguing that by any logic. However, this is a war, not a battle, so to both teams: "ALL MEN FORWARD!"


----------



## btarunr (Aug 16, 2008)

TheMailMan78 said:


> Ya know you guys are arguing for nothing. He didnt say ATI anywhere! I think hes talking about Intels IGPU



He said GPU, not IGP. Intel doesn't make discrete GPUs now. Obviously he wasn't referring to S3 Graphics.



> We underestimated the price performance of our competitor’s most recent GPU, which led us to mis-position our fall lineup.


----------



## TheMailMan78 (Aug 16, 2008)

btarunr said:


> He said GPU, not IGP. Intel doesn't make discrete GPUs now. Obviously he wasn't referring to S3 Graphics.



You're right. I was just joking anyway. I think it's pretty clear what he was saying.


----------



## erocker (Aug 16, 2008)

Tatty_One said:


> I really don't know why some get so antagonised about this graphics card war. It's pretty simple really: one side comes out on top sometimes, the other side other times... that's got to be good, right? Good for us consumers at least. So now we find ourselves in a position where ATi has the fastest single-card solution and NVidia the second fastest, ATi just about the third with NVidia just about the fourth, so what... but to say that just because the "other side" to your preference brings a technology to the table that pushes the boundaries of performance (AKA GDDR5), it's pointless, is, TBH, fanboyism of the highest order... and that's coming from a fanboi!
> 
> Enjoy the technology, enjoy the breakthroughs, and enjoy the competitive pricing that brings, because no doubt, down the line the "other side" will edge in front... and when they do, things get even cheaper!



Exactly. People argue for the sake of arguing, thinking their view is more important or more correct than the other's, for the sake of boosting their self-esteem. I like Huang's statement: it's honest, and that's not something you see very often.


----------



## Nyte (Aug 16, 2008)

We should make non-constructive arguments a bannable offense; that would silence a lot of fanboys, in my opinion.


----------



## Viscarious (Aug 16, 2008)

I can't stop laughing at all of you people!

Buy what you want and shut up. Play your games and be happy. If you want to play the latest games, then buy the best card from your favorite company, but cut out all the bashing and biased remarks. You're not doing anything but raising your blood pressure and starting useless arguments.

Megasty's sig says everything anyone should really give a damn about: "gods are created through gaming not 3dmark..." Who gives a crap about your 22,000 3DMark score when you go 3 and 22 in Call of Duty 4?

I know this won't stop anything, so I'll come back to get another good laugh in.


----------



## cdawall (Aug 16, 2008)

I got 22 and 3 with a 6200TC, hehe.


----------



## Viscarious (Aug 16, 2008)

cdawall said:


> I got 22 and 3 with a 6200TC, hehe.



Exactly. Ride that card hard till it dies!


----------



## Megasty (Aug 16, 2008)

I really don't get the fanboyism. These are graphics cards, not baseball teams. A card's performance is set in stone right off the production line, with only a variable range of operation due to binning and driver issues. Huang's statement fits, given they've been the graphics leaders for the last two years. They got too complacent and it cost them, but it's not the end of the world. NV will always be NV.


----------



## CDdude55 (Aug 16, 2008)

I would put a GTX 280 with my Core 2 Duo E4400 at stock 2.0 GHz.

I know, I'm a madman.


----------



## zithe (Aug 16, 2008)

Siman0 said:


> ROFL, OK, come on guys, Nvidia was riding on the 8000 series; hell, the 9000 series is a slightly modified 8000 core. Also, honestly, who didn't see this coming when DAAMIT released the 3000 series, or even the 2000 series for that matter? A few hints for the guys who say GDDR5 is not as good as GDDR3. First hint: why is the GDDR3 4850 not as good as the GDDR5 4870? Hmm, maybe the memory bandwidth had something to do with it; I don't know, could just be me. Second hint: if it's a marketing ploy, why is Nvidia frantically trying to throw it onto their next graphics core, the 300 series? Third hint: yes, the bus width paired with GDDR3 is much higher than that of the GDDR5 cards, but then why is the 200 series still getting its rear end handed to it by 256-bit GDDR5? Fourth and final hint: DDR3 is now coming along for CPUs and DDR5 is coming to GPUs, and the speed of DDR3 RAM modules is much greater than that of DDR2. Times change, MOVE ON. OK, with that said: yes, you can get the same bandwidth if you widen the bus, but the wider the bus, the more it costs, and unless you want to pay $800 for one card... GDDR5 is a cheaper solution than constantly making the bus width bigger.



DDR5 doesn't exist. GDDR5 does.


----------



## Hayder_Master (Aug 17, 2008)

I think this is desperate talk; we need real work.


----------



## hv43082 (Aug 17, 2008)

Hah the constant cat and mouse game.


----------



## Wile E (Aug 17, 2008)

Viscarious said:


> I can't stop laughing at all of you people!
> 
> Buy what you want and shut up. Play your games and be happy. If you want to play the latest games, then buy the best card from your favorite company, but cut out all the bashing and biased remarks. You're not doing anything but raising your blood pressure and starting useless arguments.
> 
> ...



Meh, I'd rather benchmark. lol.


----------



## AsRock (Aug 17, 2008)

Maybe if game companies got off their asses and supported ATI's design like BioShock did, ATI would be in a much better place now.

Would it not be harder to support ATI's (shader) design than nV's? Game companies, from what I have seen, are not up to the task due to costs and time.


----------



## btarunr (Aug 17, 2008)

Viscarious said:


> I can't stop laughing at all of you people!
> 
> Buy what you want and shut up. Play your games and be happy. If you want to play the latest games, then buy the best card from your favorite company, but cut out all the bashing and biased remarks. You're not doing anything but raising your blood pressure and starting useless arguments.
> 
> ...



Just as you have gaming competitions, you have OC competitions too (and they are sometimes equally rewarding). So it does sometimes matter what 3DMark score you have. For overclockers, their favourite game is 3DMark


----------



## Tatty_One (Aug 17, 2008)

AsRock said:


> Maybe if game company's got of there asses and supported ATI's design like BIOSHOCK did, ATI would be in a much better place now.
> 
> Would it not be harder to support ATI's (shaders)design than nV's ?. Game companys from what i have seen are not upto the task due to costs and time.



Yes, perhaps, but some could argue that NVidia's design is better inasmuch as they can match or beat ATi's single-GPU performance with less than a third of the shaders... IDK. Lots of people understandably say that ATi are at a disadvantage when so many game developers support NVidia and its architecture... they are probably right, but hey... I am sure ATi would do the same if they had both the market share and the money/marketing strategy to do it. It is naive to believe, IMO, that NVidia is bad for doing so when ATi would probably like to jump on the same bandwagon, I am sure.


----------



## Widjaja (Aug 17, 2008)

Hmm.
Nvidia underestimated the RV770, so they let NGO have the PhysX SDK so it can be implemented in the AMD drivers, in the hope that more game developers will be inclined to use PhysX instead of Havok.
More game devs using PhysX = money for nVidia.
No game devs using Havok = AMD licensing Havok from Intel becomes a huge waste of money.

Dirty tactics.


----------



## PCpraiser100 (Aug 17, 2008)

btarunr said:


> Just as you have gaming competitions, you have OC competitions too (and they are sometimes equally rewarding). So it does sometimes matter what 3DMark score you have. For overclockers, their favourite game is 3DMark



He's right. I went to a pretty cool competition all about OC. There were over 1000 CPU casualties, and my P4 630 got bumped to about 4.9 GHz with dry ice. Two days later, it died.


----------



## Viscarious (Aug 17, 2008)

Ehh, fine, I'll admit that tweaking your PC is fun. However, I still don't think it's worth all the time and money for those few extra marks.


----------



## PCpraiser100 (Aug 17, 2008)

Viscarious said:


> Ehh. Fine, I'll admit that tweaking your PC is fun. However, I still dont think its worth all the time and money for those few extra marks.



People don't really OC to get better scores in 3DMark. They use OC as the cheapest way to stay current, pushing the smallest siblings of a processor family to performance that's very close to the biggest sibling. For example, take a look at this E7200: thanks to an OC on its stock fan cooler, its performance gets very close to the E8400 and Q6600.

http://hothardware.com/Articles/Intel-Core-2-Duo-E7200/?page=6
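The "smallest sibling" point above is mostly multiplier arithmetic. A minimal sketch, assuming the published stock specs (E7200: 266 MHz FSB x 9.5; E8400: 333 MHz FSB x 9.0) and a 400 MHz FSB overclock, which is a typical air-cooled result but not a guaranteed one:

```python
# Core clock = FSB (MHz) x multiplier. Stock figures are the published
# Intel specs; the 400 MHz FSB is a common overclock, not a promise.

def core_clock(fsb_mhz: float, multiplier: float) -> float:
    """Return the core clock in MHz."""
    return fsb_mhz * multiplier

e7200_stock = core_clock(266, 9.5)  # ~2527 MHz at stock
e7200_oc    = core_clock(400, 9.5)  # 3800 MHz on a common FSB overclock
e8400_stock = core_clock(333, 9.0)  # ~3000 MHz at stock

uplift = (e7200_oc - e7200_stock) / e7200_stock * 100
print(f"E7200 OC: {e7200_oc:.0f} MHz ({uplift:.0f}% over stock)")
print(f"E8400 stock: {e8400_stock:.0f} MHz")
```

On these figures the cheaper chip ends up well past the E8400's stock clock, which is the whole appeal of the approach.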


----------



## newconroer (Aug 17, 2008)

btarunr said:


> Just as you have gaming competitions, you have OC competitions too (and they are sometimes equally rewarding). So it does sometimes matter what 3DMark score you have. For overclockers, their favourite game is 3DMark




When you get it in 'writing' that ATi and Nvidia make cards for people to 'compete' with one another in synthetic benchmarks, give me a call.

Until then, I find that laughable.

Oh damn, I laughed... EMO!



PCpraiser100 said:


> People don't really OC to get better scores in 3DMark. They use OC as the most cheapest way to stay compatible and even get the smallest siblings of the processor family pushed to performance thats very close to the biggest sibling. For example, take a look at this E7200, thanks to its stock fan-cooled OC its performance is very close to the E8400 and Q6600.
> 
> http://hothardware.com/Articles/Intel-Core-2-Duo-E7200/?page=6





Not a fair analogy/comparison. CPUs aren't GPUs...well, they both have a processor, but you know what I mean. Gains from overclocking GPUs are becoming smaller and smaller. More and more, you only benefit in acute situations, and most likely those situations are synthetic benchmarks.


----------



## btarunr (Aug 17, 2008)

newconroer said:


> When you get it in 'writing' that ATi and Nvidia make cards for people to 'compete' with one another in synthetic benchmarks, give me a call.
> 
> Until then, I find that laughable.
> 
> Oh damn, I laughed... EMO!



Get it in 'writing' that graphics cards are meant only for gaming, then.


----------



## Megasty (Aug 17, 2008)

I knew that silly sig would start some stuff sooner or later. I've been on both sides of that mess, but I have blown up more cards benching than gaming. Only about 25% of binned chips out of any given batch are actually meant for OCing. In NV's camp, K|ngp|n gets the best of the best, but he blew up more cards than I ever owned, with his own cash, before he was recognized by NV.  Benching with cards is always hit or miss; any kind of unstable OC can blow a _normal_ card. But benchers tend to push their cards higher than gamers, and the majority of gamers out there don't OC at all.  I personally don't game with OC'd cards.

Being part of a bunch of guys that finally got their dues is great and all, but they blew up so much of my wallet it's a shame - and they also spent many weeks broke, eating cheeseburgers and drinking cheap beer. Buying non-OC'd cards is a shot in the dark when it comes to benching, while it's a sure thing when it comes to daily gaming. I know that's just splitting hairs, but that's just how it is. Graphics cards were always meant for gaming. The early goofy perverts like us just turned them over to the dark side of death & destruction, accompanied by glimmers of success, stardom, and dare I say godhood.


----------



## Tatty_One (Aug 17, 2008)

PCpraiser100 said:


> People don't really OC to get better scores in 3DMark. They use OC as the most cheapest way to stay compatible and even get the smallest siblings of the processor family pushed to performance thats very close to the biggest sibling. For example, take a look at this E7200, thanks to its stock fan-cooled OC its performance is very close to the E8400 and Q6600.
> 
> http://hothardware.com/Articles/Intel-Core-2-Duo-E7200/?page=6



Very true....hence why my baby E8200 does 4.5GHz


----------



## Tatty_One (Aug 17, 2008)

newconroer said:


> When you get it in 'writing' that ATi and Nvidia make cards for people to 'compete' with one another in synthetic benchmarks, give me a call.
> 
> Until then, I find that laughable.
> 
> ...



Maybe not so much ATi or NVidia.....but certainly their manufacturing partners.....all the leading card companies have "teams" or sponsor individuals, specifically so they can break records with their hardware which promotes it to the enthusiast community.


----------



## Wile E (Aug 17, 2008)

Tatty_One said:


> Maybe not so much ATi or NVidia.....but certainly their manufacturing partners.....all the leading card companies have "teams" or sponsor individuals, specifically so they can break records with their hardware which promotes it to the enthusiast community.



*cough* Overclocking Team Palit *cough*


----------



## Tatty_One (Aug 18, 2008)

Wile E said:


> *cough* Overclocking Team Palit *cough*



You need to take something for that cough my friend.........


----------



## Darkrealms (Aug 18, 2008)

Glad to see Nvidia is waking up and willing to admit they fell asleep.  

Now bring on the fighting!  I want to see Nvidia fighting back : )


----------



## Megasty (Aug 18, 2008)

Yeah NV, we need something from you to make the 4870X2 come down in price. AMD must have underestimated the initial numbers of those things; they're basically sold out everywhere even at that high price...that thing must be really good, or just really hyped up. C'mon AMD, I want mine too.


----------



## X1REME (Aug 19, 2008)

INSTG8R said:


> Well being a long time "Red Team" man(I bought a FX5200 and owned it for about 1/2 an hour and never looked back)In the past year I was firmly convinced my next upgrade would be on the other side. the 4xxx series restored my faith and I hope it continues to do so. Im not for either side, my Laptop has an 8400M in it and I'm very happy with it.
> It just nice to see ATI pull it out finally and get back into being competitive.



The 8400M in laptops is failing fast, which is where the $200 million charge comes from, lol, so I wouldn't be too happy with an NVIDIA card in any of my setups, because even the desktop chips have problems; they're just not seen, thanks to larger, cooler heatsinks and bigger fans. But ultimately all the chips are defective, as they all carry the same thermal qualities and packaging.


----------



## X1REME (Aug 19, 2008)

candle_86 said:


> I am not a fanboi i am making a statement which is true, clock for clock the HD4850 and 4870 are close the 4870 is slightly faster clock for clock but not much, as said 8% at best and thats above 1680x1050, below that they are rather close, because of the higher latancy associated with GDDR5, this is not my statement this was made in a newspost here on TPU a month or so back. Call me what you want, but im stating facts.  The reason the 4870 is faster is its core and mem are clocked higher than the 4850 thats it.



You state facts yet provide no proof; the reviews say the contrary of what you say. Most reviews talk about 10-27%, not 8%. Please do some research before typing anything.
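The "clock for clock" dispute above is easy to make concrete. A minimal sketch, using the published stock core clocks (HD4850: 625 MHz, HD4870: 750 MHz) and made-up placeholder benchmark scores, of how a raw lead shrinks once the clock advantage is factored out:

```python
# Scale a hypothetical HD4870 score down by the core-clock ratio to see
# what lead remains "clock for clock". Clocks are the published stock
# specs; the scores below are placeholders, not measured results.

HD4850_CLOCK = 625.0  # MHz
HD4870_CLOCK = 750.0  # MHz

def clock_for_clock_gain(score_4850: float, score_4870: float) -> float:
    """Percent lead the HD4870 keeps after removing its clock advantage."""
    scaled_4870 = score_4870 * (HD4850_CLOCK / HD4870_CLOCK)
    return (scaled_4870 / score_4850 - 1.0) * 100.0

# A 25% raw lead for the HD4870 over the HD4850...
gain = clock_for_clock_gain(100.0, 125.0)
print(f"clock-for-clock gain: {gain:.1f}%")  # ...shrinks to roughly 4%
```

This is only the arithmetic behind the argument; the real gap also depends on the GDDR5 bandwidth and latency effects the posters are disputing.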


----------



## X1REME (Aug 19, 2008)

Tatty_One said:


> I really dont know why some get so antagonised about this gfx card war, it's pretty simple really, one side comes out on top sometimes, the other side other times....thats gotta be good right?  good for us consumers at least, so now we find ourselves in a position that ATi has the fastest single card solution and NVidia the 2nd fastest, ATi just about the 3rd with NVidia just about the fourth, so what.....but to say that just because the "other side" to your preference brings a techonolgy to the table that pushes the boundries of performance (AKA GDDR5) that its pointless is TBH, fanboism of the highest order......and thats coming from a fanboi!
> 
> Enjoy the technology, enjoy the breakthroughs and enjoy the competative pricing that brings because no doubt, down the line the "other side" will edge in front....and when they do, things get even cheaper!



Muhahaha, that's funny, because ATi is 1st (HD4870X2), 2nd (HD4850X2), 3rd (HD4870), 4th (HD4850), etc., both performance- and price-wise. I'm sure you realize new ones are just coming (RV730, RV740, RV760, etc.), and next year it's R800, called Little Dragon. NVIDIA isn't going anywhere any time soon, because AMD isn't playing games anymore; they are applying the Intel tick-tock to NVIDIA, with deadly results. Now I'm just waiting for Intel's turn to get slaughtered.

I currently have an Intel CPU and an NVIDIA GPU, but I'm going to change to all AMD: a 790FX or 800FX motherboard, an HD4870 or HD4870X2 GPU, and a Deneb CPU at 2.6 to 3.4GHz.


----------



## Wile E (Aug 19, 2008)

X1REME said:


> muhahaha, that's funny coz ATi is 1st hd4870x2, 2nd hd4850x2, 3rd hd4870, 4th hd4850 etc performance and price wise. am sure you realize there's new ones just coming (rv730-rc740-rv760 etc) and next year its R800 called little dragon. nvidia is not going no where any time soon coz amd is not playing games anymore as they are applying the Intel tick tock on nvidia with deadly results. now am just waiting for Intel`s turn to get slaughtered.
> 
> currently have Intel cpu and nvidia gpu but gonna change to all AMD = motherboard 790fx or 800fx, gpu hd4870 or hd4870x2 , cpu deneb 2.6 to 3.4ghz etc



NV will come back out on top again. It's the way it has happened for years. It wasn't until NV released the 8800 that they took a commanding lead. They never held the performance crown for this long in the past, but neither did ATI. They both went back and forth every few months. I have a feeling this is what we are returning to.


----------



## X1REME (Aug 19, 2008)

To everybody saying they're glad NVIDIA has learned, or that ATI has woken the NVIDIA beast:

All NVIDIA has woken up to is a PowerPoint slide that isn't fully complete yet; they don't have an answer for another 8-9 months. Look, you can't just make new GPU cards in a few months from the 8-series architecture, which is exactly what it has been, plus name changes, for the past two years (nobody can, OK?).

The funny thing is that when NVIDIA does come back, after 8-9 months minimum, they will get smacked right back down by the R800 Little Dragon; NVIDIA is going to be fourth for at least 2+ years. AMD has finally learned there is no rest for the wicked, as you may find yourself bankrupt if you don't have the crown (e.g. CPUs).

NVIDIA fans make me laugh with the things they come out with, even when they're on the losing side.


----------



## Wile E (Aug 19, 2008)

X1REME said:


> to everybody saying am glad nvidia has learned or that ati has woken up nvidia the beast.
> 
> all nvidia has woken upto is a powerpoint slide not fully complete yet, they don't have an answer for anther 8/9 months. look you cant just make new gpu cards from the 8 series architecture in a few months which exactly what it is + name changes for the past 2 years (nobody can OK)
> 
> ...


And 8-9 months between lead changes is normal. It's what has happened in the ATI/NV battle for years. You can't say that R800 will beat NV's next offerings at all. You have no idea what either NV's next design, or even R800, has to offer. For all we know, R800 could be a flop, as could the next NV design.

To sit here and claim that ATI will retain the lead is just silly. There is absolutely no way to predict that.


----------



## $ReaPeR$ (Aug 19, 2008)

IMO, NV underestimated ATI, and that was the starting point for their current situation. They will come back with an answer because they have the funds and the tech resources, and that is good for us: if there is only one company in any kind of market, the customer gets fleeced over and over again because of the lack of competition.


----------



## Tatty_One (Aug 19, 2008)

What's this "ATi will retain the lead" malarkey?  Do they have the lead?  I thought NVidia had the fastest SINGLE GPU????  Am I missing something here? OK, if you bolt two GPUs together then that's a different story, but for all of ATi's excellent marketing strategy this time around, coupled with their "futuristic & innovative" architecture, the fact is it's still a slower GPU........or have I missed the release of the HD4880?


----------



## Wile E (Aug 20, 2008)

Tatty_One said:


> Whats this "ATi will retain the lead" malarkey?  Do they have the lead?  I thought NVidia had the fastest SINGLE gpu????  am I missing something here, OK if you bolt 2 GPU's together then thats a different story but for all of ATi's excellent marketing strategy this time around, coupled with their "futureistic & innovative" architecture, fact is, it's still a slower GPU........or have I missed the release of the HD4880?



It doesn't matter if it's a slower gpu. We don't buy gpus, we buy gfx cards. ATI has the fastest video card on the planet.


----------



## Tatty_One (Aug 20, 2008)

Wile E said:


> It doesn't matter if it's a slower gpu. We don't buy gpus, we buy gfx cards. ATI has the fastest video card on the planet.



Very true, but as I said, bang two together and you have double (well, ish) the performance. I can only think that NVidia haven't done it because they have an even better single-GPU solution up their sleeve to bash the R700......I can't believe they will not at least try to answer ATi very soon, it's just not like them..........reports of the R700 were being leaked months ago, so it's not as if they had no warning.


----------



## mixa (Aug 20, 2008)

It's always been like that, and in the end it's the end user who benefits the most. Only the fanboys lose, because they stay on the same side. The situation now is the same as it was with the 5900 Ultra and the 9800 Pro/XT. NVIDIA will take a year or so to recover, as usual; then I guess they will release something better than ATi (read: AMD), because ATi tends to launch a real monster once in a while that leaves NV down in the bush, but then ATi starts to lag behind, sitting on the old architecture. And then boom, NV comes along with something better, because they were working their asses off to catch ATi's beast.

It's a great show, go watch it


----------



## newconroer (Aug 20, 2008)

Wile E said:


> It doesn't matter if it's a slower gpu. We don't buy gpus, we buy gfx cards. ATI has the fastest video card on the planet.



Well sure, if you eliminate the 'price/performance' badge that so many people seem to wear, especially when discussing ATi.

But the X2 is as much of a 'flop' in that category as the 280 is/was, and this is where the variable of TWO GPUs DOES matter.

Two GPUs,
GDDR5,
How many shaders again? I can't count that high!
etc.

It offers no real-world advantage to the average consumer, or even some of the not-so-average consumers. It's a piece of hardware that only 'shines' (and by not that much..) in very acute situations that most people won't encounter.

It also draws 100 more watts, is natively hotter (and two times the heat at that), and costs $100 more (which should be a moot point, but SOMEHOW, price always gets involved, whether it's TOP-end products or not).

So...

Let's reverse the comparison.

280, single solution
Less power, heat and price.
Neck and neck, and at times slightly better or slightly worse performance than the X2, in average comparisons. It falls short of the X2 by 10-25% (is that fair, on average?) in acute or synthetic situations.


We could keep going, saying the 4870 is close to the 280, at times, and costs less and etc.etc.

The key difference being that a 280 has more real-world purpose than an X2. Then, from a tech standpoint, the performance of the X2, considering its horsepower, is far from impressive. Tack on the cost, heat, power, etc. and it's even less impressive, and therefore just as much of a 'dog' as the 280.


In some ways, I think both sides failed.

Nvidia should have released the 280 as 55nm with better shaders.
ATI shouldn't have bothered with the X2, trying to attain some pointless 'crown,' and rather tried to keep the performance of their 4870/4850, but without giving the finger to heat/power/efficiency etc.


In the end, if a 280 isn't enough for you, then an X2 won't be either. The only real-world application that demands either of these cards is Crysis, more or less, and it's sad how everyone is using THAT as a benchmark when, five minutes before, they were complaining about how Crysis is coded so 'poorly.' Yet even in Crysis the X2 will not give you that elusive 60 fps, or even a constant 40-45 - unless you turn things down or off, but that defeats the purpose. Then again, if you run a tuned custom config for Crysis, you can get your 45+ FPS with all the eye candy with EITHER card.

Back to square one we go.



This graph pretty much sums up my understanding and perception of GPUs these days, in that many of them run the majority of 3d applications without fault.

The top two games are popular and modern, and have an average requirement in regards to the power needed to run them. All cards perform exceptionally well, easily achieving the elusive '60 fps' (or near it). The bottom two games are examples of programs that can heavily tax the same GPUs used in the previous games, but are also popular and modern - just not average, hence 'acute.'

Crysis seems self-explanatory. Good choice using ArmA; I was hoping someone would. It's an older engine, but the rules of GPU physics (not physics as in PhysX) still apply: lots of objects, shaders, long viewing distances and high resolutions can result in very crippled frame rates. It's interesting how well the 4870 does, but more importantly, how well the X2 doesn't.


----------



## zithe (Aug 20, 2008)

newconroer said:


> Nvidia should have released the 280 as 55nm with better shaders.
> ATI shouldn't have bothered with the X2, trying to attain some pointless 'crown,' and rather tried to keep the performance of their 4870/4850, but without giving the finger to heat/power/efficiency etc.



Not necessarily. The X2 attracts attention. If an (uneducated) consumer is told that this card is the best in the world, they'll think, "Oh, I can't afford that, but they made this and it has to be good too!"

I think flagship cards are for gathering our attention. They have to have a purpose or else companies wouldn't compete for the strongest product.


----------



## newconroer (Aug 20, 2008)

zithe said:


> Not necessarily. The X2 attracts attention. If a (uneducated) consumer is told that this card is the best in the world, they'll think "Oh, I can't afford that, but they made this and it has to be good too!"
> 
> I think flagship cards are for gathering our attention. They have to have a purpose or else companies wouldn't compete for the strongest product.




Well, yes, of course they get attention, but I'm not trying to discuss the fickleness of the average consumer's mentality or ignorance; rather, I'm trying to discuss their needs. If they don't understand their needs, then that's, again, about ignorance and perception, not fact.


The unfortunate thing about flagship cards is that they attract people in two ways, there's the:

*WTFBBQSAUCE pwnzerz* - bragging rights and I want the best!
and then the
*SynthetiX4Life benchers*

And this IS unfortunate, because the first type should be pointless and irrelevant. The second type, benchers, are pitting themselves against technological odds in order to achieve some 'goal.' They are using GPUs (primarily made for games) in order to benchmark.

If benchmarking were done with programs that utilised lots of vertices - things like CAD, cinematics, design tools, etc. - then they would have to use Quadro-type GPUs, which I would much prefer, as that has less to do with gaming and more to do with pure horsepower (of a different type), accuracy, and things of an acute and statistical nature.


----------



## yogurt_21 (Aug 20, 2008)

newconroer said:


> Well yes of course they get attention, but I'm not trying to discuss the fickleness of the average consumer's mentality or ignorance; rather trying to discuss about their needs. If they don't understand their needs then that's again, about ignorance, and perception, not fact.
> 
> 
> The unfortunate thing about flagship cards is that they attract people in two ways, there's the:
> ...



True. Sometimes I think ATI and NVIDIA fawn over the flagship and forget about where the money is (well, it's evident ATI did for a long time, as they got bought out while pumping out impressive flagship cards).

Right now, if I think about it, neither the GTX 280 nor the 4870X2 is practical at all, and the GTX 260 and 4870 are even a stretch. The 9800GTX+ and the 4850 seem to be much better buys, as they can play everything out there at nice detail settings and can be dual'd, and sometimes tri'd, for cheaper than the next card up. The flagships may become more useful in a year or so, when games can tap into their power, but right now I'm cruising on a 9600GT and have yet to find a complaint.


----------



## Wile E (Aug 21, 2008)

newconroer said:


> Well sure, if you eliminate the 'price/performance' badge that so many people seem to wear, especially when discussing ATi.
> 
> But the X2 is as much of a 'flop' in that category as the 280 is/was, and this is where the variable of TWO gpus DOES matter.
> 
> ...


All the charts I have seen point to the X2 winning by a fair percentage, more often than it loses to a GTX.

With 280's dipping down as low as $420 on Newegg, it probably does take the price/perf crown now, but that wasn't the discussion here. The discussion turned into merely who had the fastest card, nothing more.

The fact remains the fastest card is the 4870X2.

Practical or not, I wish I could have 2 of them for my Xfire board. lol.

I also wouldn't mind having 2 280's for my AMD rig (now that is truly overkill with its 1440x900 monitor. lol.)


----------



## candle_86 (Aug 21, 2008)

yogurt_21 said:


> true, sometimes I think ati and nvidia fawn over the flagship and forget about where the money is (well it's evident ati did for a long time as they got bought out while pumping impressive flagship cards.)
> 
> right now If I were to think about it, neither the gtx280 nor the 4870x2 are practical at all, and the gtx260 and 4870 are even a stretch. the 9800gtx+ and the 4850 seem to be much better buys as they can play everything out there with a nice detail setting and can be dual-d and sometimes tri-d for cheaper than the next card up. the flagships may become more useful in a year or so when games can tap into their power, but right now, I'm cruising on a 9600gt and have yet to find a complaint.



Really? How so? Did Nvidia ever forget the midrange?

TNT2
Geforce2 MX400
Geforce3 Ti 200
Geforce4 Ti 4200
Geforce FX5600
Geforce FX5700
Geforce FX5900XT
Geforce 6600GT
Geforce 6800GS
Geforce 7600GT
Geforce 7900GS
Geforce 8600GTS 
Geforce 8800GS
Geforce 9600GT

It seems to me that since 1999 Nvidia has been covering the midrange. You could argue the FX cards lose to the Radeon 9600, but does anyone actually remember those days? That was the time of DX8, when DX9 wasn't really being used to its potential. The FX cards kept up, and the Radeon 9600 sucks just as much at Far Cry or HL2 as the FX midrange does. The 8600GTS, while not faster than the old high end, does not seem like a real issue; it offered 7950GT performance and DX10 support, so where is the problem? Now let's look at ATI's midrange, and tell me who tends to have the best midrange:

Radeon 7500
Radeon 8500Le
Radeon 9500
Radeon 9600
Radeon 9800SE
Radeon x600
Radeon x700
Radeon x800GT
Radeon x800GTO
Radeon x1600
Radeon x1650
Radeon x1800GTO
Radeon HD2600
Radeon HD36x0
Radeon HD3850

So in the sub-$200 market, who had the best cards at launch? Let me remind you of a few things again. The x600 went up against the 6600GT at first, which it couldn't compete with, and later the x700 Pro couldn't keep up either. They made the x800GT and GTO to compete with the 6800GS, but the 6800GS was once again faster. The x1600 was a joke; the x1650 was also a joke, save the x1650XT, but when it came out the 7900GS was the same price; and the x1800GTO lost to the 7600GT most of the time. The HD2600 cards couldn't keep up with the 8600s, and the HD36x0 didn't help. The HD3850 was a good midrange card till the 8800GS showed up, followed by the 9600GT.

In truth, the good ATI midrange looks like this:

Radeon 9500
Radeon 9600
Radeon HD3850

Nvidia had the faster midrange at launch every other time.


----------



## AsRock (Aug 21, 2008)

newconroer said:


> Well sure, if you eliminate the 'price/performance' badge that so many people seem to wear, especially when discussing ATi.
> 
> But the X2 is as much of a 'flop' in that category as the 280 is/was, and this is where the variable of TWO gpus DOES matter.
> 
> ...



Any chance you know the program used to get the FPS for ArmA? I didn't know the community had actually made one yet. Well, in fact there is one, but how ArmA works makes benchmark programs pointless.

You could load a part of the game five times and find that, each time, different textures had not loaded, therefore giving off false FPS.

I was trying to get W1z to benchmark ArmA, only to find it was pretty much pointless. However, he said he might do it for ArmA 2 if things improve.

Here's a message I got off someone who makes a benchmark program for ArmA and explains what the issues are.


> Hi! Still a little bit surprised here, but I'll try to answer your questions thoroughly.
> 
> The biggest problem with the ArmA Mark was & still is the fact that (no matter what you do) you'll always get varying results - that's due to ArmA's memory management &/or LOD handling.
> Another stumbling block is the thousands of different performance settings people use - very few can be arsed to set up their ArmA the way someone else told them to - maybe not so important, though, for an isolated benchmark.
> ...
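The variance problem the quoted message describes is easy to see with basic statistics; a minimal sketch, with made-up FPS numbers standing in for five identical benchmark passes:

```python
# If identical runs spread this widely, a single pass says little about
# the card; you need the mean and the spread across repeated runs.
# The FPS numbers below are invented for illustration.
from statistics import mean, stdev

runs = [41.2, 46.8, 39.5, 44.1, 47.3]  # hypothetical FPS, 5 identical passes

avg = mean(runs)
spread = stdev(runs)
print(f"mean {avg:.1f} fps, stdev {spread:.1f} fps "
      f"({spread / avg * 100:.0f}% of the mean)")
```

With run-to-run noise that large, a few-percent difference between two cards in a single ArmA pass is indistinguishable from chance, which is presumably why W1z passed on benchmarking it.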


----------



## Tatty_One (Aug 21, 2008)

Wile E said:


> All the charts I have seen point to the X2 winning by a fair percentage, more often than it loses to a GTX.
> 
> With 280's dipping down as low as $420 on Newegg, it probably does take the price/perf crown now, but that wasn't the discussion here. The discussion turned into merely who had the fastest card, nothing more.
> 
> ...




Actually, you said the fastest card......I said the fastest GPU.


----------



## Wile E (Aug 21, 2008)

Tatty_One said:


> Actually you said the fastest card......I said the fastest GPU



OK fine, fair enough.


----------



## ktr (Aug 21, 2008)

Meh, it's just a cycle. NVIDIA has a few years of the best GPUs, then ATI has a few years of the best GPUs...and so forth. The same goes for AMD and Intel.


----------



## Tatty_One (Aug 21, 2008)

Wile E said:


> OK fine, fair enough.



But you were right!


----------

