Friday, August 15th 2008

Jen-Hsun Huang (NVIDIA): ''We Underestimated RV770''

NVIDIA has suffered its first quarterly loss in five years. Several factors contributed, chief among them a write-off of up to US $200 million to cover the cost of recalling and repairing faulty mobile graphics processors.

Another factor is a refreshed product lineup from competitor AMD/ATI that takes on NVIDIA across the mid-range, high-end, and enthusiast segments. In essence, ATI now has a product to counter NVIDIA at every possible segment, with more being readied for launch.

Seeking Alpha spoke with CEO Jen-Hsun Huang, who was quoted as saying:
We underestimated the price performance of our competitor's most recent GPU, which led us to mis-position our fall lineup. The first step of our response was to reset our price to reflect competitive realities. Our action put us again in a strong competitive position but we took hard hits with respect to our overall GPU ASPs and ultimately to our gross margins. The price action was particularly difficult since we are just ramping 55-nanometer and the weak market resulted in taking longer than expected to work through our 65-nanometer inventory.
Huang says that with their transition to the 55 nm silicon fabrication process, they hope to do better.
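To illustrate why a price cut bites directly into gross margin, consider a simple worked example (the figures below are hypothetical, chosen only to show the mechanics Huang describes):

```python
# Gross margin = (ASP - unit cost) / ASP.
# Hypothetical figures: a part costing $250 to build, originally sold
# at a $400 ASP, then repriced to $320 to match the competition.
unit_cost = 250.0
for asp in (400.0, 320.0):
    margin = (asp - unit_cost) / asp
    print(f"ASP ${asp:.0f}: gross margin {margin:.0%}")
# ASP $400: gross margin 38%
# ASP $320: gross margin 22%
```

In this toy example a 20% cut to the average selling price takes gross margin from 38% to 22%, which is the squeeze on "overall GPU ASPs and ultimately gross margins" the quote refers to.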
Source: Seeking Alpha

92 Comments on Jen-Hsun Huang (NVIDIA): ''We Underestimated RV770''

#76
Wile E
Power User
X1REME: to everybody saying I'm glad NVIDIA has learned, or that ATI has woken NVIDIA the beast:

All NVIDIA has woken up to is a PowerPoint slide that isn't fully complete yet; they don't have an answer for another 8/9 months. Look, you can't just make new GPUs in a few months. The 8-series architecture plus name changes is exactly what they've shipped for the past 2 years (nobody can, OK).

The funny thing is, when NVIDIA does come back after 8 to 9 months minimum, they will get smacked right back down again by the R800 "little dragon." NVIDIA is going to be behind for at least 2+ years. AMD has finally learned there is no rest for the wicked, as you may find yourself bankrupt if you don't have the crown, e.g. CPUs.

NVIDIA fans make me laugh with the things they come out with, even when they're on the losing side.
And 8-9 months between lead changes is normal; it's what has happened in the ATI/NV battle for years. You can't say that R800 will beat NV's next offerings at all. You have no idea what either NV's next design, or even R800, has to offer. For all we know, R800 could be a flop, as could the next NV design.

To sit here and claim that ATI will retain the lead is just silly. There is absolutely no way to predict that.
#77
$ReaPeR$
IMO NV underestimated ATI, and that was the starting point for their current situation. They will come back with an answer because they have the funds and the tech resources, and that is good for us: if there is only one company in any kind of market, the customer gets gouged over and over again because of the lack of competition.
#78
Tatty_Two
Gone Fishing
What's this "ATi will retain the lead" malarkey? Do they have the lead? I thought NVidia had the fastest SINGLE GPU... am I missing something here? OK, if you bolt 2 GPUs together then that's a different story, but for all of ATi's excellent marketing strategy this time around, coupled with their "futuristic & innovative" architecture, the fact is, it's still a slower GPU... or have I missed the release of the HD4880? :D
#79
Wile E
Power User
Tatty_One: What's this "ATi will retain the lead" malarkey? Do they have the lead? I thought NVidia had the fastest SINGLE GPU... am I missing something here? OK, if you bolt 2 GPUs together then that's a different story, but for all of ATi's excellent marketing strategy this time around, coupled with their "futuristic & innovative" architecture, the fact is, it's still a slower GPU... or have I missed the release of the HD4880? :D
It doesn't matter if it's a slower GPU. We don't buy GPUs, we buy gfx cards. ATI has the fastest video card on the planet.
#80
Tatty_Two
Gone Fishing
Wile E: It doesn't matter if it's a slower GPU. We don't buy GPUs, we buy gfx cards. ATI has the fastest video card on the planet.
Very true, but as I said, bang 2 together and you have double (well, ish) the performance. I can only think that NVidia haven't done it because they have an even better single-GPU solution up their sleeve to bash the R700... I can't believe they will not at least try to :nutkick: ATi very soon; it's just not like them. Reports of the R700 were being leaked months ago, so it's not as if they had no warning.
#81
mixa
It's always been like that, and in the end it's the end user who benefits the most.
Only the fanboys lose, because they stay on the same coast. The situation is now the same as it was with the 5900 Ultra and the 9800 Pro/XT. NVIDIA will take a year or so to recover as usual, then I guess they will release something better than ATi (read: AMD), because ATi tends to launch a real monster once in a while that leaves NV down in the bush, but then ATi starts to fall behind, resting on the old architecture. And then boom, NV comes back with something better, because they were working their asses off to catch ATi's beast.

It's a great show, go watch it :D
#82
newconroer
Wile E: It doesn't matter if it's a slower GPU. We don't buy GPUs, we buy gfx cards. ATI has the fastest video card on the planet.
Well sure, if you eliminate the 'price/performance' badge that so many people seem to wear, especially when discussing ATi.

But the X2 is as much of a 'flop' in that category as the 280 is/was, and this is where the variable of TWO GPUs DOES matter.

Two GPUs,
GDDR5,
How many shaders again? I can't count that high!
etc.

It offers no real-world advantage to the average consumer, or even to some not-so-average consumers. It's a piece of hardware that only 'shines' (and not by that much...) in very acute situations that most people won't encounter.

It also draws 100 W more, runs natively hotter (and twice the heat at that), and costs $100 more (which should be a moot point, but SOMEHOW price always gets involved, whether it's top-end products or not).

So...

Let's reverse the comparison.

280, single solution:
Less power, heat, and price.
Neck and neck with the X2, and at times slightly better or slightly worse, in average comparisons. It falls short of the X2 by 10-25% (is that fair, on average?) in acute or synthetic situations.

We could keep going, saying the 4870 is close to the 280 at times, and costs less, etc. etc.

The key difference is that a 280 has more real-world purpose than an X2. And from a tech standpoint, the performance of the X2, considering its horsepower, is far from impressive. Tack on the cost, heat, power, etc. and it's even less impressive, and therefore just as much of a 'dog' as the 280.
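To put rough numbers on that, here's a quick back-of-the-envelope sketch (Python; the prices, wattages, and the performance delta are assumed, illustrative 2008-era figures, not measured results):

```python
# Rough perf-per-dollar and perf-per-watt comparison, GTX 280 vs 4870 X2.
# Every number below is an assumption for illustration, not a benchmark.
cards = {
    #           (price $, board power W, relative perf; GTX 280 = 1.00)
    "GTX 280": (450, 236, 1.00),
    "4870 X2": (550, 286, 1.18),  # assume ~18% faster on average
}

for name, (price, watts, perf) in cards.items():
    print(f"{name}: {1000 * perf / price:.2f} perf/$1000, "
          f"{100 * perf / watts:.2f} perf/100W")
```

With numbers in that ballpark, the X2's raw-speed lead shrinks or flips once price and power end up in the denominator, which is exactly the argument being made here.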


In some ways, I think both sides failed.

Nvidia should have released the 280 as 55 nm with better shaders.
ATI shouldn't have bothered with the X2, trying to attain some pointless 'crown,' and instead tried to keep the performance of their 4870/4850 without giving the finger to heat/power/efficiency.


In the end, if a 280 isn't enough for you, then an X2 won't be either. The only real-world application that demands either of these cards is Crysis, more or less, and it's sad how everyone is using THAT as a benchmark when five minutes earlier they were complaining about how poorly Crysis is coded. Yet even in Crysis the X2 will not give you that elusive 60 fps, or even a constant 40-45, unless you turn things down or off, but then that defeats the purpose. And if you run a tuned custom config for Crysis, you can get your 45+ FPS with all the eye candy with EITHER card.

Back to square one we go.



This graph pretty much sums up my understanding and perception of GPUs these days: many of them run the majority of 3D applications without fault.

The top two games are popular, modern, and have a typical requirement in terms of the power needed to run them. They are, in a word, average. All cards perform exceptionally well, easily achieving the elusive '60 fps' (or near it) requirement. The bottom two games are examples of programs that can heavily tax the same GPUs used in the previous games, but are also popular and modern, just not average, hence 'acute.'

Crysis seems self-explanatory. Good choice using ArmA, I was hoping someone would. Older engine, but the rules of GPU physics (not physics like PhysX) still apply. Lots of objects, shaders, long viewing distances, and high resolutions can result in very crippled frame rates. It's interesting how well the 4870 does, but more importantly, how well the X2 doesn't.
#83
zithe
newconroer: Nvidia should have released the 280 as 55 nm with better shaders.
ATI shouldn't have bothered with the X2, trying to attain some pointless 'crown,' and instead tried to keep the performance of their 4870/4850 without giving the finger to heat/power/efficiency.
Not necessarily. The X2 attracts attention. If an (uneducated) consumer is told that this card is the best in the world, they'll think, "Oh, I can't afford that, but they made this, and it has to be good too!"

I think flagship cards are for gathering our attention. They have to have a purpose, or else companies wouldn't compete for the strongest product.
#84
newconroer
zithe: Not necessarily. The X2 attracts attention. If an (uneducated) consumer is told that this card is the best in the world, they'll think, "Oh, I can't afford that, but they made this, and it has to be good too!"

I think flagship cards are for gathering our attention. They have to have a purpose, or else companies wouldn't compete for the strongest product.
Well, yes, of course they get attention, but I'm not trying to discuss the fickleness of the average consumer's mentality or ignorance; I'm trying to discuss their needs. If they don't understand their needs, then that's, again, about ignorance and perception, not fact.


The unfortunate thing about flagship cards is that they attract people in two ways, there's the:

WTFBBQSAUCE pwnzerz - bragging rights and I want the best!
and then the
SynthetiX4Life benchers

And this IS unfortunate, because the first type should be pointless and irrelevant. The second type, benchers, are pitting themselves against technological odds in order to achieve some 'goal.' They are using GPUs (made primarily for games) in order to benchmark.

If benchmarking were done with programs that use lots of vertices, things like CAD or cinematics, design tools, etc., then they would have to use Quadro-type GPUs, which I would much prefer, as that has less to do with gaming and more to do with pure horsepower (of a different type), accuracy, and things of an acute and statistical nature.
#85
yogurt_21
newconroer: Well, yes, of course they get attention, but I'm not trying to discuss the fickleness of the average consumer's mentality or ignorance; I'm trying to discuss their needs. If they don't understand their needs, then that's, again, about ignorance and perception, not fact.

The unfortunate thing about flagship cards is that they attract people in two ways, there's the:

WTFBBQSAUCE pwnzerz - bragging rights and I want the best!
and then the
SynthetiX4Life benchers

And this IS unfortunate, because the first type should be pointless and irrelevant. The second type, benchers, are pitting themselves against technological odds in order to achieve some 'goal.' They are using GPUs (made primarily for games) in order to benchmark.

If benchmarking were done with programs that use lots of vertices, things like CAD or cinematics, design tools, etc., then they would have to use Quadro-type GPUs, which I would much prefer, as that has less to do with gaming and more to do with pure horsepower (of a different type), accuracy, and things of an acute and statistical nature.
True. Sometimes I think ATI and Nvidia fawn over the flagship and forget about where the money is (well, it's evident ATI did for a long time, as they got bought out while pumping out impressive flagship cards).

Right now, if I think about it, neither the GTX 280 nor the 4870 X2 is practical at all, and the GTX 260 and 4870 are even a stretch. The 9800 GTX+ and the 4850 seem to be much better buys, as they can play everything out there at nice detail settings and can be dual'd, and sometimes tri'd, for cheaper than the next card up. The flagships may become more useful in a year or so, when games can tap into their power, but right now I'm cruising on a 9600 GT and have yet to find a complaint.
#86
Wile E
Power User
newconroer: Well sure, if you eliminate the 'price/performance' badge that so many people seem to wear, especially when discussing ATi.

But the X2 is as much of a 'flop' in that category as the 280 is/was, and this is where the variable of TWO GPUs DOES matter.

Two GPUs,
GDDR5,
How many shaders again? I can't count that high!
etc.

It offers no real-world advantage to the average consumer, or even to some not-so-average consumers. It's a piece of hardware that only 'shines' (and not by that much...) in very acute situations that most people won't encounter.

It also draws 100 W more, runs natively hotter (and twice the heat at that), and costs $100 more (which should be a moot point, but SOMEHOW price always gets involved, whether it's top-end products or not).

So...

Let's reverse the comparison.

280, single solution:
Less power, heat, and price.
Neck and neck with the X2, and at times slightly better or slightly worse, in average comparisons. It falls short of the X2 by 10-25% (is that fair, on average?) in acute or synthetic situations.

We could keep going, saying the 4870 is close to the 280 at times, and costs less, etc. etc.

The key difference is that a 280 has more real-world purpose than an X2. And from a tech standpoint, the performance of the X2, considering its horsepower, is far from impressive. Tack on the cost, heat, power, etc. and it's even less impressive, and therefore just as much of a 'dog' as the 280.

In some ways, I think both sides failed.

Nvidia should have released the 280 as 55 nm with better shaders.
ATI shouldn't have bothered with the X2, trying to attain some pointless 'crown,' and instead tried to keep the performance of their 4870/4850 without giving the finger to heat/power/efficiency.

In the end, if a 280 isn't enough for you, then an X2 won't be either. The only real-world application that demands either of these cards is Crysis, more or less, and it's sad how everyone is using THAT as a benchmark when five minutes earlier they were complaining about how poorly Crysis is coded. Yet even in Crysis the X2 will not give you that elusive 60 fps, or even a constant 40-45, unless you turn things down or off, but then that defeats the purpose. And if you run a tuned custom config for Crysis, you can get your 45+ FPS with all the eye candy with EITHER card.

Back to square one we go.

This graph pretty much sums up my understanding and perception of GPUs these days: many of them run the majority of 3D applications without fault.

The top two games are popular, modern, and have a typical requirement in terms of the power needed to run them. They are, in a word, average. All cards perform exceptionally well, easily achieving the elusive '60 fps' (or near it) requirement. The bottom two games are examples of programs that can heavily tax the same GPUs used in the previous games, but are also popular and modern, just not average, hence 'acute.'

Crysis seems self-explanatory. Good choice using ArmA, I was hoping someone would. Older engine, but the rules of GPU physics (not physics like PhysX) still apply. Lots of objects, shaders, long viewing distances, and high resolutions can result in very crippled frame rates. It's interesting how well the 4870 does, but more importantly, how well the X2 doesn't.
All the charts I have seen show the X2 winning by a fair percentage more often than it loses to a GTX.

With 280s dipping as low as $420 on Newegg, it probably does take the price/perf crown now, but that wasn't the discussion here. The discussion turned into merely who had the fastest card, nothing more.

The fact remains the fastest card is the 4870X2.

Practical or not, I wish I could have 2 of them for my Xfire board. lol.

I also wouldn't mind having 2 280s for my AMD rig (now that is truly overkill with its 1440x900 monitor. lol.)
#87
candle_86
yogurt_21: True. Sometimes I think ATI and Nvidia fawn over the flagship and forget about where the money is (well, it's evident ATI did for a long time, as they got bought out while pumping out impressive flagship cards).

Right now, if I think about it, neither the GTX 280 nor the 4870 X2 is practical at all, and the GTX 260 and 4870 are even a stretch. The 9800 GTX+ and the 4850 seem to be much better buys, as they can play everything out there at nice detail settings and can be dual'd, and sometimes tri'd, for cheaper than the next card up. The flagships may become more useful in a year or so, when games can tap into their power, but right now I'm cruising on a 9600 GT and have yet to find a complaint.
Really? How so? When did Nvidia ever forget the midrange?

TNT2
Geforce2 MX400
Geforce3 Ti 200
Geforce4 Ti 4200
Geforce FX5600
Geforce FX5700
Geforce FX5900XT
Geforce 6600GT
Geforce 6800GS
Geforce 7600GT
Geforce 7900GS
Geforce 8600GTS
Geforce 8800GS
Geforce 9600GT

It seems to me that since 1999 Nvidia has been covering the midrange. You could argue the FX cards lose to the Radeon 9600, but does anyone actually remember those days? That was the era of DX8, when DX9 wasn't really being used to its potential. The FX cards kept up, and the Radeon 9600 sucks just as much at Far Cry or HL2 as the FX midrange does. The 8600GTS, while not faster than the old high end, doesn't seem like a real issue: it offered 7950GT performance plus DX10 support, so where is the problem? Now let's look at ATI's midrange, and tell me who tends to have the best midrange:

Radeon 7500
Radeon 8500Le
Radeon 9500
Radeon 9600
Radeon 9800SE
Radeon x600
Radeon x700
Radeon x800GT
Radeon x800GTO
Radeon x1600
Radeon x1650
Radeon x1800GTO
Radeon HD2600
Radeon HD36x0
Radeon HD3850

So in the sub-$200 market, who had the best cards at launch? Let me remind you of a few things again. The x600 went up against the 6600GT at first, which it couldn't compete with, and later the x700pro couldn't keep up either. They made the x800GT and GTO to compete with the 6800GS, but the 6800GS was once again faster. The x1600 was a joke; the x1650 was also a joke, save the x1650XT, but when it came out the 7900GS was the same price, and the x1800GTO lost to the 7600GT most of the time. The HD2600 cards couldn't keep up with the 8600s, and the HD36x0 didn't help. The HD3850 was a good midrange card until the 8800GS showed up, followed by the 9600GT.

In truth, the good ATI midrange looks like this:

Radeon 9500
Radeon 9600
Radeon HD3850

Nvidia had the faster midrange at launch every other time.
#88
AsRock
TPU addict
newconroer: Well sure, if you eliminate the 'price/performance' badge that so many people seem to wear, especially when discussing ATi.

But the X2 is as much of a 'flop' in that category as the 280 is/was, and this is where the variable of TWO GPUs DOES matter.

Two GPUs,
GDDR5,
How many shaders again? I can't count that high!
etc.

It offers no real-world advantage to the average consumer, or even to some not-so-average consumers. It's a piece of hardware that only 'shines' (and not by that much...) in very acute situations that most people won't encounter.

It also draws 100 W more, runs natively hotter (and twice the heat at that), and costs $100 more (which should be a moot point, but SOMEHOW price always gets involved, whether it's top-end products or not).

So...

Let's reverse the comparison.

280, single solution:
Less power, heat, and price.
Neck and neck with the X2, and at times slightly better or slightly worse, in average comparisons. It falls short of the X2 by 10-25% (is that fair, on average?) in acute or synthetic situations.

We could keep going, saying the 4870 is close to the 280 at times, and costs less, etc. etc.

The key difference is that a 280 has more real-world purpose than an X2. And from a tech standpoint, the performance of the X2, considering its horsepower, is far from impressive. Tack on the cost, heat, power, etc. and it's even less impressive, and therefore just as much of a 'dog' as the 280.

In some ways, I think both sides failed.

Nvidia should have released the 280 as 55 nm with better shaders.
ATI shouldn't have bothered with the X2, trying to attain some pointless 'crown,' and instead tried to keep the performance of their 4870/4850 without giving the finger to heat/power/efficiency.

In the end, if a 280 isn't enough for you, then an X2 won't be either. The only real-world application that demands either of these cards is Crysis, more or less, and it's sad how everyone is using THAT as a benchmark when five minutes earlier they were complaining about how poorly Crysis is coded. Yet even in Crysis the X2 will not give you that elusive 60 fps, or even a constant 40-45, unless you turn things down or off, but then that defeats the purpose. And if you run a tuned custom config for Crysis, you can get your 45+ FPS with all the eye candy with EITHER card.

Back to square one we go.

This graph pretty much sums up my understanding and perception of GPUs these days: many of them run the majority of 3D applications without fault.

The top two games are popular, modern, and have a typical requirement in terms of the power needed to run them. They are, in a word, average. All cards perform exceptionally well, easily achieving the elusive '60 fps' (or near it) requirement. The bottom two games are examples of programs that can heavily tax the same GPUs used in the previous games, but are also popular and modern, just not average, hence 'acute.'

Crysis seems self-explanatory. Good choice using ArmA, I was hoping someone would. Older engine, but the rules of GPU physics (not physics like PhysX) still apply. Lots of objects, shaders, long viewing distances, and high resolutions can result in very crippled frame rates. It's interesting how well the 4870 does, but more importantly, how well the X2 doesn't.
Any chance you know the program used to get the FPS for ArmA? I didn't know the community had actually made one yet. Well, in fact there is one, but the way ArmA works makes benchmark programs pointless.

You could load a part of the game 5 times and find that each time different textures had not loaded, therefore giving false FPS readings.

I was trying to get W1z to benchmark ArmA, only to find it was pretty much pointless. However, he said he might do it for ArmA 2 if things improve.

Here's a message I got from someone who makes a benchmark program for ArmA, explaining what the issues are:
Hi! Still a little bit surprised here, but I'll try to answer your questions thoroughly.

The biggest problem with the ArmA Mark was, and still is, the fact that (no matter what you do) you'll always get varying results - that's due to ArmA's memory management and/or LOD handling.
Another stumbling block is the thousands of different performance settings people use - very few can be bothered to set up their ArmA the way someone else told them to - though maybe that's not so important for an isolated benchmark.

As for updates: sadly there aren't any - but despite some wrong text (it says OFP instead of ArmA) in the mission header and briefing, there should be no major flaws or showstoppers.

Just be advised that sometimes ArmA behaves weirdly -
Back in OFP it was strongly advised to let the benchmark run through first, then restart it and let it run again to get a more comparable score (precaching objects helped a lot) - but in ArmA that doesn't help any longer. So sometimes the result will be influenced in a negative or positive way because some textures didn't show up right from the start, or in another case some AI would decide not to cross a bridge at first (hence bringing the results down due to the longer time needed for a part of the test), and so on. But since all of those are ArmA 'features,' nothing can be done about it from our side.

Coming to an end, I'm still surprised, but I absolutely love your initiative!
It would be great to see ArmA being used for serious hardware tests - because we all know it's the most demanding game ever. So yeah, please go ahead!

Cheers,
burns
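For what it's worth, the standard way to tame that kind of run-to-run variance is to script the whole thing: throw away a warm-up pass, repeat several times, and report the spread rather than a single number. A minimal sketch (Python; `run_arma_benchmark` and the command line it calls are hypothetical stand-ins for whatever actually launches the mission and parses the FPS out of its log):

```python
import statistics
import subprocess

WARMUP = 1    # discard the first pass; precaching/LOD loading skews it
REPEATS = 5   # scored passes

def run_arma_benchmark() -> float:
    """Hypothetical stand-in: launch the ArmA benchmark mission and
    return its average FPS. The command line below is illustrative only;
    replace it with the real launcher and log parser."""
    result = subprocess.run(["arma.exe", "-benchmark"],
                            capture_output=True, text=True, check=True)
    return float(result.stdout.strip())  # assumes one FPS figure per run

scores = [run_arma_benchmark() for _ in range(WARMUP + REPEATS)][WARMUP:]
print(f"FPS: {statistics.mean(scores):.1f} +/- {statistics.stdev(scores):.1f} "
      f"over {REPEATS} runs")
# A spread that's large relative to the mean means the numbers aren't
# comparable between cards -- exactly the problem burns describes.
```

It doesn't fix the texture/AI randomness burns mentions, but at least it makes the noise visible instead of hiding it behind a single run.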
#89
Tatty_Two
Gone Fishing
Wile E: All the charts I have seen show the X2 winning by a fair percentage more often than it loses to a GTX.

With 280s dipping as low as $420 on Newegg, it probably does take the price/perf crown now, but that wasn't the discussion here. The discussion turned into merely who had the fastest card, nothing more.

The fact remains the fastest card is the 4870X2.

Practical or not, I wish I could have 2 of them for my Xfire board. lol.

I also wouldn't mind having 2 280s for my AMD rig (now that is truly overkill with its 1440x900 monitor. lol.)
Actually, you said the fastest card... I said the fastest GPU :p
#90
Wile E
Power User
Tatty_One: Actually, you said the fastest card... I said the fastest GPU :p
OK fine, fair enough. :laugh:
#91
ktr
Meh, it's just a cycle. Nvidia has a few years of the best GPUs, then ATI has a few years of the best GPUs... and so forth. The same goes for AMD and Intel.
#92
Tatty_Two
Gone Fishing
Wile E: OK fine, fair enough. :laugh:
But you were right!