Friday, August 15th 2008
Jen-Hsun Huang (NVIDIA): ''We Underestimated RV770''
NVIDIA suffered its first red quarter in five years. There were several contributors, chief among them a write-off of up to US $200M to cover the cost of recalling and repairing faulty mobile graphics processors.
Another factor has been a refreshed product lineup from competitor AMD/ATI that takes on NVIDIA products in the mid-range, high-end, and enthusiast segments of the market. In essence, ATI now has products to counter NVIDIA at every possible segment, with more on the way.
Seeking Alpha spoke with CEO Jen-Hsun Huang, who was quoted saying:
Source:
Seeking Alpha
We underestimated the price performance of our competitor's most recent GPU, which led us to mis-position our fall lineup. The first step of our response was to reset our price to reflect competitive realities. Our action put us again in a strong competitive position, but we took hard hits with respect to our overall GPU ASPs and ultimately to our gross margins. The price action was particularly difficult since we are just ramping 55-nanometer, and the weak market resulted in taking longer than expected to work through our 65-nanometer inventory.

Huang says that with the transition to the 55 nm silicon fabrication process, the company hopes to do better.
92 Comments on Jen-Hsun Huang (NVIDIA): ''We Underestimated RV770''
To sit here and claim that ATI will retain the lead is just silly. There is absolutely no way to predict that.
Only the fanboys lose, because they stay loyal to one side. The situation now is the same as it was with the 5900 Ultra and the 9800 Pro/XT. NVIDIA will take a year or so to recover as usual, then I guess they will release something better than ATI (read AMD), because ATI tends to launch a real monster once in a while that leaves NV down in the bushes, but then ATI starts to lag behind, coasting on its old architecture. And then boom, NV comes back with something better, because they were working their asses off to catch ATI's beast.
It's a great show, go watch it :D
But the X2 is as much of a 'flop' in that category as the 280 is/was, and this is where the variable of TWO gpus DOES matter.
Two GPUs,
GDDR5,
How many shaders again? I can't count that high!
etc.
It offers no real-world advantage to the average consumer, or even to some not-so-average consumers. It's a piece of hardware that only 'shines' (and not by much) in very acute situations that most people won't encounter.
It also draws 100 more watts, runs natively hotter (and puts out twice the heat at that), and costs $100 more (which should be a moot point, but somehow price always gets involved, whether it's a top-end product or not).
So...
Let's reverse the comparison.
280, single solution
Less power, heat and price.
Neck and neck with the X2 in average comparisons, at times slightly better or slightly worse. It falls 10-25% short of the X2 (is that fair, on average?) in acute or synthetic situations.
We could keep going, saying the 4870 is close to the 280, at times, and costs less and etc.etc.
The key difference is that a 280 has more real-world purpose than an X2. And from a tech standpoint, the performance of the X2, considering its horsepower, is far from impressive. Tack on the cost, heat, power, etc., and it's even less impressive, and therefore just as much of a 'dog' as the 280.
In some ways, I think both sides failed.
Nvidia should have released the 280 as 55nm with better shaders.
ATI shouldn't have bothered with the X2, chasing some pointless 'crown'; instead it should have tried to keep the performance of the 4870/4850 without giving the finger to heat, power, and efficiency.
In the end, if a 280 isn't enough for you, then an X2 won't be either. The only real-world application that demands either of these cards is Crysis, more or less, and it's sad that everyone is using THAT as a benchmark when, five minutes earlier, they were complaining about how poorly Crysis is coded. Yet even in Crysis the X2 will not give you that elusive 60 fps, or even a constant 40-45, unless you turn things down or off, which defeats the purpose. But if you run a tuned custom config for Crysis, you can get your 45+ FPS with all the eye candy with EITHER card.
Back to square one we go.
This graph pretty much sums up my understanding and perception of GPUs these days, in that many of them run the majority of 3d applications without fault.
The top two games are popular, modern, and have an ordinary requirement in terms of the power needed to run them. They are, in a word, average. All cards perform exceptionally well, easily achieving (or nearing) the elusive '60 fps' requirement. The bottom two games are examples of programs that can heavily tax the same GPUs used in the previous games; they are also popular and modern, just not average, hence 'acute.'
Crysis seems self-explanatory. Good choice using ArmA, I was hoping someone would. It's an older engine, but the rules of GPU load (not physics like PhysX) still apply: lots of objects, shaders, long viewing distances, and high resolutions can result in very crippled frame rates. It's interesting how well the 4870 does, but more importantly how well the X2 doesn't.
I think flagship cards are for gathering our attention. They have to have a purpose or else companies wouldn't compete for the strongest product.
The unfortunate thing about flagship cards is that they attract people in two ways, there's the:
WTFBBQSAUCE pwnzerz - bragging rights and I want the best!
and then the
SynthetiX4Life benchers
And this IS unfortunate, because the first type should be pointless and irrelevant. The second type, the benchers, pit themselves against technological odds in order to achieve some 'goal.' They are using GPUs (made primarily for games) in order to benchmark.
If benchmarking were done with programs that use lots of vertices (things like CAD, cinematics, design tools, etc.), then benchers would have to use Quadro-type GPUs, which I would much prefer, as that has less to do with gaming and more to do with pure horsepower (of a different type), accuracy, and things of an acute and statistical nature.
Right now, if I think about it, neither the GTX 280 nor the 4870 X2 is practical at all, and even the GTX 260 and 4870 are a stretch. The 9800 GTX+ and the 4850 seem like much better buys, as they can play everything out there at nice detail settings and can be run in pairs, and sometimes triples, for less than the next card up. The flagships may become more useful in a year or so, when games can tap into their power, but right now I'm cruising on a 9600 GT and have yet to find a complaint.
With 280s dipping as low as $420 on Newegg, it probably does take the price/performance crown now, but that wasn't the discussion here. The discussion turned into merely who had the fastest card, nothing more.
The fact remains the fastest card is the 4870X2.
Practical or not, I wish I could have two of them for my CrossFire board. lol.
I also wouldn't mind having two 280s for my AMD rig (now that would be truly overkill with its 1440x900 monitor. lol.)
TNT2
Geforce2 MX400
Geforce3 Ti 200
Geforce4 Ti 4200
Geforce FX5600
Geforce FX5700
Geforce FX5900XT
Geforce 6600GT
Geforce 6800GS
Geforce 7600GT
Geforce 7900GS
Geforce 8600GTS
Geforce 8800GS
Geforce 9600GT
It seems to me that since 1999 NVIDIA has covered the midrange well. You could argue the FX cards lose to the Radeon 9600, but does anyone actually remember those days? It was the time of DX8, when DX9 wasn't really being used to its potential. The FX cards kept up, and the Radeon 9600 struggles just as much in Far Cry or HL2 as the FX midrange does. The 8600GTS, while not faster than the old high-end, doesn't seem like a real issue either: it offered 7950GT performance plus DX10 support, so where is the problem? Now let's look at ATI's midrange, and tell me who tends to have the best midrange:
Radeon 7500
Radeon 8500Le
Radeon 9500
Radeon 9600
Radeon 9800SE
Radeon x600
Radeon x700
Radeon x800GT
Radeon x800GTO
Radeon x1600
Radeon x1650
Radeon x1800GTO
Radeon HD2600
Radeon HD36x0
Radeon HD3850
So in the sub-$200 market, who had the best cards at launch? Let me remind you of a few things again. The X600 went up against the 6600GT at first, which it couldn't compete with, and later the X700 Pro couldn't keep up either. They made the X800GT and GTO to compete with the 6800GS, but the 6800GS was once again faster. The X1600 was a joke; the X1650 was also a joke, save the X1650XT, but by the time it came out the 7900GS was the same price, and the X1800GTO lost to the 7600GT most of the time. The HD2600 cards couldn't keep up with the 8600s, and the HD36x0 didn't help. The HD3850 was a good midrange card until the 8800GS showed up, followed by the 9600GT.
In truth, the good ATI midrange cards look like this:
Radeon 9500
Radeon 9600
Radeon HD3850
NVIDIA had the faster midrange at launch every other time.
You could load a part of the game five times and find that each time different textures had not loaded, therefore giving off false FPS readings.
I was trying to get W1z to benchmark ArmA, only to find it was pretty much pointless. However, he said he might do it for ArmA 2 if things improve.
Here's a message I got from someone who writes a benchmark program for ArmA, explaining what the issues are.