Thursday, September 20th 2018
NVIDIA Stock Falls 2.1% After Turing GPU Reviews Fail to Impress Morgan Stanley
NVIDIA's embargo on their Turing-based RTX 2080 and RTX 2080 Ti ended Wednesday, September 19, and it appears that enthusiasts were not the only ones left wanting more from these graphics cards. In particular, Morgan Stanley analyst Joseph Moore shared a note with clients today (Thursday, September 20) saying "As review embargos broke for the new gaming products, performance improvements in older games is not the leap we had initially hoped for. Performance boost on older games that do not incorporate advanced features is somewhat below our initial expectations, and review recommendations are mixed given higher price points." NVIDIA Corporation shares on the NASDAQ had closed at $271.98 (USD) on Wednesday, tumbled to a low of $264.10 shortly after today's open, and then recovered to close at $266.28, down 2.1% from the previous close.
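For anyone who wants to check the arithmetic, here is a minimal sketch (Python, using only the prices quoted above) of how the quoted drop works out:

```python
# Back-of-the-envelope check of the reported move, using the prices quoted above.
wednesday_close = 271.98  # USD, September 19 close
thursday_low = 264.10     # intraday low after Thursday's open
thursday_close = 266.28   # USD, September 20 close

close_to_close = (thursday_close - wednesday_close) / wednesday_close * 100
intraday_dip = (thursday_low - wednesday_close) / wednesday_close * 100

print(f"Close-to-close change: {close_to_close:+.1f}%")      # ~ -2.1%
print(f"Intraday low vs. prior close: {intraday_dip:+.1f}%")  # ~ -2.9%
```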
The Morgan Stanley report further mentioned that "We are surprised that the 2080 is only slightly better than the 1080ti, which has been available for over a year and is slightly less expensive. With higher clock speeds, higher core count, and 40% higher memory bandwidth, we had expected a bigger boost." Accordingly, the analyst expects slower adoption of these new GPUs and does not anticipate "much upside" from NVIDIA's gaming business unit for the next two quarters. Despite all this, Morgan Stanley remains bullish on NVIDIA and maintains a long-term price target of $273.
Source:
CNBC
96 Comments on NVIDIA Stock Falls 2.1% After Turing GPU Reviews Fail to Impress Morgan Stanley
I know, it's just a joke, but honestly, it wouldn't surprise me at all.
It would be virtually the same as what they did back when they were forced to comply with the settlement over that GTX 970 3.5+0.5 GB issue.
The very day they had to comply to end that class-action lawsuit (and compensate every affected owner with their share of it, $35 USD each) to get the issue off the table, Jensen – the green-leather-jacket-wearing Hobbit™ – decided to show everyone the bold, naked F and dropped the GTX 970 that very same day, making it legacy. It's literally like they punished their own customers for even daring to question nVidia's God-given right to screw over its user base and rob its customers with rather fancy prices … Absolutely, yes.
At this moment they're left with huge stocks of the older GTX 10x0 generation – really vast amounts of those cards.
There are rumours that this stock runs up to a million units (sic!), as even a single major AIB alone had to return 300,000+ cards – all down to that mining boom. Seems they just got way too greedy and misjudged demand by a wide margin.
Just imagine 1M cards at, let's say, $500 each? That would be half a billion in inventory. Half a fucking billion.
I'm pretty sure nVidia is NOT going to write those off as a classic loss-making business. nVidia is not going to waste such an opportunity to sell their older cards at MSRP.
Not going to happen – nVidia is just way too greedy to let such a chance to inflate profits slip by.
The technical or analytical side is nonexistent. The stated facts are almost completely wrong:
- Higher clock speeds: 1480/1582 MHz vs 1515/1710 MHz. Yes, on the spec sheet – not necessarily so in reality, as GPU Boost 3.0/4.0 obfuscates things.
- Higher core count: 3584:224:88 vs 2944:184:64. Nope.
- 40% higher memory bandwidth: 484.3 GB/s vs 448.0 GB/s. Nope.
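To put rough numbers on how far off those claims are, here is a small sketch (Python) computing the relative differences from the reference specs listed above (GTX 1080 Ti vs RTX 2080):

```python
# Reference-card specs as listed above: (GTX 1080 Ti, RTX 2080).
specs = {
    "boost clock (MHz)":       (1582, 1710),
    "CUDA cores":              (3584, 2944),
    "memory bandwidth (GB/s)": (484.3, 448.0),
}

for name, (gtx_1080_ti, rtx_2080) in specs.items():
    delta = (rtx_2080 - gtx_1080_ti) / gtx_1080_ti * 100
    print(f"{name}: {gtx_1080_ti} -> {rtx_2080} ({delta:+.1f}%)")

# boost clock: +8.1%, CUDA cores: -17.9%, bandwidth: -7.5%
# i.e. the 2080 is nowhere near "40% higher memory bandwidth" than the 1080 Ti.
```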
If only they knew what was good for them – it's the perfect opportunity for them to shine.
Vega 20 is too expensive, plus it seems to have some architecture-level problems with gaming. Polaris is small. Navi is 2019/2020, so in a year, maybe.
And I bet Nvidia has a contingency plan if AMD should decide to compete in high end.
Nvidia is probably die-shrinking Turing to 7 nm next year, with whatever changes turn out to be necessary for RT, and that is what AMD will need to compete with. That will be Navi, and the timeframe is next holiday season, not sooner. I mean... do you guys actually believe that?
The fact that they felt it necessary to release the Ti version at the same time shows that they knew the 2080's rasterizing performance alone wasn't going to impress anyone.
Until DLSS is supported in pre-existing games (if developers choose to support it), not just upcoming titles, the 2080 looks bad.
In the past, a new GPU launch's main focus has been raw performance, with shiny new features being secondary.
The fact that they mainly focused on RTX tech felt like they were trying to hide something, and they were.
The 2080's performance is unremarkable unless you're using DLSS anti-aliasing.
If they wanted to impress with raw performance, which I think is what most of us care about, then DLSS is the real gem here, not RTX.
We're not fools nvidia. This launch left a bad taste in our mouths, and the stock market reflects this.
At least the reflections were ray-traced I suppose :rolleyes:
With all that money saved by buying more cards, them stocks should be up by at least 10% ...
Surely Stanley must have interpreted this the wrong way: maybe Stanley used the wrong tensor cores (Titan V's instead of RTX 2080 Ti's) and "machine learned" wrong ...
Yes, used cards and stock clearances from miners will keep used prices down and keep the 1080 Ti an excellent value proposition, but the manufacturer can't influence the second-hand market anyway.
"Much more expensive" is a temporary thing.
Remember the launch of the 1080? $599/$699 MSRP, and these sold like hotcakes for 750-800 moneys.
The 1080 Ti? $699 MSRP, and it took over a month for prices to drop below the same 750-800 moneys.
Vega 64? $499 MSRP; apart from some fairly small shipments, these sold for 700-ish moneys for a long while.
And none of these launches was really affected by the mining craze – that reached high-end cards later.
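A quick sketch (Python) of the early-launch premiums implied by those examples; the street prices are rough midpoints of the ranges quoted above, not exact figures:

```python
# (MSRP, approximate early street price) in USD, from the launch examples above.
launches = {
    "GTX 1080":    (699, 775),  # Founders Edition MSRP; "750-800 moneys" taken as ~775
    "GTX 1080 Ti": (699, 775),
    "RX Vega 64":  (499, 700),
}

for card, (msrp, street) in launches.items():
    premium = (street - msrp) / msrp * 100
    print(f"{card}: ${msrp} MSRP, ~${street} early street price ({premium:+.0f}%)")
```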
Then nobody would have complained one bit. On the contrary, even...
Just imagine all the benefits!
- You're skipping a whole true generation and the associated costs of R&D and manufacturing.
- nVidia officially postponing their 'Next Generation Mainstream GPU' based upon Volta at Hot Chips 30 wasn't by accident.
- You're able to clear your insane inventories and sell the vast stock of older cards at full MSRP, since everyone is going to say »Well, fuck off nVidia – I'm not going to buy this overpriced shit. I'll get a GTX 1080 instead!«
- You're able to establish another price increase literally just in passing.
- You're able to establish another proprietary nVidia-only standard à la PhysX/GameWorks, also just en passant … since PhysX, then Tessellation, then HairWorks, then G-Sync didn't really pay off for you, as people saw through your shitty attempts at dividing the market and tightening your closed ecosystem.
- It will work … since AMD isn't able to keep up with you anyway.
All you have had to do was bring a 'new' generation of cards with techniques that are pretty much useless and not even future-proof as of now (rather, purely hypothetical to be of any significance in the near future, or any future at all) – but with a way too high price tag to be considered a reasonable compromise or a sound purchase for the ordinary gamer. Funny enough, that's exactly what Turing represents. Coincidence? I don't like to think so. They're smart and they're greedy. Given that the above points are pretty realistic and not too far-fetched, it's reasonable at least to consider that they might be the actual case. They already fooled us on Pascal and how it would be a completely new architecture. Turned out it wasn't – just Maxwell 2.0. They've done stunts like that in the past, pretty often to be honest.
You just bring a 'new' generation that doesn't offer anything really new (hence the RT deception to hide that fact), price those cards so high that they're practically unpurchasable, so that the older generation looks like a real bargain compared to the new one – and then you just sell the old generation instead. The customer has no choice but to bite the bullet (due to the lack of any reasonable competing product); there is simply no alternative but to swallow the pill.
The best part is, you make insane profits out of all of that. Whatever the customer does, you're hearing a good ol' ka-ching all day long, every day. The even better part is that your next-generation cards with an actually new architecture will look even better compared to your last one – since you toned it down on purpose.
Indications
A pretty decent red flag (at least for me) was when it came out that DICE is going to wind down the RT effects and their usage in Battlefield V. Why? And why should that be worth a red flag, you may ask?
Well, if we consider the facts, we see that DICE had to tone down the RT effects just to reach playable frame rates … And to be honest, they were pretty transparent about the reasons why. DICE, of all studios! They're gamers and they're freaks (in a positive way), total geeks from top to toe, and they absolutely love doing such things, joyfully! They're always keen to use the newest stuff, techniques and gadgets. A studio which – together with Square Enix – is nine times out of ten the very first to adopt a new technology almost instantly (Mantle, TrueAudio, DirectX 12, Vulkan et cetera), if it has any greater potential for the future. You never have to ask them twice if it brings any benefit at all, at least from a gamer's perspective. Yes, if …
The fact that the developers' response in general is anything but euphoric, and that the overall echo (not only from DICE) on the whole RT chapter is rather noncommittal, tells me there's something wrong – way more than nVidia is ever going to tell us or acknowledge.
If even DICE doesn't really give the impression that the technology is as good as nVidia tries to convince us it is, well, who else will?
Somehow it seems to me that, apart from the insanely hyped presentation nVidia deliberately delivered, there is simply way less substance than they want us to believe – there certainly can't be any talk of a "spectacular surprise", all conviction aside. If the potential existed, at least on the hardware side, then yes, sure … Thing is, it just doesn't. 1080p at not even 60 fps – what kind of a joke is that? I think exactly that is the actual problem, if you look above …
They are neither unpurchasable nor overpriced. What many expected, and what the Turing cards fail to do, is push the price of a given performance level down. Perf/$ remains almost exactly the same: the 2080 effectively replaces the 1080 Ti, and the 2080 Ti sits 30% above it in both performance and price. And this is based purely on rasterization results. If any of the other new tech bets (RT, DLSS, variable-rate shading) succeed, that would be on top of this.
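A toy illustration (Python) of that perf/$ point; the numbers are illustrative assumptions matching the 30% figure above, not measured review data:

```python
# Illustrative only: if price and performance rise by the same factor,
# performance-per-dollar does not move. Relative units, not review data.
def perf_per_dollar(perf: float, price: float) -> float:
    return perf / price

rtx_2080    = perf_per_dollar(perf=1.00, price=1.00)  # baseline: roughly 1080 Ti level
rtx_2080_ti = perf_per_dollar(perf=1.30, price=1.30)  # ~30% faster, ~30% pricier

print(rtx_2080, rtx_2080_ti)  # both 1.0 -> perf/$ unchanged
```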
By the way, Nvidia is clearly making less profit from the 2080 than it does from the 1080 Ti. Developers' response has been measured but positive. Game devs did not have Turing cards until the very last moment before the announcement, so while they knew RT tech was coming, they had no details about what the hardware did or how it performed.
Reality: Nvidia develops GPUs for new markets (deep learning / AI / automotive / HPC), and consumers get cut-down leftovers called GeForce. A major accelerator in these new markets has been the Tensor cores. Furthermore, RT is actively used in product development and offers real advantages in that setting – it has done so for many years now.
Turing under the GeForce label is just their attempt to maximize profits on that development and find a use for the new GPUs that don't make the Quadro / Tesla / Titan 'cut'.
Anything they get along the way is bonus points to begin with. As for your assumptions:
- Nvidia does not benefit from making RT proprietary; they only benefit from making it run very well on their own hardware. Your idea of it being the next HairWorks or PhysX completely counters Nvidia's alleged push to make RT a 'thing to have'. You don't promote something and then do what you can to stop broad adoption of it. Remember: AMD still holds the console market, so if they want broad adoption, they need to include console gaming one way or another.
- As for Turing's price point compared to Pascal: yes, obviously they priced Turing such that 'for a small premium' you get similar performance to Pascal but with improved RT performance. That's just common sense when there is no competition and old stock to sell. The next move, if Turing doesn't sell at all, is a price cut once the Pascal stock is gone. Your logic of them making Turing unattainable doesn't make any sense.
- Turing can be bought; it's not like people didn't buy $700+ GPUs before.
- Nvidia isn't skipping generational R&D costs; they just don't release entire generations under a GeForce label ;) Volta never turned into a GeForce product. If you really think they stall development, you haven't paid attention. The reason Maxwell wasn't Pascal is that we were stuck on 28 nm for much longer than anyone would have liked. The reason Pascal is Pascal is that the competition had no real game plan for anything past Hawaii, which nicely coincides with the console deals AMD struck. Their focus is elsewhere and it's clear as day; Vega just confirmed it. Not Nvidia's problem, really, is it? At the same time, Pascal offered the largest generational performance boost in many years. That's quite a feat, considering they did nothing substantial to the architecture ;)
- Nvidia does not benefit from stalling, because there are massive, emerging markets where they want to secure profit and market share for the next few decades. If they lag behind, they lose all of their investments in those markets – automotive is already hanging by a thread.
Had to fix this snippet from the technology-overview conclusion. I mean, who in their right mind would praise or give Editor's Choice Awards to something that even the investors can see didn't meet expectations?
Some review sites seem to have taken on a marketing role for companies rather than being honest. It's a shame, really.