Thursday, September 20th 2018

NVIDIA Stock Falls 2.1% After Turing GPU Reviews Fail to Impress Morgan Stanley

NVIDIA's embargo on their Turing-based RTX 2080 and RTX 2080 Ti ended Wednesday, September 19, and it appears that enthusiasts were not the only ones left wanting more from these graphics cards. In particular, Morgan Stanley analyst Joseph Moore shared a note today (Thursday, September 20) with company clients saying "As review embargos broke for the new gaming products, performance improvements in older games is not the leap we had initially hoped for. Performance boost on older games that do not incorporate advanced features is somewhat below our initial expectations, and review recommendations are mixed given higher price points." NVIDIA Corporation shares on the NASDAQ exchange had closed at $271.98 (USD) on Wednesday, opened lower today at a low of $264.10, and recovered to close at $266.28, down 2.1% from the previous close.

The Morgan Stanley report further mentioned that "We are surprised that the 2080 is only slightly better than the 1080ti, which has been available for over a year and is slightly less expensive. With higher clock speeds, higher core count, and 40% higher memory bandwidth, we had expected a bigger boost." Accordingly, the analyst expects slower adoption of these new GPUs and does not see "much upside" from NVIDIA's gaming business unit for the next two quarters. Despite all this, Morgan Stanley remains bullish on NVIDIA and maintains a $273 long-term price target.
Source: CNBC

96 Comments on NVIDIA Stock Falls 2.1% After Turing GPU Reviews Fail to Impress Morgan Stanley

#26
techy1
DeathtoGnomesThis makes me wonder, after sufficient "old stock" is cleared out, I bet Nvidia will come out with a driver update that will increase performance by at least 25%.
or Nvidia will come out with a driver update that will decrease performance (for Pascal and older) by at least 25%
Posted on Reply
#27
Prima.Vera
megamanxtremeHow did that turn out?
Just like in Apple's case of slowing down older phones with each iOS upgrade....nothing. Nothing happened. And nothing will, since apparently and unfortunately there are no laws to prevent this kind of pure scumbag behavior.
Posted on Reply
#28
GreiverBlade
ahahah I knew it!
GreiverBlademmhhh not much impressed, for 1440p~
well technically ... new name and all, new tech and all (worse ... perf loss using it ... at 1080p :laugh: ) but the same perf jump as if it were a GTX 1180 vs GTX 1080 ... the price doesn't follow suit, though ... small leap in perf, huge boost in price :roll:
Prima.VeraNah. They will come with a driver update to "increase" the performance of previous generation(s) with -25% ;), going Apple style.
Irony is, it won't be the first time they'll do that too...
well, that confirms my plan to replace my 1070 with ... not a 1080/1080 Ti, but a Vega 64 ... once the prices stabilize (finally seeing some prices going down) if they do that (ohhh and they will probably do it)
Posted on Reply
#29
Xaled
FordGT90Concept2.1% is it? Expected a bigger drop.
nVidia bribed almost every "bribable" market analyst, youtuber, and reviewer to create a fake positive atmosphere, but they couldn't bribe respected ones like Morgan Stanley ... that may delay the impact of the coming storm, but it just can't stop it
Posted on Reply
#30
Smartcom5
DeathtoGnomesThis makes me wonder, after sufficient "old stock" is cleared out, I bet Nvidia will come out with a driver update that will increase performance by at least 25%.
You mean "I bet Nvidia will come out with a driver update that will decrease performance by at least 25%." for the (by then) older generation – just to ensure sales of the RTX 2xxx generation go up again afterwards (since it will be pretty old by then, too).

I know, it's just a joke, but honestly, it wouldn't surprise me at all.
It would be virtually the same thing they did back when they were forced to comply with a settlement over that GTX 970 3.5+0.5 GB issue.

The very same day they had to comply to end that class-action lawsuit (and compensate all affected owners by paying them $35 each) to get the issue off the table, that very particular day, Jensen the green-leather-jacket-wearing Hobbit™ decided to show everyone the bold, naked F: as they were forced to comply with the GTX 970 owners' settlement, they dropped the GTX 970 that very same day, making it legacy.

It's literally like they punished their own customers for even daring to question nVidia's God-given right to screw over their user base and rob their customers with rather fancy prices …
ZubasaThat's the thing, don't buy the 2080 but the 1080 Ti instead.
Don't buy nVidia by buying nVidia. :laugh:
So in the end it doesn't really matter for Huang.
Either pay for an overpriced GPU, or help clear out old stock of 2-year-old cards that are still over MSRP.
Absolutely, yes.
At this moment they're left with huge stocks of the older GTX 10x0 generation – really vast amounts of cards.
There are rumours that their stock is actually up to a million (sic!), as a single major AIB alone had to return 300,000+ cards, all down to that mining boom. Seems they just got way too greedy and misjudged demand by a wide margin.

Just imagine 1M cards at, let's say, $500 each? That would be half a billion in inventory. Half a fucking billion.
I'm pretty sure nVidia is NOT going to write those off as a classic loss. nVidia is not going to waste such an opportunity to sell their older cards at full MSRP.
Not going to happen; nVidia is just way too greedy to let an opportunity to inflate profit slip by.
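A quick sanity check of the figure above – both the one-million-card count and the $500 average are the post's own rough assumptions, not confirmed numbers:

```python
# Back-of-the-envelope check of the inventory claim above.
# Both inputs are the post's own rough assumptions, not confirmed figures.
cards_in_stock = 1_000_000      # rumoured unsold Pascal cards
avg_price_usd = 500             # assumed average value per card

inventory_value = cards_in_stock * avg_price_usd
print(f"${inventory_value:,}")  # → $500,000,000 (half a billion)
```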
Posted on Reply
#31
eidairaman1
The Exiled Airman
Smartcom5You mean "I bet Nvidia will come out with a driver update that will decrease performance by at least 25%." for the (by then) older generation – just to ensure sales of the RTX 2xxx generation go up again (since it will be pretty old by then, too).

I know, it's just a joke, but honestly, it wouldn't surprise me at all.
It would be virtually the same thing they did back when they were forced to comply with a settlement over that GTX 970 3.5+0.5 GB issue.

The very same day they had to comply to end that class-action lawsuit (and compensate all affected owners by paying them $35 each) to get the issue off the table, that very particular day, Jensen the green-leather-jacket-wearing Hobbit™ decided to show everyone the bold, naked F: as they were forced to comply with the GTX 970 owners' settlement, they dropped the GTX 970 that very same day, making it legacy.

It's literally like they punished their own customers for even daring to question nVidia's God-given right to screw over their user base and rob their customers with rather fancy prices …

Absolutely, yes.
At this moment they're left with huge stocks of the older GTX 10x0 generation – really vast amounts of cards.
There are rumours that their stock is actually up to a million (sic!), as a single major AIB alone had to return 300,000+ cards, all down to that mining boom. Seems they just got way too greedy and misjudged demand by a wide margin.

Just imagine 1M cards at, let's say, $500 each? That would be half a billion in inventory. Half a fucking billion.
I'm pretty sure nVidia is NOT going to write those off as a classic loss. nVidia is not going to waste such an opportunity to sell their older cards at full MSRP.
Not going to happen; nVidia is just way too greedy to let an opportunity to inflate profit slip by.
I heard NV is trying to get the game devs to reduce the number of rays in games to make it seem like it's worth it when it hardly is
Posted on Reply
#32
londiste
"We are surprised that the 2080 is only slightly better than the 1080ti, which has been available for over a year and is slightly less expensive," he said. "With higher clock speeds, higher core count, and 40% higher memory bandwidth, we had expected a bigger boost."
Aaannnd... feel free to disregard anything they say. Market sentiment, sure – it appears to be nicely based on pure emotion.

The technical or analytical side is nonexistent. These are almost completely incorrect facts:
- Higher clock speeds - 1480/1582 vs 1515/1710. Yeah, in specs. Not necessarily so much in reality, as Boost 3.0/4.0 obfuscate things.
- Higher core count - 3584:224:88 vs 2944:184:64. Nope.
- 40% higher memory bandwidth - 484.3 vs 448.0. Nope.
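The three claims can be checked in a few lines; the numbers are the reference-card specs quoted in the list above (GTX 1080 Ti vs RTX 2080):

```python
# Checking the three Morgan Stanley claims against the reference-card
# specs quoted above (GTX 1080 Ti vs RTX 2080).
specs = {
    # metric: (GTX 1080 Ti, RTX 2080)
    "boost clock (MHz)":       (1582, 1710),
    "shader cores":            (3584, 2944),
    "memory bandwidth (GB/s)": (484.3, 448.0),
}

for metric, (old, new) in specs.items():
    change = (new / old - 1) * 100
    print(f"{metric}: {change:+.1f}% (2080 vs 1080 Ti)")
```

Only the clock-speed claim survives (+8.1% on paper); core count is down 17.9% and memory bandwidth is down 7.5% on the 2080, not up 40%.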
Posted on Reply
#33
StrayKAT
AMD needs to just play it straight and focus on raw performance..

If they knew what was good for them. It's the perfect opportunity for them to shine.
Posted on Reply
#34
londiste
StrayKATAMD needs to just play it straight and focus on raw performance..
If they knew what was good for them. It's the perfect opportunity for them to shine.
They do not have enough time. Pulling a new chip out of the blue would take half a year, at least.
Vega 20 is too expensive, plus it seems to have some gaming problems architecture-wise. Polaris is small. Navi is 2019/2020, so in a year, maybe.
And I bet Nvidia has a contingency plan if AMD should decide to compete in the high end.

Nvidia is probably die-shrinking Turing to 7nm next year with whatever changes turn out to be necessary for RT, and that is what AMD needs to compete with. That will be Navi, and the timeframe is next holiday season, not sooner.
DeathtoGnomesThis makes me wonder, after sufficient "old stock" is cleared out, I bet Nvidia will come out with a driver update that will increase performance by at least 25%.
ZubasaThe best they can do is screw over the old cards' performance :roll:
Prima.VeraNah. They will come with a driver update to "increase" the performance of previous generation(s) with -25% ;), going Apple style.
techy1or Nvidia will come out with a driver update that will decrease performance (for Pascal and older) by at least 25%
I mean... do you guys actually believe that?
Posted on Reply
#35
Candor
Clearly nvidia is counting on DLSS to widen the gap between the 1080 Ti and the 2080.

The fact that they felt it necessary to release the Ti version at the same time shows that they knew the 2080's rasterizing performance alone wasn't going to impress anyone.

Until DLSS is supported in pre-existing games (if developers choose to support it), not just upcoming titles, the 2080 looks bad.

In the past, a new GPU launch's main focus has been about raw performance, shiny new features being secondary.

The fact that they mainly focused on RTX tech felt like they were trying to hide something, and they were.

The 2080's performance is unremarkable, unless you're using DLSS anti-aliasing.

If they wanted to impress with raw performance, which I think is what most of us care about, then DLSS is the real gem here, not RTX.

We're not fools nvidia. This launch left a bad taste in our mouths, and the stock market reflects this.

At least the reflections were ray-traced I suppose :rolleyes:
Posted on Reply
#36
HTC
How can this be?

With all that money saved by buying more cards, them stocks should be up by at least 10% ...

Surely Stanley must have interpreted this the wrong way: maybe Stanley used the wrong tensor cores (Titan V's instead of RTX 2080 Ti's) and "machine learned" wrong ...
Posted on Reply
#37
techy1
2.1% is nothing (stocks can move more than that without any news or analyst input). But this RTX could be more than just a one-generation failure for Nvidia, because PC gaming is at its lowest: since that mining craze there are very few gamers who want to buy or upgrade hardware. And now they face 2.5-year-old Pascal, or RTX (aka performance/price even worse than Vega) that can barely manage 4K60, costs $900+, and could only manage 1080p at 40 fps in RTX demos (which in real life won't exist for this gen). Most of these gamers say, "F it, I'll buy a $400 console." As for the bright Ray Traced Future potential – there is NONE, at least for this gen and the next. No devs will do anything for 0.1% of the small PC market (an RTX-capable card – let's assume the RTX 2080 – costs from $900; there will never be a lot of people with $800-900 GPUs, even if those wiped the floor with the GTX 1080 Ti). When an RTX 3060 (or 4060) costs $250 and can run RTX at 1440p high frame rates or at 4K, then there will be a lot of RTX-capable hardware out there, and only then will devs start to implement and optimize ray tracing. But I see that Nvidia just wants to milk this weak PC gaming cow till it dies, and then Nvidia will go full AI, I guess.
Posted on Reply
#38
londiste
CandorClearly nvidia is counting on DLSS to widen the gap between the 1080ti and the 2080.
Not directly, no. The 2080 and 1080 Ti are pretty evenly matched. Once the launch craziness subsides, even the prices should even out.
Yes, used cards and stock clearances from miners will keep used prices down and make the 1080 Ti an excellent value proposition, but the manufacturer can't affect the second-hand market anyway.
Posted on Reply
#39
Xaled
londisteAaannnd... feel free to disregard anything they say. Market sentiment, sure. It appears to be nicely based on pure emotion.

Technical or analytical side is nonexistent. This is almost completely incorrect facts.
- Higher clock speeds - 1480/1582 vs 1515/1710. Yeah, in specs. Not necessarily so much in reality as Boost 3/4 obfuscate things.
- Higher core count - 3584:224:88 vs 2944:184:64. Nope.
- 40% higher memory bandwidth - 484.3 vs 448.0. Nope.
So you are accepting that nVidia made a worse card that is much more expensive?
Posted on Reply
#40
RejZoR
Morgan Stanley doesn't seem to understand how the graphics industry works. The new high end is always about equal to the previous year's top of the line, and the new current-year top of the line surpasses all of it by at least 20%. The RTX 2000 series is no exception: the RTX 2080 is as fast as the GTX 1080 Ti, and the RTX 2080 Ti is faster than both. So, nothing out of the ordinary here. The actual issue is the pricing. Positioning the RTX 2080 250€ higher because it has some ray tracing is just absurd when performance-wise it's the same as the 250€ cheaper GTX 1080 Ti. That's where the real issue is. And the same goes for the RTX 2080 Ti. Sure, it has no direct competitor, but that doesn't mean they can price it to infinity. And that's probably where Morgan Stanley's objections lie. Every year the top-end cards were 800€, and now it's all of a sudden 1200€. Sure, there are people who were throwing money at it before anyone even released performance numbers, but we can all be sure the number of such people is way lower than the number willing to shell out 800€...
Posted on Reply
#41
londiste
XaledSo you are accepting that nVidia made a worse card that is much more expensive?
I would not say it is a worse card. They are about the same. MSRP is the same as well at $699.
Much more expensive is a temporary thing.

Remember the launch of 1080? $599/$699 MSRP and these sold like hotcakes for 750-800 moneys.
1080Ti? $699 MSRP and it took over a month to get prices below the same 750-800 moneys.
Vega64? $499 MSRP, besides some fairly small shipments these sold for 700-ish moneys for a long while.
And none of these launches were really affected by mining craze. That reached high end cards later.
Posted on Reply
#42
Prima.Vera
Actually, the 2080 should have been the 2070 of this generation, and at least $250 cheaper. At least.
Then nobody would have complained a bit. On the contrary, even...
Posted on Reply
#43
ianatikin
Three years ago their stock was worth a tenth of today's – some 25 bucks. Jensen is still laughing in your face on the way to his bank.
Posted on Reply
#44
Smartcom5
Midland Dogturing was meant for 7nm just as maxwell was meant for 16 (14 or whatever, same shit)
I doubt that. To me it seems they just slapped some Tensor cores on Pascal to have a multitude of opportunities … and made Turing way too overpriced and practically unpurchasable on purpose.

Just imagine all the benefits!
  • You skip a whole true generation and its R&D and manufacturing costs
  • nVidia officially postponed their 'Next Generation Mainstream GPU' based upon Volta at Hot Chips 30 – that wasn't an accident
  • You get to clear your insane inventories and sell the vast stock of older cards at full MSRP
    … since everyone is going to say »Well, fuck off nVidia – I'm not going to buy this overpriced shit. I'll get a GTX 1080 instead!«
  • You get to establish another price increase literally just in passing
  • You get to establish another proprietary nVidia-only standard à la PhysX or GameWorks, also just en passant
    … since PhysX, then Tessellation, then HairWorks, then G-Sync didn't really pay off for you, as people saw through your shitty attempts at dividing the market and tightening your closed ecosystem
  • It will work
    … since AMD isn't able to keep up with you anyway.
All you had to do was bring out a 'new' generation of cards with techniques that are pretty much useless and not even futureproof as of now (but rather purely hypothetical to be of any significance in the near future, or any future at all) – but with a price tag way too high to be considered a reasonable compromise or a sound purchase for an ordinary gamer. Funny enough, that's exactly what Turing represents. Coincidence? I don't like to think so.

They're smart and they're greedy. Given that the above points are pretty realistic and not too far-fetched, it's reasonable at least to think about them, or even consider that they might be the actual case. They already fooled us on Pascal and how it would be a completely new architecture. Turned out it wasn't, but just Maxwell 2.0. They have done stunts like that in the past, pretty often to be honest.

You just bring out a 'new' generation that doesn't bring anything really new (hence the RT deception to hide that fact), make those cards effectively unpurchasable through an inflated price tag so that the older generation looks like a real bargain compared to the new one – and then you just sell the old generation instead. The customer has no other choice but to bite the bullet (due to the lack of any reasonable competing product) – there is simply no alternative but to swallow the pill.

The best part is, you make insane profits out of all of that. Whatever the customer does, you're listening to a good ol' Ka-Ching all day long, every day. The even better part is that your next-generation cards, with an actual new architecture, will look even better compared to your last one – since you toned it down on purpose.

Indications
A pretty big red flag (at least for me) went up when it came out that DICE is going to wind down the RT effects and their usage in Battlefield V. Why? And why should that be worth a red flag, you may ask?

Well, if we consider the facts, we see that DICE had to tone down the RT effects just to reach playable frame rates … And to be honest, they were pretty transparent about the reasons why. DICE, of all studios! They're gamers and they're freaks (in a positive way), total geeks from top to toe, and they absolutely love to do things like this, joyfully! They're always keen to use the newest stuff, techniques, and gadgets. A studio which – together with Square Enix – nine times out of ten is among the very first to adopt a new technology almost instantly (Mantle, TrueAudio, DirectX 12, Vulkan, et cetera), if it has any greater potential for the future. You never have to ask them twice if it brings any benefit, at least from a gamer's perspective. Yes, if …

The fact that the developers' response in general is anything but euphoric, and that the overall echo (not only from DICE) on the whole RT chapter is rather noncommittal, tells me something is wrong – way more than nVidia is ever going to tell us or acknowledge.

If even DICE doesn't really give the impression that the technology is as good as nVidia tries to convince us it is, well, who else will?
Somehow it seems to me that, apart from the insanely hyped presentation nVidia deliberately delivered, there is simply way less substance than they want us to believe – there can certainly be no talk of a "spectacular surprise", all conviction aside. If, at least on the hardware side, the potential existed, then yes, sure … Thing is, it just doesn't. 1080p and not even at 60 fps. What kind of a joke is that?
eidairaman1I heard nv is trying to get the game devs to reduce the amount of rays in games to make it seem like it is worth it when it hardly is not
I think exactly that seems to be the actual problem, if you look above …
Posted on Reply
#45
Vya Domus
Considering how over inflated their share price is this is nothing.
StrayKATIf they knew what was good for them.
They do know what is good for them and it turns out it's not high-end consumer GPUs.
Posted on Reply
#46
londiste
Smartcom5I doubt that. For me it seems they just slammed some Tensor-cores on Pascal to have a multitude of opportunities …
For one thing, the compute units in Turing are clearly based on Volta, not Pascal. Volta already had Tensor cores; RT cores are what was added.
Smartcom5You're able to establish another proprietary nVidia-only standard à la PhysX GameWorks, also just en passant
DX12's DXR and Vulkan's Raytracing extensions disagree with what you are saying.
Smartcom5You just bring a 'new' generation that doesn't bring anything real new (hence the RT-deception to hide that fact), make it so that those cards are in fact technically un·purchasable in terms of inflated price tag, that the older generation must seem to look like a real bargain compared to the new ones – and then you're just sell the old generation instead. The customer has no other choice but to bite the bullet (due to lack of any reasonable competition's product) - there is simply no alternative but to swallow the pill.
Why is RT not new?
They are neither unpurchasable nor overpriced. What many expected and Turing cards fail to do is push the price point of the performance level down. Perf/$ remains pretty exactly the same. 2080 effectively replaces 1080Ti and 2080Ti sits 30% above it, both in performance and price. And this is purely based on raster units and results. If any other new tech bets (RT, DLSS, variable-rate shading) should succeed, that would be on top of this.
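The perf/$ point can be illustrated with the post's own rough numbers ($699 MSRP, ~30% price and performance uplift for the 2080 Ti – both figures are the post's estimates, not measured data):

```python
# Illustrating the perf/$ argument above with the post's own rough numbers:
# if price and performance both rise ~30%, performance per dollar is unchanged.
price_2080, perf_2080 = 699.0, 1.00     # $699 MSRP, normalised performance
price_ti, perf_ti = 699.0 * 1.30, 1.30  # ~30% above on both axes

print(round(perf_2080 / price_2080, 5))  # both lines print 0.00143
print(round(perf_ti / price_ti, 5))
```

Any uplift from RT, DLSS, or variable-rate shading would improve that ratio only if those features deliver in practice.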

By the way, Nvidia is clearly making less profit from 2080 than it does from 1080Ti.
Smartcom5The fact that the whole response of the developers in general is anything but euphoric and also the overall corresponding echo (not only from DICE) on the whole chapter RT is rather noncommital, tells me there's something wrong.
Developers' response has been measured but positive. Game devs did not have Turing cards until the very last moment before the announcement, so while they knew RT tech was coming, they had no details about what the hardware did or how it performed.
Posted on Reply
#47
Vayra86
Smartcom5I doubt that. To me it seems they just slapped some Tensor cores on Pascal to have a multitude of opportunities … and made Turing way too overpriced and practically unpurchasable on purpose.

Just imagine all the benefits!
  • You skip a whole true generation and its R&D and manufacturing costs
  • nVidia officially postponed their 'Next Generation Mainstream GPU' based upon Volta at Hot Chips 30 – that wasn't an accident
  • You get to clear your insane inventories and sell the vast stock of older cards at full MSRP
    … since everyone is going to say »Well, fuck off nVidia – I'm not going to buy this overpriced shit. I'll get a GTX 1080 instead!«
  • You get to establish another price increase literally just in passing
  • You get to establish another proprietary nVidia-only standard à la PhysX or GameWorks, also just en passant
    … since PhysX, then Tessellation, then HairWorks, then G-Sync didn't really pay off for you, as people saw through your shitty attempts at dividing the market and tightening your closed ecosystem
  • It will work
    … since AMD isn't able to keep up with you anyway.
All you had to do was bring out a 'new' generation of cards with techniques that are pretty much useless and not even futureproof as of now (but rather purely hypothetical to be of any significance in the near future, or any future at all) – but with a price tag way too high to be considered a reasonable compromise or a sound purchase for an ordinary gamer. Funny enough, that's exactly what Turing represents. Coincidence? I don't like to think so.

They're smart and they're greedy. Given that the above points are pretty realistic and not too far-fetched, it's reasonable at least to think about them, or even consider that they might be the actual case. They already fooled us on Pascal and how it would be a completely new architecture. Turned out it wasn't, but just Maxwell 2.0. They have done stunts like that in the past, pretty often to be honest.

You just bring out a 'new' generation that doesn't bring anything really new (hence the RT deception to hide that fact), make those cards effectively unpurchasable through an inflated price tag so that the older generation looks like a real bargain compared to the new one – and then you just sell the old generation instead. The customer has no other choice but to bite the bullet (due to the lack of any reasonable competing product) – there is simply no alternative but to swallow the pill.

The best part is, you make insane profits out of all of that. Whatever the customer does, you're listening to a good ol' Ka-Ching all day long, every day. The even better part is that your next-generation cards, with an actual new architecture, will look even better compared to your last one – since you toned it down on purpose.

Indications
A pretty big red flag (at least for me) went up when it came out that DICE is going to wind down the RT effects and their usage in Battlefield V. Why? And why should that be worth a red flag, you may ask?

Well, if we consider the facts, we see that DICE had to tone down the RT effects just to reach playable frame rates … And to be honest, they were pretty transparent about the reasons why. DICE, of all studios! They're gamers and they're freaks (in a positive way), total geeks from top to toe, and they absolutely love to do things like this, joyfully! They're always keen to use the newest stuff, techniques, and gadgets. A studio which – together with Square Enix – nine times out of ten is among the very first to adopt a new technology almost instantly (Mantle, TrueAudio, DirectX 12, Vulkan, et cetera), if it has any greater potential for the future. You never have to ask them twice if it brings any benefit, at least from a gamer's perspective. Yes, if …

The fact that the developers' response in general is anything but euphoric, and that the overall echo (not only from DICE) on the whole RT chapter is rather noncommittal, tells me something is wrong – way more than nVidia is ever going to tell us or acknowledge.

If even DICE doesn't really give the impression that the technology is as good as nVidia tries to convince us it is, well, who else will?
Somehow it seems to me that, apart from the insanely hyped presentation nVidia deliberately delivered, there is simply way less substance than they want us to believe – there can certainly be no talk of a "spectacular surprise", all conviction aside. If, at least on the hardware side, the potential existed, then yes, sure … Thing is, it just doesn't. 1080p and not even at 60 fps. What kind of a joke is that?

I think exactly that seems to be the actual problem, if you look above …
Way too complicated.

Reality: Nvidia develops GPUs for new markets (deep learning / AI / automotive / HPC) and consumers get cut-down leftovers called Geforce. A major accelerator in these new markets was the Tensor core. Furthermore, RT is actively used in the development of products and offers real advantages in that setting – it has already done so for many years now.

Turing @ Geforce is just their attempt to maximize profits on that development and find a use for their new GPUs that don't make the Quadro / Tesla / Titan 'cut'.

Anything they get along the way is bonus points to begin with. As for your assumptions:

- Nvidia does not benefit from making RT proprietary, they only benefit from making it run very well on their own hardware. Your idea of it being the next Hairworks or PhysX completely counters Nvidia's alleged push to make RT a 'thing to have'. You don't promote something and then do what you can to stop broad adoption of it. Remember: AMD still holds the console market so if they want broad adoption, they need to include console gaming in some way or another.

- Price point of Turing compared to Pascal, yes, obviously they priced Turing such that 'for a small premium' you can get similar performance as Pascal but with improved RT performance. That's just common sense when there is no competition and old stock to sell. The next move is, if Turing doesn't sell at all, that it gets a price cut when Pascal stock is gone. Your logic of them making Turing unattainable doesn't make any sense.

- Turing can be bought; it's not like people didn't buy $700+ GPUs before.

- Nvidia isn't skipping generational R&D costs, they just don't release entire generations under a Geforce label ;) Volta never turned into a Geforce product. If you really think they stall development, you haven't paid attention. The reason Maxwell wasn't Pascal is that we were stuck on 28nm for much longer than anyone would have liked. The reason Pascal is Pascal is that the competition had no real game plan for anything past Hawaii, which nicely coincides with the console deals AMD struck. Their focus is elsewhere and it's clear as day; Vega just confirmed it. Not Nvidia's problem, really, is it? At the same time, Pascal offered the largest generational performance boost in many years. That's quite a feat, considering they were doing nothing substantial to their architecture ;)

- Nvidia does not benefit from stalling because there are massive potential, emerging markets where they want to secure profit and marketshare for the next few decades. If they lag behind, they lose all of their investments in those markets - automotive is already hanging by a thread.
Posted on Reply
#49
LDNL
"We have several RTX 2080 and 2080 Ti boards here for full review, and the results are extremely disappointing so far. "

Had to fix this snippet from the technology overview conclusion. I mean who in their right mind would praise or give Editors Choice Awards to something that even the investors can see didn't meet expectations.

Some review sites seem to have taken on a marketing role for companies rather than being honest. It's a shame, really.
Posted on Reply
#50
ShurikN
DeathtoGnomesThis makes me wonder, after sufficient "old stock" is cleared out, I bet Nvidia will come out with a driver update that will increase performance by at least 25%.
After the old stock clears out, Nvidia will come out with next-gen cards on 7nm, with more RT cores, more tensor cores, less power, higher clocks, etc. And those will actually run RT stuff the way it was meant to be run.
Posted on Reply