Thursday, March 21st 2019

NVIDIA: Turing Adoption Rate 45% Higher Than Pascal, 90% of Users Buying Upwards in Graphics Product Tiers
NVIDIA during its investor day revealed some interesting statistics on its Turing-based graphics cards. The company essentially announced that revenue from Turing graphics card sales is 45% higher than that generated when NVIDIA introduced its Pascal architecture - which does make sense when you consider how NVIDIA positioned the same product tiers (**60, **70, **80) in higher pricing brackets than before. NVIDIA's own charts showcase this better than anyone else could, with a clear indication of higher pricing for the same graphics tier. According to the company, 90% of users are buying pricier graphics cards this generation than they did in the previous one - which makes sense, since a user buying a GTX 1060 at launch only had to pay $249, while the new RTX 2060 goes for $349.
Other interesting tidbits from NVIDIA's presentation at its investor day are that Pascal accounts for around 50% of the installed base of NVIDIA graphics cards, while Turing, for now, accounts for only 2%. This means 48% of users sporting an NVIDIA graphics card are on Maxwell or earlier designs, which NVIDIA says presents an incredible opportunity for increased sales as these users make the jump to the new Turing offerings and their extended RTX feature set. NVIDIA's stock grew by 5.82% today, likely on the back of this info.
Source: NVIDIA via WCCFTech
111 Comments on NVIDIA: Turing Adoption Rate 45% Higher Than Pascal, 90% of Users Buying Upwards in Graphics Product Tiers
I smell a new class-action lawsuit down the road from stockholders.
Never trust a smiling, leather-jacketed snake like Jensen.
I bet people will start complaining about my suggestion - despite the fact that if tech development had kept the pace it had in the 2000s, RTX 2070 performance in a low-end $150 card wouldn't be absurd.
The real problem with him is that I don't get the vibe he's honest when he's on stage talking; something is really off with him.
Sure, he's a salesman, BUT this one acts like he's selling snake oil, trying to over-convince you, and he comes across as a fake.
Then you see the 50% GPU revenue drop in NV's own report. :D 1. The stock was at $286 in October, so I don't get your point.
2. So the $699 RTX 2080 is competing with the two-year-old $699 GTX 1080 Ti. And NV has only one better card, the 2080 Ti, with less than a 30% performance upgrade at 1.5x the cost - $1,200. Funny thing.
3. If you're gaming at FHD, even the RX 590 is a better deal with 3 games included than a 1660 Ti with zero games, not to mention the cheaper RX 580. Performance. Kicks. HAHA. It's a 30% bump for a $100 increase. Don't you remember the increase from the GTX 980 to the GTX 1080? I'll help you: more than 60% for a $50 increase. Poor milked guy.
My excess income could easily afford many 2080 Tis a year, so it isn't about whether I can. I refuse to be herded into a price bracket simply because I want the best. $900-1,200 for a graphics card is absurd.
Nvidia can go kick rocks.
If prices had increased as much as they did from the 10xx to the 20xx series every generation, GPUs would cost tens of thousands of dollars by now.
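A quick back-of-envelope sketch of that compounding, in Python. The ~1.7x factor is just the single $699 GTX 1080 Ti to $1,199 RTX 2080 Ti Founders Edition jump, extrapolated forward as an assumption, not a prediction:

# Compound the Pascal -> Turing flagship launch-price jump over future generations.
price, factor = 699, 1199 / 699  # GTX 1080 Ti -> RTX 2080 Ti FE launch prices
for gen in range(1, 8):
    price *= factor
    print(f"after {gen} jump(s) of {factor:.2f}x: ${price:,.0f}")

By the fifth or sixth such jump the flagship would indeed cost tens of thousands of dollars.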
Nvidia is using their performance gap over AMD to make as much money as possible, and I can't fault them for that.
But they really do rake their customers over the coals for no good reason to chase greater profits, and consumers praise them for it, which confuses me.
There are a lot of users here who bought RTX in the heat of passion, so now they won't overthink that purchase. Or they already did, realized everything, and that makes them even angrier. I mean, $1,200 - you must be REALLY itchy to justify that much for a cut TU102 :D :toast:
You're just an angry troll who can't afford an RTX card and is talking down to those who can/have. Or maybe you're an AMD fanboy who's butt-hurt because NVidia has pulled ahead in performance again. Maybe both. Either way, get over it and quit trolling.
Bottom line: Turing is selling like crap. They've admitted that Turing is not selling well on other calls that I've posted article links to in other threads. They're just not getting it - or they're too arrogant to admit they made an error (likely the latter). Turing is priced HORRIBLY and most people are not buying what they're selling, both literally and figuratively. The reviews were garbage for the price points they're selling at. If they want to sell Turing, they have to price it appropriately, and right now they're nowhere close.
At this point I have no faith Nvidia will get the 3000-series priced right, either. I'm honestly expecting to have to sit on my 1080 Ti cards until Intel releases discrete video cards and see if they're at all competitive. I'm someone who generally buys Titan & Ti top-end cards - I've been ready to upgrade my cards for about 6-9 months now as I'm GPU limited in certain games at this point (not to mention I want to get ready for CP2077), but I simply will not pay what NVidia wants for minimal performance upgrades in Turing - for the price, they simply perform like shit.
2. The price of their top card, adjusted for inflation, has been around $700 since the year 2000 (a rough sketch below puts numbers on this).
3. Not long ago, the 1080 was running as high as $950 ... now many are at $700. Is there an $800 difference between the two? And let's look at the history:
images.hardocp.com/images/news/1489189662xrJkzvohX8_1_1.png
And while, yes, it has more and better everything, I remember paying $1k for a 1 GB HDD ... and my desktop workstations in the early 1990s were $6k ... now they're $2k or under.
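To put rough numbers on point 2, here's a minimal Python sketch. The launch MSRPs are real, but the inflation multipliers are approximate illustrative figures, not official CPI data:

# Inflation-adjusting past flagship launch prices to 2019 dollars.
# Multipliers below are rough approximations, not official CPI figures.
cpi_to_2019 = {2000: 1.49, 2006: 1.27, 2013: 1.09, 2017: 1.04}

flagships = {  # card: (launch year, launch MSRP in USD)
    "GeForce 2 Ultra":  (2000, 499),
    "GeForce 8800 GTX": (2006, 599),
    "GTX 780 Ti":       (2013, 699),
    "GTX 1080 Ti":      (2017, 699),
}

for card, (year, msrp) in flagships.items():
    adjusted = msrp * cpi_to_2019[year]
    print(f"{card} ({year}): ${msrp} then, about ${adjusted:.0f} in 2019 dollars")

All four land roughly in the $720-$770 range in 2019 dollars, which is the commenter's point about the historical ~$700 flagship price.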
Are they taking advantage of the lack of competition? Of course; welcome to capitalism :) ... Companies and directors are legally required to maximize profits ... without breaking any laws, of course. Failure to do so has the board members sitting down with headhunters to find new jobs. I have a hard time calling the 2080 anything but a high-end card ... if we look at the Radeon VII as AMD's "high end," then by comparison everything from the 2070 on up must be high end.
2080 => 2080 Ti = 21.6% increase in performance ... for 68.8% increase in price ... 27.5% increase in system price (ROI = 0.785)
2070 => 2080 = 19.8% increase in performance ... for 45.5% increase in price ... 14.3% increase in system price (ROI = 1.385)
2060 => 2070 = 16.7% increase in performance ... for 41.0% increase in price ... 10.0% increase in system price (ROI = 1.670)
1660 Ti => 2060 = 17.0% increase in performance ... for 26.2% increase in price ... 5.4% increase in system price (ROI = 3.148)
1660 => 1660 Ti = 13.0% increase in performance ... for 20.2% increase in price ... 3.6% increase in system price (ROI = 3.611)
The above assumes a $1,200 build cost exclusive of the graphics card ... now of course, if you're spending $1,500 on a graphics card, you're not choosing a lower-end CPU - probably a 9900K - so it is heavily skewed and conservative. At the top of the list, those system prices will likely be significantly higher. In addition, the above illustrates the law of diminishing returns: the higher you go, the more it costs for the same incremental performance increase. So it's rather easy to justify the extra $52 for a 1660 Ti over a 1660, and easy again to go the $81 to the 2060 ... each step you take will cost you much more money, however, and that last jump has an ROI well below 1. At some point, and assuming tariffs on PC parts end, I think we will see the 2080 < $700 (say $659 ish) and the 2070 < $500, say $459 ish. The Ti will hold its premium a bit longer, but by fall I'm thinking $850-$899.
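For anyone wanting to reproduce the ROI figures above, here's a minimal Python sketch. The $1,200 base system cost is the assumption stated in the post; the card prices are hypothetical street prices back-derived to roughly match the quoted percentages, so outputs land within a few percent of the figures above rather than exactly on them:

# ROI here = performance gain (%) divided by the increase in total system price (%).
BASE_SYSTEM = 1200  # assumed cost of everything except the graphics card

def system_increase_pct(old_gpu: float, new_gpu: float) -> float:
    """Percentage increase in whole-system cost when swapping GPUs."""
    return ((BASE_SYSTEM + new_gpu) / (BASE_SYSTEM + old_gpu) - 1) * 100

def roi(perf_gain_pct: float, old_gpu: float, new_gpu: float) -> float:
    """Performance gained per unit of extra system cost."""
    return perf_gain_pct / system_increase_pct(old_gpu, new_gpu)

# (upgrade step, performance gain %, old card $, new card $) - illustrative prices
steps = [
    ("1660    -> 1660 Ti", 13.0,  260,  312),
    ("1660 Ti -> 2060",    17.0,  312,  390),
    ("2060    -> 2070",    16.7,  390,  548),
    ("2070    -> 2080",    19.8,  548,  799),
    ("2080    -> 2080 Ti", 21.6,  799, 1349),
]

for step, perf, old, new in steps:
    print(f"{step}: +{perf}% perf, system +{system_increase_pct(old, new):.1f}%, "
          f"ROI = {roi(perf, old, new):.3f}")

Note how the denominator uses the whole-system price, not the card price alone; that is why each step up the stack looks progressively worse.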
The stock market is a poor yardstick for anything, as the tech sector is so subject to manipulation by pump-and-dumpers: pumping the stock and then selling their shares short. Then it's all doom and gloom, again with no real evidence, after which they follow up and buy cheap ... only to repeat the cycle a few months later. The reality remains that nVidia has increased its market share significantly (1%) in the last month while AMD has dropped 0.6% and Intel has dropped 0.5%, and that cannot be explained away ... by no means evidence of a long-term trend, but certainly indicative of the market seeing current pricing levels as acceptable.
As far as ranking the Nvidia GPUs as entry-level, midrange, and high-end: look at the memory bus width and the size of the chip / transistor count. Do that and you will see a very, very clear distinction between the 2080 and the 2080 Ti; price is irrelevant in that regard. The 2080 (non-Ti) is nothing more than a midrange Turing GPU, exactly as Nvidia intended it to be.
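For reference, a minimal Python sketch laying out the specs behind that tier argument. Figures are taken from public Turing spec listings and should be treated as approximate:

# Bus width, transistor count, and die size for the Turing lineup discussed above.
specs = {
    # card: (GPU die, memory bus in bits, transistors in billions, die size in mm^2)
    "RTX 2060":    ("TU106", 192, 10.8, 445),
    "RTX 2070":    ("TU106", 256, 10.8, 445),
    "RTX 2080":    ("TU104", 256, 13.6, 545),
    "RTX 2080 Ti": ("TU102", 352, 18.6, 754),
}

for card, (die, bus, xtors, area) in specs.items():
    print(f"{card}: {die}, {bus}-bit bus, {xtors}B transistors, {area} mm^2")

The gap between TU104 and TU102 is visibly larger than any other step in the stack, which is the distinction the comment is pointing at.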
In the real world, people look at price and performance.