
NVIDIA Stock Falls 2.1% After Turing GPU Reviews Fail to Impress Morgan Stanley

Joined
Jan 20, 2014
Messages
299 (0.08/day)
System Name gamingPZ
Processor i7-6700k
Motherboard Asrock Z170M Pro4S
Cooling scythe mugen4
Memory 32GB ddr4 2400mhz crucial ballistix sport lt
Video Card(s) gigabyte GTX 1070 ti
Storage ssd - crucial MX500 1TB
Case silverstone sugo sg10
Power Supply Evga G2 650w
Software win10
This makes me wonder: after sufficient "old stock" is cleared out, I bet Nvidia will come out with a driver update that will increase performance by at least 25%.
Or Nvidia will come out with a driver update that will decrease performance (for Pascal and older) by at least 25%.
 
Joined
Sep 15, 2011
Messages
6,759 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
How did that turn out?

Just like in Apple's case of slowing down older phones with each iOS upgrade....nothing. Nothing happened. And nothing will, since apparently and unfortunately there are no laws to prevent this kind of pure scumbag behavior.
 
Joined
May 9, 2012
Messages
8,545 (1.85/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb DDR4 3600/8gb DDR3 1600/2gbLPDDR3/8gbLPDDR5x/16gb(10 sys)LPDDR5 6400
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/Radeon 780M 6gb LPDDR5
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/4tb SN850X
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/7" FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/Gorilla Glass Victus 2/front-stock back-JSAUX RGB transparent
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Moondrop Chu II + TRN BT20S
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/USAMS GAN PD 33w/USAMS GAN 100w
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75%/Lofree Edge/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 14/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
ahahah i knew it!
mmhhh not much impressed, for 1440p~

well technically ... new name and all, new tech and all (worse: a perf loss using it ... at 1080p :laugh: ) but the same perf jump as if it were a GTX 1180 vs GTX 1080 ... the price doesn't follow that pattern though ... small leap in perf, huge boost in price :roll:

Nah. They will come with a driver update to "increase" the performance of previous generation(s) by -25% ;), going Apple style.
Irony is, it wouldn't be the first time they've done that either...
well that confirms my plan to replace my 1070 with a... not a 1080/1080 Ti, but a Vega 64 ... once prices stabilize (finally seeing some prices going down) if they do that (oohhh and they probably will)
 
Joined
Jul 17, 2011
Messages
85 (0.02/day)
System Name Custom build, AMD/ATi powered.
Processor AMD FX™ 8350 [8x4.6 GHz]
Motherboard AsRock 970 Extreme3 R2.0
Cooling be quiet! Dark Rock Advanced C1
Memory Crucial, Ballistix Tactical, 16 GByte, 1866, CL9
Video Card(s) AMD Radeon HD 7850 Black Edition, 2 GByte GDDR5
Storage 250/500/1500/2000 GByte, SSD: 60 GByte
Display(s) Samsung SyncMaster 950p
Case CoolerMaster HAF 912 Pro
Audio Device(s) 7.1 Digital High Definition Surround
Power Supply be quiet! Straight Power E9 CM 580W
Software Windows 7 Ultimate x64, SP 1
This makes me wonder: after sufficient "old stock" is cleared out, I bet Nvidia will come out with a driver update that will increase performance by at least 25%.
You mean „I bet Nvidia will come out with a driver update that will decrease performance by at least 25%." for the (in that case) then-older generation – just to ensure sales for the RTX 2xxx generation will go up again afterwards (since it's also pretty old by then).

I know, it's just a joke, but honestly, it wouldn't surprise me at all.
It would be virtually the same as what they did back when they were forced to comply with the settlement over the GTX 970 3.5+0.5 GB issue.

The very same day they had to comply to end that class-action lawsuit (and had to compensate every affected owner with $35 USD each) to get the issue off the table, Jensen, the green-leather-jacket-wearing Hobbit™, decided to show everyone the bold, naked F: having been forced into the GTX 970 owners' settlement, they dropped the GTX 970 that same day, making it legacy.

It's literally like they punished their own customers for even daring to question nVidia's God-given right to screw over their user base and rob their customers with rather fancy prices …
That's the thing: don't buy a 2080, buy a 1080 Ti instead.
Don't buy nVidia by buying nVidia. :laugh:
So in the end it doesn't really matter for Huang.
Either pay for an overpriced GPU, or help clear out old stock of two-year-old cards that are still over MSRP.
Absolutely, yes.
At this moment they're left with huge amounts of stock of the older GTX 10x0 generation – really vast amounts of it.
There are rumours that their stock is actually up to a million cards (sic!), as even a single major AIB alone had to return 300,000+ cards, all thanks to the mining boom. Seems they just got way too greedy and misjudged the demand by a wide margin.

Just imagine 1M cards at, let's say, $500 each? That would be half a billion in inventory. Half a fucking billion.
I'm pretty sure nVidia is NOT going to write those off as a classic loss-making business. nVidia is not going to waste such an opportunity to sell their older cards at MSRP.
Not going to happen; nVidia is just way too greedy to let that chance to inflate profit slip.
 
Last edited:

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,540 (6.67/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
You mean „I bet Nvidia will come out with a driver update that will decrease performance by at least 25%." for the (in that case) then-older generation – just to ensure sales for the RTX 2xxx generation will go up again (since it's also pretty old by then).

I know, it's just a joke, but honestly, it wouldn't surprise me at all.
It would be virtually the same as what they did back when they were forced to comply with the settlement over the GTX 970 3.5+0.5 GB issue.

The very same day they had to comply to end that class-action lawsuit (and had to compensate every affected owner with $35 USD each) to get the issue off the table, Jensen, the green-leather-jacket-wearing Hobbit™, decided to show everyone the bold, naked F: having been forced into the GTX 970 owners' settlement, they dropped the GTX 970 that same day, making it legacy.

It's literally like they punished their own customers for even daring to question nVidia's God-given right to screw over their user base and rob their customers with rather fancy prices …

Absolutely, yes.
At this moment they're left with huge amounts of stock of the older GTX 10x0 generation – really vast amounts of it.
There are rumours that their stock is actually up to a million cards (sic!), as even a single major AIB alone had to return 300,000+ cards, all thanks to the mining boom. Seems they just got way too greedy and misjudged the demand by a wide margin.

Just imagine 1M cards at, let's say, $500 each? That would be half a billion in inventory. Half a fucking billion.
I'm pretty sure nVidia is NOT going to write those off as a classic loss-making business. nVidia is not going to waste such an opportunity to sell their older cards at MSRP.
Not going to happen; nVidia is just way too greedy to let that chance to inflate profit slip.

I heard nv is trying to get the game devs to reduce the number of rays in games to make it seem like it is worth it when it hardly is
 
Joined
Feb 3, 2017
Messages
3,812 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
"We are surprised that the 2080 is only slightly better than the 1080ti, which has been available for over a year and is slightly less expensive," he said. "With higher clock speeds, higher core count, and 40% higher memory bandwidth, we had expected a bigger boost."
Aaannnd... feel free to disregard anything they say. Market sentiment, sure. It appears to be nicely based on pure emotion.

The technical or analytical side is nonexistent. These are almost completely incorrect facts:
- Higher clock speeds - 1480/1582 vs 1515/1710 MHz. Yeah, in specs. Not necessarily so much in reality, as GPU Boost 3.0/4.0 obfuscates things.
- Higher core count - 3584:224:88 vs 2944:184:64. Nope.
- 40% higher memory bandwidth - 484.3 vs 448.0 GB/s. Nope.
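A quick back-of-the-envelope check of those claims, using only the spec-sheet numbers quoted above (a minimal sketch for illustration; real-world clocks and benchmark results will differ):

```python
# Rough sanity check of the Morgan Stanley claims, using the spec-sheet
# numbers quoted above (GTX 1080 Ti vs RTX 2080). Illustrative only.
specs = {
    "boost clock (MHz)":       (1582, 1710),
    "shader cores":            (3584, 2944),
    "memory bandwidth (GB/s)": (484.3, 448.0),
}

for name, (gtx_1080_ti, rtx_2080) in specs.items():
    # Percentage change going from the 1080 Ti to the 2080
    delta = (rtx_2080 - gtx_1080_ti) / gtx_1080_ti * 100
    print(f"{name:>25}: 1080 Ti {gtx_1080_ti:>7} | 2080 {rtx_2080:>7} | {delta:+.1f}%")

# Prints roughly +8% boost clock, -18% shader cores and -7.5% bandwidth for
# the 2080 -- nowhere near the "40% higher memory bandwidth" the analyst cites.
```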
 
Joined
Sep 7, 2017
Messages
3,244 (1.22/day)
System Name Grunt
Processor Ryzen 5800x
Motherboard Gigabyte x570 Gaming X
Cooling Noctua NH-U12A
Memory Corsair LPX 3600 4x8GB
Video Card(s) Gigabyte 6800 XT (reference)
Storage Samsung 980 Pro 2TB
Display(s) Samsung CFG70, Samsung NU8000 TV
Case Corsair C70
Power Supply Corsair HX750
Software Win 10 Pro
AMD needs to just play it straight and focus on raw performance..

If they knew what was good for them. It's the perfect opportunity for them to shine.
 
Joined
Feb 3, 2017
Messages
3,812 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
AMD needs to just play it straight and focus on raw performance..
If they knew what was good for them. It's the perfect opportunity for them to shine.
They do not have enough time. Pulling a new chip out of the blue will take half a year, at least.
Vega 20 is too expensive, plus it seems to have some architecture-level problems with gaming. Polaris is small. Navi is 2019/2020, so in a year, maybe.
And I bet Nvidia has a contingency plan if AMD should decide to compete in high end.

Nvidia is probably die-shrinking Turing to 7nm next year with whatever changes turn out to be necessary for RT, and that is what AMD needs to compete with. That will be Navi, and the timeframe is next holiday season, not sooner.

This makes me wonder: after sufficient "old stock" is cleared out, I bet Nvidia will come out with a driver update that will increase performance by at least 25%.
The best they can do is screw over the old cards' performance :roll:
Nah. They will come with a driver update to "increase" the performance of previous generation(s) by -25% ;), going Apple style.
Or Nvidia will come out with a driver update that will decrease performance (for Pascal and older) by at least 25%.
I mean... do you guys actually believe that?
 
Joined
Jul 3, 2018
Messages
229 (0.10/day)
Location
Australia
Processor Intel Core i7-13700KF
Motherboard GIGABYTE Z690 AORUS ELITE
Cooling Dark Rock Pro 4
Memory G.Skill Ripjaws DDR4-4000 32GB (4x8GB)
Video Card(s) ASUS TUF Gaming Radeon RX 7900 XT OC Edition 20GB GDDR6
Storage Various
Display(s) GIGABYTE M32U 4K 144hz
Audio Device(s) External Amp
Software KDE Neon
Clearly nvidia is counting on DLSS to widen the gap between the 1080ti and the 2080.

The fact that they felt it necessary to release the TI version at the same time shows that they knew the 2080's rasterizing performance alone wasn't going to impress anyone.

Until DLSS is supported in pre-existing games (if they choose to support it), not just upcoming titles, the 2080 looks bad.

In the past, a new GPU launch's main focus has been about raw performance, shiny new features being secondary.

The fact that they mainly focused on RTX tech felt like they were trying to hide something, and they were.

The 2080's performance is unremarkable, unless you're using DLSS anti-aliasing.

If they wanted to impress with raw performance, which I think is what most of us care about, then DLSS is the real gem here, not RTX.

We're not fools nvidia. This launch left a bad taste in our mouths, and the stock market reflects this.

At least the reflections were ray-traced I suppose :rolleyes:
 

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
How can this be?

With all that money saved by buying more cards, them stocks should be up by at least 10% ...

Surely Stanley must have interpreted this the wrong way: maybe Stanley used the wrong tensor cores (Titan V's instead of RTX 2080 Ti's) and "machine learned" wrong ...
 
Joined
Jan 20, 2014
Messages
299 (0.08/day)
System Name gamingPZ
Processor i7-6700k
Motherboard Asrock Z170M Pro4S
Cooling scythe mugen4
Memory 32GB ddr4 2400mhz crucial ballistix sport lt
Video Card(s) gigabyte GTX 1070 ti
Storage ssd - crucial MX500 1TB
Case silverstone sugo sg10
Power Supply Evga G2 650w
Software win10
2.1% is nothing (stocks can move more than that without any news or analyst input). But RTX could be more than just a one-gen failure for Nvidia, because PC gaming is at its lowest since the mining craze – there are very few gamers who want to buy or upgrade hardware. And now they face either 2.5-year-old Pascal or RTX (i.e. performance/price even worse than Vega) that can barely manage 4K60, costs $900+, and could only manage 1080p at 40 fps in RTX demos (which in real life won't exist for this gen). Most of those gamers say: "F it, I'll buy a $400 console."

About the bright Ray Traced Future potential – there is NONE, at least for this gen and the next: no devs will do anything for 0.1% of the small PC market (an RTX-capable card – let's assume the RTX 2080 – costs from $900, and there will never be a lot of people with $800-900 GPUs even if those wiped the floor with the GTX 1080 Ti). When an RTX 3060 (or 4060) costs $250 and can run RTX at 1440p high frame rates or at 4K – then there will be lots of RTX-capable hardware out there, and only then will devs start to implement and optimize ray tracing. But I see that Nvidia just wants to milk this weak PC-gaming cow till it dies, and then Nvidia will go full AI, I guess.
 
Joined
Feb 3, 2017
Messages
3,812 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Clearly nvidia is counting on DLSS to widen the gap between the 1080ti and the 2080.
Not directly, no. The 2080 and 1080 Ti are pretty evenly matched. Once the launch craziness subsides, even the price should even out.
Yes, used cards and stock clearances from miners will keep the used price down and the 1080 Ti an excellent value proposition, but the manufacturer can't affect the second-hand market anyway.
 
Joined
Jun 18, 2015
Messages
578 (0.17/day)
Aaannnd... feel free to disregard anything they say. Market sentiment, sure. It appears to be nicely based on pure emotion.

The technical or analytical side is nonexistent. These are almost completely incorrect facts:
- Higher clock speeds - 1480/1582 vs 1515/1710 MHz. Yeah, in specs. Not necessarily so much in reality, as GPU Boost 3.0/4.0 obfuscates things.
- Higher core count - 3584:224:88 vs 2944:184:64. Nope.
- 40% higher memory bandwidth - 484.3 vs 448.0 GB/s. Nope.
So you are accepting that nVidia made a worse card that is much more expensive?
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
Morgan Stanley doesn't seem to understand how the graphics industry works. It's always the case that the new high end is about equal to last year's top of the line, and the new current-year top of the line surpasses all of it by at least 20%. The RTX 2000 series is no exception: the RTX 2080 is as fast as the GTX 1080 Ti, and the RTX 2080 Ti is faster than both. So, nothing out of the ordinary here.

The actual issue is the pricing. Positioning the RTX 2080 250€ higher because it has some ray tracing is just absurd when, performance-wise, it's the same as the 250€-cheaper GTX 1080 Ti. That's where the real issue is. And the same goes for the RTX 2080 Ti: sure, it has no direct competitor, but that doesn't mean they can price it to infinity. And that's probably where Morgan Stanley's objections lie. Every year the top-end cards were 800€, and now it's all of a sudden 1200€. Sure, there are people who were throwing money at it before anyone even released performance numbers, but we can all be sure the number of such people is way lower than the number willing to shell out 800€...
 
Joined
Feb 3, 2017
Messages
3,812 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
So you are accepting that nVidia made a worse card that is much more expensive?
I would not say it is a worse card. They are about the same. The MSRP is the same as well, at $699.
"Much more expensive" is a temporary thing.

Remember the launch of the 1080? $599/$699 MSRP, and they sold like hotcakes for 750-800 moneys.
The 1080 Ti? $699 MSRP, and it took over a month for prices to get below the same 750-800 moneys.
Vega 64? $499 MSRP; aside from some fairly small shipments, these sold for 700-ish moneys for a long while.
And none of these launches were really affected by the mining craze. That reached high-end cards later.
 
Joined
Sep 15, 2011
Messages
6,759 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Actually the 2080 should have been the 2070 of this generation, and at least $250 cheaper. At least.
Then nobody would have complained one bit. On the contrary, even...
 
Joined
Apr 1, 2015
Messages
23 (0.01/day)
3 years ago their stock was worth a tenth of today's – some 25 bucks. Jenny is still laughing in your face on his way to the bank.
 
Joined
Jul 17, 2011
Messages
85 (0.02/day)
System Name Custom build, AMD/ATi powered.
Processor AMD FX™ 8350 [8x4.6 GHz]
Motherboard AsRock 970 Extreme3 R2.0
Cooling be quiet! Dark Rock Advanced C1
Memory Crucial, Ballistix Tactical, 16 GByte, 1866, CL9
Video Card(s) AMD Radeon HD 7850 Black Edition, 2 GByte GDDR5
Storage 250/500/1500/2000 GByte, SSD: 60 GByte
Display(s) Samsung SyncMaster 950p
Case CoolerMaster HAF 912 Pro
Audio Device(s) 7.1 Digital High Definition Surround
Power Supply be quiet! Straight Power E9 CM 580W
Software Windows 7 Ultimate x64, SP 1
turing was meant for 7nm just as maxwell was meant for 16 (14 or whatever, same shit)
I doubt that. For me it seems they just slammed some Tensor cores onto Pascal to have a multitude of opportunities … and made Turing way too overpriced and literally unpurchasable on purpose.

Just imagine all the benefits!
  • You're skipping a whole true generation and the associated R&D and manufacturing costs
  • nVidia officially postponing their 'Next Generation Mainstream GPU' based upon Volta at Hot Chips 30 wasn't by accident
  • You're able to clear your insane inventories and sell the vast stock of older cards at full MSRP
    … since everyone is going to say »Well, fuck off nVidia – I'm not going to buy this overpriced shit. I'll get a GTX 1080 instead!«
  • You're able to establish another price increase literally just in passing
  • You're able to establish another proprietary nVidia-only standard à la PhysX/GameWorks, also just en passant
    … since PhysX, then tessellation, then HairWorks, then G-Sync didn't really pay off for you, as people see through your shitty attempts at dividing the market and tightening your closed ecosystem
  • It will work
    … since AMD isn't able to keep up with you anyway.


All you had to do was bring out a 'new' generation of cards with techniques that are pretty much useless and not even future-proof as of now (whether they'll be of any significance in the near future, or any future at all, is purely hypothetical) – but with a price tag way too high to be considered a reasonable compromise or a sound purchase for an ordinary gamer. Funny enough, that's exactly what Turing represents. Coincidence? I don't like to think so.

They're smart and they're greedy. Given that the above points are pretty realistic and not too far-fetched, it's reasonable at least to think about them, or even to consider that they might be the actual case. They already fooled us with Pascal and how it would be a completely new architecture. Turned out it wasn't, just Maxwell 2.0. They've pulled stunts like that in the past, pretty often to be honest.

You just bring out a 'new' generation that doesn't bring anything really new (hence the RT deception to hide that fact), make those cards effectively unpurchasable through an inflated price tag so that the older generation looks like a real bargain compared to the new ones – and then you just sell the old generation instead. The customer has no choice but to bite the bullet (due to the lack of any reasonable competing product) – there is simply no alternative but to swallow the pill.

The best part is, you make insane profits out of all of that. Whatever the customer does, you're listening to a good ol' ka-ching all day long, every day. The even better part is that your next-generation cards, with an actual new architecture, will look even better compared to this one – since you toned it down on purpose.

Indications
A pretty decent red flag (at least for me) was when it came out that DICE is going to wind down the effects and the usage of RT in Battlefield V. Why? And why should that be worth a red flag, you may ask?

Well, if we consider the state of the facts, we see that DICE had to tone down the RT effects just to reach playable frame rates … and to be honest, they were pretty transparent about the reasons why. DICE, of all studios! They're gamers and they're freaks (in a positive way), total geeks from top to toe, and they absolutely love to do things like this, joyfully. They're always keen to use the newest stuff, techniques and gadgets. A studio which – together with Square Enix – nine times out of ten is among the very first to adopt a new technology almost instantly (Mantle, TrueAudio, DirectX 12, Vulkan et cetera), if it has any greater potential for the future. You never have to ask them twice if it brings any benefit after all, at least from a gamer's perspective. Yes, if …

The fact that the developers' overall response is anything but euphoric, and that the corresponding echo (not only from DICE) on the whole RT chapter is rather noncommittal, tells me there's something wrong – especially way more wrong than nVidia is ever going to tell us or acknowledge.

If even DICE doesn't really give the impression that the technology is as good as nVidia tries to convince us it is, well, who else would?
Somehow it seems to me that, apart from the insanely hyped presentation nVidia deliberately delivered, there is simply way less substance here than they want us to believe – there certainly can't be any talk of a "spectacular surprise", all conviction aside. If the potential existed at least on the hardware side, yes, sure … Thing is, it just doesn't. 1080p not even at 60 fps. What kind of a joke is that even?
I heard nv is trying to get the game devs to reduce the number of rays in games to make it seem like it is worth it when it hardly is
I think exactly that is the actual problem, if you look above …
 
Last edited:
Joined
Jan 8, 2017
Messages
9,499 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Considering how over-inflated their share price is, this is nothing.

If they knew what was good for them.

They do know what is good for them and it turns out it's not high-end consumer GPUs.
 
Joined
Feb 3, 2017
Messages
3,812 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
I doubt that. For me it seems they just slammed some Tensor-cores on Pascal to have a multitude of opportunities …
For one thing, compute units in Turing are clearly based on Volta, not Pascal. Volta already had Tensor cores. RT cores are what were added.
You're able to establish another proprietary nVidia-only standard à la PhysX GameWorks, also just en passant
DX12's DXR and Vulkan's Raytracing extensions disagree with what you are saying.
You just bring a 'new' generation that doesn't bring anything real new (hence the RT-deception to hide that fact), make it so that those cards are in fact technically un·purchasable in terms of inflated price tag, that the older generation must seem to look like a real bargain compared to the new ones – and then you're just sell the old generation instead. The customer has no other choice but to bite the bullet (due to lack of any reasonable competition's product) - there is simply no alternative but to swallow the pill.
Why is RT not new?
They are neither unpurchasable nor overpriced. What many expected, and what Turing cards fail to do, is push the price of a given performance level down. Perf/$ remains pretty much exactly the same. The 2080 effectively replaces the 1080 Ti, and the 2080 Ti sits 30% above it, both in performance and price. And this is purely based on raster units and results. If any of the other new tech bets (RT, DLSS, variable-rate shading) should succeed, that would be on top of this.
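A quick sanity check of that perf/$ point, as a minimal sketch with round numbers: the $699 MSRP quoted earlier in the thread, the ~30% gap from this post, and an assumed $999 for the 2080 Ti (illustrative figures, not measured data):

```python
# Toy perf-per-dollar comparison. The 1080 Ti is the baseline (perf = 1.00 at
# $699); the 2080 roughly matches it at the same MSRP; the 2080 Ti is taken as
# ~30% faster at an assumed $999. All numbers are illustrative, not benchmarks.
cards = {
    "GTX 1080 Ti": {"perf": 1.00, "price": 699},
    "RTX 2080":    {"perf": 1.00, "price": 699},
    "RTX 2080 Ti": {"perf": 1.30, "price": 999},
}

baseline = cards["GTX 1080 Ti"]["perf"] / cards["GTX 1080 Ti"]["price"]
for name, c in cards.items():
    ratio = (c["perf"] / c["price"]) / baseline
    print(f"{name}: {ratio:.2f}x baseline perf/$")

# With these numbers the 2080 lands at 1.00x and the 2080 Ti at ~0.91x,
# i.e. perf/$ stays flat or gets slightly worse -- the point being made above.
```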

By the way, Nvidia is clearly making less profit from 2080 than it does from 1080Ti.
The fact that the whole response of the developers in general is anything but euphoric and also the overall corresponding echo (not only from DICE) on the whole chapter RT is rather noncommital, tells me there's something wrong.
Developers' response has been measured but positive. Game devs did not have Turing cards until the very last moment before the announcement, so while they knew RT tech was coming, they had no details about what the hardware did or how it performed.
 
Last edited:
Joined
Sep 17, 2014
Messages
22,649 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I doubt that. For me it seems they just slammed some Tensor cores onto Pascal to have a multitude of opportunities … and made Turing way too overpriced and literally unpurchasable on purpose.

Just imagine all the benefits!
  • You're skipping a whole true generation and the associated R&D and manufacturing costs
  • nVidia officially postponing their 'Next Generation Mainstream GPU' based upon Volta at Hot Chips 30 wasn't by accident
  • You're able to clear your insane inventories and sell the vast stock of older cards at full MSRP
    … since everyone is going to say »Well, fuck off nVidia – I'm not going to buy this overpriced shit. I'll get a GTX 1080 instead!«
  • You're able to establish another price increase literally just in passing
  • You're able to establish another proprietary nVidia-only standard à la PhysX/GameWorks, also just en passant
    … since PhysX, then tessellation, then HairWorks, then G-Sync didn't really pay off for you, as people see through your shitty attempts at dividing the market and tightening your closed ecosystem
  • It will work
    … since AMD isn't able to keep up with you anyway.


All you had to do was bring out a 'new' generation of cards with techniques that are pretty much useless and not even future-proof as of now (whether they'll be of any significance in the near future, or any future at all, is purely hypothetical) – but with a price tag way too high to be considered a reasonable compromise or a sound purchase for an ordinary gamer. Funny enough, that's exactly what Turing represents. Coincidence? I don't like to think so.

They're smart and they're greedy. Given that the above points are pretty realistic and not too far-fetched, it's reasonable at least to think about them, or even to consider that they might be the actual case. They already fooled us with Pascal and how it would be a completely new architecture. Turned out it wasn't, just Maxwell 2.0. They've pulled stunts like that in the past, pretty often to be honest.

You just bring out a 'new' generation that doesn't bring anything really new (hence the RT deception to hide that fact), make those cards effectively unpurchasable through an inflated price tag so that the older generation looks like a real bargain compared to the new ones – and then you just sell the old generation instead. The customer has no choice but to bite the bullet (due to the lack of any reasonable competing product) – there is simply no alternative but to swallow the pill.

The best part is, you make insane profits out of all of that. Whatever the customer does, you're listening to a good ol' ka-ching all day long, every day. The even better part is that your next-generation cards, with an actual new architecture, will look even better compared to this one – since you toned it down on purpose.

Indications
A pretty decent red flag (at least for me) was when it came out that DICE is going to wind down the effects and the usage of RT in Battlefield V. Why? And why should that be worth a red flag, you may ask?

Well, if we consider the state of the facts, we see that DICE had to tone down the RT effects just to reach playable frame rates … and to be honest, they were pretty transparent about the reasons why. DICE, of all studios! They're gamers and they're freaks (in a positive way), total geeks from top to toe, and they absolutely love to do things like this, joyfully. They're always keen to use the newest stuff, techniques and gadgets. A studio which – together with Square Enix – nine times out of ten is among the very first to adopt a new technology almost instantly (Mantle, TrueAudio, DirectX 12, Vulkan et cetera), if it has any greater potential for the future. You never have to ask them twice if it brings any benefit after all, at least from a gamer's perspective. Yes, if …

The fact that the developers' overall response is anything but euphoric, and that the corresponding echo (not only from DICE) on the whole RT chapter is rather noncommittal, tells me there's something wrong – especially way more wrong than nVidia is ever going to tell us or acknowledge.

If even DICE doesn't really give the impression that the technology is as good as nVidia tries to convince us it is, well, who else would?
Somehow it seems to me that, apart from the insanely hyped presentation nVidia deliberately delivered, there is simply way less substance here than they want us to believe – there certainly can't be any talk of a "spectacular surprise", all conviction aside. If the potential existed at least on the hardware side, yes, sure … Thing is, it just doesn't. 1080p not even at 60 fps. What kind of a joke is that even?

I think exactly that is the actual problem, if you look above …

Way too complicated.

Reality: Nvidia develops GPUs for new markets (deep learning / AI / automotive / HPC) and consumers get cut-down leftovers called GeForce. A major accelerator in these new markets was the Tensor core. Furthermore, RT is actively used in the development of products and offers real advantages in that setting – and it already has for many years now.

Turing @ GeForce is just their attempt to maximize profits on that development and find a use for their new GPUs that don't make the Quadro / Tesla / Titan 'cut'.

Anything they get along the way is bonus points to begin with. As for your assumptions:

- Nvidia does not benefit from making RT proprietary; they only benefit from making it run very well on their own hardware. Your idea of it being the next HairWorks or PhysX completely counters Nvidia's alleged push to make RT a 'thing to have'. You don't promote something and then do what you can to stop broad adoption of it. Remember: AMD still holds the console market, so if they want broad adoption, they need to include console gaming in some way or another.

- Price point of Turing compared to Pascal: yes, obviously they priced Turing such that 'for a small premium' you can get similar performance to Pascal but with improved RT performance. That's just common sense when there is no competition and there's old stock to sell. The next move, if Turing doesn't sell at all, is a price cut once Pascal stock is gone. Your logic of them making Turing unattainable doesn't make any sense.

- Turing can be bought; it's not like people didn't buy $700+ GPUs before.

- Nvidia isn't skipping generational R&D costs, they just don't release entire generations under a GeForce label ;) Volta never turned into a GeForce product. If you really think they stall development, you haven't paid attention. The reason Maxwell wasn't Pascal is that we were stuck on 28nm for much longer than anyone would have liked. The reason Pascal is Pascal is that the competition had no real game plan for anything past Hawaii, which nicely coincides with the console deals AMD struck. Their focus is elsewhere and it's clear as day; Vega just confirmed it. Not Nvidia's problem, really, is it? At the same time, Pascal offered the largest generational performance boost in many years. That's quite a feat, considering they were doing nothing substantial to their architecture ;)

- Nvidia does not benefit from stalling, because there are massive potential emerging markets where they want to secure profit and market share for the next few decades. If they lag behind, they lose all of their investments in those markets – automotive is already hanging by a thread.
 
Joined
Jan 25, 2011
Messages
512 (0.10/day)
"We have several RTX 2080 and 2080 Ti boards here for full review, and the results are extremely disappointing so far. "

Had to fix this snippet from the technology overview conclusion. I mean, who in their right mind would praise or give Editor's Choice awards to something that even the investors can see didn't meet expectations?

Some review sites seem to have taken on a marketing role for companies rather than being honest. It's a shame, really.
 