
MSI Talks about NVIDIA Supply Issues, US Trade War and RTX 2080 Ti Lightning

TheLostSwede

News Editor

Joined: Nov 11, 2004
Messages: 17,772 (2.42/day)
Location: Sweden
System Name: Overlord Mk MLI
Processor: AMD Ryzen 7 7800X3D
Motherboard: Gigabyte X670E Aorus Master
Cooling: Noctua NH-D15 SE with offsets
Memory: 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s): Gainward GeForce RTX 4080 Phantom GS
Storage: 1TB Solidigm P44 Pro, 2TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s): Acer XV272K LVbmiipruzx 4K @ 160 Hz
Case: Fractal Design Torrent Compact
Audio Device(s): Corsair Virtuoso SE
Power Supply: be quiet! Pure Power 12 M 850 W
Mouse: Logitech G502 Lightspeed
Keyboard: Corsair K70 Max
Software: Windows 10 Pro
Benchmark Scores: https://valid.x86.fr/yfsd9w
Looking at that first picture, what the hell was NVIDIA thinking?

As so often happens in this industry, they let the engineers do the thinking, but clearly didn't take the time to optimise the design.
It's also a rather power hungry chip, so it needs good power delivery, which means a more expensive board.
I did a rough count and it looks like the RTX 2080 Ti has 3x as many tantalum capacitors as the GTX 1080 Ti, both being MSI cards in this case.
Even the Titan X (Pascal) looks simple in comparison, which makes the RTX 2080 Ti look like a huge failure from a design standpoint.
I wouldn't be surprised if this is the most complex consumer card that Nvidia has made to date, and I'm not talking about the GPU itself, but rather the PCB design.
 
Joined: Sep 17, 2014
Messages: 22,673 (6.05/day)
Location: The Washing Machine
System Name: Tiny the White Yeti
Processor: 7800X3D
Motherboard: MSI MAG Mortar B650M WiFi
Cooling: CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory: 32GB Corsair Vengeance 30CL6000
Video Card(s): ASRock RX 7900 XT Phantom Gaming
Storage: Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s): Gigabyte G34QWC (3440x1440)
Case: Lian Li A3 mATX White
Audio Device(s): Harman Kardon AVR137 + 2.1
Power Supply: EVGA SuperNOVA G2 750W
Mouse: SteelSeries Aerox 5
Keyboard: Lenovo ThinkPad TrackPoint II
VR HMD: HD 420 - Green Edition ;)
Software: W11 IoT Enterprise LTSC
Benchmark Scores: Over 9000
Looking at that first picture, what the hell was NVIDIA thinking?

You only need one look at those Pascal and Turing boards to see that RTX is the most wasteful practice in GPU history, with the most meagre payoff ever. They took years and several generations to fine-tune Kepler, and then they do this. o_O

I've seriously questioned whether we should be paying for all of that waste since day one. Still not convinced.
 
Joined: Nov 29, 2016
Messages: 671 (0.23/day)
System Name: Unimatrix
Processor: Intel i9-9900K @ 5.0GHz
Motherboard: ASRock Z390 Taichi Ultimate
Cooling: Custom Loop
Memory: 32GB G.Skill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s): EVGA 2080 with Heatkiller Water Block
Storage: 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s): Alienware 34" Ultrawide 3440x1440
Case: Cooler Master P500M Mesh
Power Supply: Seasonic Prime Titanium 850W
Keyboard: Corsair K75
Benchmark Scores: Really Really High
Priceless. I am sure nVidia is paying more due to the size. Personally, I believe they should not have released it; I guess they wanted it out before AMD releases their own version of ray tracing.

Yeah, well done nVidia. Awesome marketing strategy, knowing full well the card would be expensive to begin with, knowing full well there's the nVidia tax, and then 3rd parties have to find a way to make a profit too.

They are talking about AMD CPUs.

So nearly doubling the component count wouldn't increase the cost and production time? I guess you're not that familiar with SMT/SMD production?
Do you think all these "little" parts are free?

[images: RTX 2080 Ti PCB]
Just compare that to the 1080 Ti and you'll see that the power delivery circuitry alone is a lot more complex.

[images: GTX 1080 Ti PCB]
Instead of normal solid-state capacitors, they're using what look like tantalum capacitors, which are some 4-5x more expensive for starters.
GDDR6 is most likely more expensive than GDDR5X, as it's "new" technology and production has most likely not fully ramped up.
Obviously the GPU itself is more expensive, as it's a bigger chip, but that one is on Nvidia.
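
To put rough numbers on that, here's a quick back-of-envelope sketch in Python. Every count and unit price below is an illustrative assumption, not actual BOM data; the point is just how per-part differences compound across a board:

```python
# Rough BOM-cost sketch. All counts and unit prices are illustrative
# assumptions, not real figures from either card's bill of materials.

def board_cost(components: dict) -> float:
    """Sum count * unit_price over all component types."""
    return sum(count * price for count, price in components.values())

# component: (count, assumed unit price in USD)
gtx_1080_ti = {
    "tantalum_caps": (10, 0.25),
    "solid_caps":    (40, 0.05),
    "power_stages":  (7,  0.60),
}
rtx_2080_ti = {
    "tantalum_caps": (30, 0.25),  # ~3x the count, per the rough count above
    "solid_caps":    (50, 0.05),
    "power_stages":  (13, 0.60),
}

print(f"1080 Ti power delivery (assumed): ${board_cost(gtx_1080_ti):.2f}")
print(f"2080 Ti power delivery (assumed): ${board_cost(rtx_2080_ti):.2f}")
```

The exact numbers don't matter; even with made-up prices, the gap grows with every extra component.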

Then take into account that the machines picking and placing the components can only operate so fast. A decent SMT machine today can do 100,000 components per hour, though this also depends on the PCB layout and the type of components. A PCB normally goes through a couple of these machines, and in between there are reflow ovens that the boards have to pass through to solder the components to the board. The production lines aren't exactly moving at more than a snail's pace, so it takes time to do these things.

Once the boards have had all the components placed and soldered, they're manually checked, and these days most likely machine checked as well, for any issues with component placement and soldering. This takes time. Then the parts that have to be added by hand are added, and the boards go through yet another reflow oven to solder those parts in place. Once that's done, you have people fitting the cooling to the cards, which is obviously a manual job that takes time. Then you have to test the cards to make sure they're working properly. Normally there should be a burn-in test that can take 24h or even longer.

So the more complex the PCB is, the longer it takes to make and the more expensive it gets.
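
In the same spirit, a hedged sketch of pure pick-and-place time, using the 100,000 components per hour figure from above (the component counts are made up for illustration):

```python
# Back-of-envelope placement-time estimate. Throughput is from the
# post above; component counts and line length are assumptions.

PLACEMENTS_PER_HOUR = 100_000  # a decent SMT machine today

def placement_seconds(component_count: int, machines_in_line: int = 2) -> float:
    """Pure pick-and-place time per board, with work split across the line."""
    per_machine = component_count / machines_in_line
    return per_machine / PLACEMENTS_PER_HOUR * 3600

simple_board = placement_seconds(800)    # assumed count for a 1080 Ti-class PCB
complex_board = placement_seconds(1500)  # assumed count for a 2080 Ti-class PCB

print(f"~{simple_board:.0f}s vs ~{complex_board:.0f}s of placement per board")
# And that's before reflow, inspection, hand-soldered parts, cooler
# assembly, functional testing and any 24h burn-in.
```

Seconds per board sounds trivial until you multiply it across an entire production run, plus all the steps above that don't scale down at all.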


THANK YOU. People think that things should be given to them for free, and that if a card is complex and advanced, "they should not have released it". WTF.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined: Oct 13, 2008
Messages: 26,259 (4.44/day)
Location: IA, USA
System Name: BY-2021
Processor: AMD Ryzen 7 5800X (65W eco profile)
Motherboard: MSI B550 Gaming Plus
Cooling: Scythe Mugen (rev 5)
Memory: 2x Kingston HyperX DDR4-3200 32 GiB
Video Card(s): AMD Radeon RX 7900 XT
Storage: Samsung 980 Pro, Seagate Exos X20 20 TB 7200 RPM
Display(s): Nixeus NX-EDG274K (3840x2160 @ 144 Hz DP) + Samsung SyncMaster 906BW (1440x900 @ 60 Hz HDMI-DVI)
Case: Cooler Master HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s): Realtek ALC1150, Micca OriGen+
Power Supply: Enermax Platimax 850W
Mouse: Nixeus REVEL-X
Keyboard: Tesoro Excalibur
Software: Windows 10 Home 64-bit
Benchmark Scores: Faster than the tortoise; slower than the hare.
Look how relatively barren Vega Frontier Edition is by comparison:
[image: Vega Frontier Edition PCB]
Or the Sapphire RX 580 Nitro+:
[image: Sapphire RX 580 Nitro+ PCB]
NVIDIA is not making any AIB friends with the Turing cards.
 
Joined: Nov 29, 2016
Messages: 671 (0.23/day)
System Name: Unimatrix
By nVidia, which, surprisingly, is somehow able to get (fewer than) 100 cards to reviewers, even though they are sold out in most shops.

How could nVidia achieve that? Mysterious...

If a company can't even spare 100 cards for marketing, to try to sell a whole product lineup, they have issues.

As so often happens in this industry, they let the engineers do the thinking, but clearly didn't take the time to optimise the design.
It's also a rather power hungry chip, so it needs good power delivery, which means a more expensive board.
I did a rough count and it looks like the RTX 2080 Ti has 3x as many tantalum capacitors as the GTX 1080 Ti, both being MSI cards in this case.
Even the Titan X (Pascal) looks simple in comparison, which makes the RTX 2080 Ti look like a huge failure from a design standpoint.
I wouldn't be surprised if this is the most complex consumer card that Nvidia has made to date, and I'm not talking about the GPU itself, but rather the PCB design.

Did you look at the Titan V design? You said they "didn't take the time to optimise the design", but why wouldn't they? They had time, and while you think it's not optimised, it probably is.
 

TheLostSwede

News Editor
Did you look at the Titan V design? You said they "didn't take the time to optimise the design", but why wouldn't they? They had time, and while you think it's not optimised, it probably is.

It does indeed look like they borrowed a lot from the Titan V power design. Obviously that card uses HBM, so you'd expect it to be quite different, yet it doesn't seem to be all that different.
Normally the third-party board makers tend to improve upon the reference designs, be it from Nvidia or AMD, hence my comment about optimisation.
That said, things aren't always improved upon. Read @W1zzard's comments in older graphics card reviews about the fact that the main voltage controller always seems to be a "budget" part that doesn't support things like I2C for monitoring. This is not just done on the lower-end cards, but on flagship cards as well. There's really no reason to cut corners on things like this, but apparently Nvidia could save a quarter per card, so they decided to pocket the difference. On occasion, the third-party board makers add better components, but with no software support, that doesn't always add features.

It's possible that we're looking at a power hog here, but GDDR6 was supposed to have lower power draw than GDDR5/5X, which should result in lower overall card power draw. In other words, the only reason for the rather extreme power regulation would be the GPU itself drawing substantially more power than the equivalent Pascal-based GPU.
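
For what it's worth, here's roughly what I2C monitoring buys you when the controller does support it. A minimal PMBus-style sketch using the Python smbus2 library; the bus number and device address below are placeholders, not any real card's values:

```python
# Minimal PMBus-over-I2C telemetry sketch using smbus2.
# Bus number and 7-bit address are placeholders; a "budget" controller
# without an I2C interface simply can't expose any of this to software.
from smbus2 import SMBus

I2C_BUS = 1       # placeholder bus number
VRM_ADDR = 0x40   # placeholder 7-bit device address

READ_VOUT = 0x8B  # standard PMBus command codes
READ_IOUT = 0x8C

with SMBus(I2C_BUS) as bus:
    raw_vout = bus.read_word_data(VRM_ADDR, READ_VOUT)
    raw_iout = bus.read_word_data(VRM_ADDR, READ_IOUT)
    # Decoding depends on the controller's data format (LINEAR16/LINEAR11),
    # so these are left as raw register values.
    print(f"raw VOUT=0x{raw_vout:04x}, raw IOUT=0x{raw_iout:04x}")
```

Without that interface, monitoring software has nothing to query on that rail, which is exactly the corner-cutting being described.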

Just looking at the board layouts, I get a weird feeling that something didn't quite go as planned on Nvidia's side, at least not when it came to raw performance. Whereas Pascal performed well enough that Nvidia could move GPUs down a tier compared to past architectures, it seems Turing worked out the other way around. As such, the RTX 2080 Ti should've been the Titan card, but due to its weak performance uplift over Pascal, it sits where it sits today, is costly to produce and also retails at a Titan-level price point.
This is the first time since at least the 400-series that we've seen a top-of-the-range card retail for over $1,000 (OK, the MSRP is $999, but that's unlikely to hold).
Sure, that's over an eight-year time period and we need to consider things like inflation, but in this case, it's an overly inflated price for an overly complex and underperforming part. On top of that, top-of-the-range Ti cards have only existed since the 700-series, but for three generations they never surpassed $699.
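
A quick sanity check on the inflation argument (assuming roughly 2% average annual inflation, which is an assumption rather than an official CPI figure):

```python
# What $699 in 2013 (GTX 780 Ti era) is worth in 2018 dollars,
# assuming ~2% average annual inflation - an assumption, not CPI data.
RATE = 0.02
adjusted = 699 * (1 + RATE) ** (2018 - 2013)
print(f"$699 in 2013 ~ ${adjusted:.0f} in 2018")  # roughly $772, nowhere near $999+
```

So even being generous with inflation, the pricing doesn't add up.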

Yes, Nvidia has added a ton of new technology, and even though no-one was really asking for any of it in a consumer graphics card today, it does help push the envelope. Unfortunately, it seems to have been a push too far, too soon.
 