Thursday, September 22nd 2022
Jensen Huang Tells the Media That Moore's Law is Dead
NVIDIA's CEO has gone out on a limb during a video call with the media, claiming that Moore's Law is dead in response to questions about the high asking prices of the company's latest graphics cards. For those not familiar with Moore's Law, it's an observation by Intel co-founder Gordon Moore that the number of transistors in a dense integrated circuit doubles roughly every two years, while the cost of computers is halved. The follow-on to this observation is that performance also doubles every two years at the same cost. That part no longer quite holds true, as all major foundries have raised the prices of their cutting-edge nodes, and we're reaching a point where it's getting increasingly difficult to shrink process nodes in semiconductor fabs. However, Jensen Huang's statement has nothing to do with the actual node shrinks, which makes it somewhat flawed.
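For context, Moore's observation is a simple compounding rule. Here is a minimal, purely illustrative sketch in Python; the 100 MTr/mm² starting density is an arbitrary example figure, not a quoted one:

```python
def projected_density(base_density: float, years: float,
                      doubling_period: float = 2.0) -> float:
    """Density after `years`, assuming a doubling every `doubling_period` years."""
    return base_density * 2 ** (years / doubling_period)

# A node at 100 MTr/mm^2 today would, if the trend held for a decade,
# reach 100 * 2^(10/2) = 3200 MTr/mm^2.
print(projected_density(100, 10))  # 3200.0
```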
Jensen's focus seems to be on the latter half of Moore's Law, the part about semiconductors getting cheaper, which in turn makes computers cheaper. That hasn't been true for some time now, and Jensen's argument is that NVIDIA's cost of making semiconductors has gone up. Jensen is quoted as saying "A 12-inch wafer is a lot more expensive today than it was yesterday, and it's not a little bit more expensive, it is a ton more expensive," adding that "Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past." What he appears to mean is that we shouldn't expect semiconductors to be as cheap as they've been in the past, although part of NVIDIA's problem is that its products have to be produced on cutting-edge nodes, which cost significantly more than mature ones. It'll be interesting to see if AMD can deliver graphics chips and cards at a more competitive price point than NVIDIA, as that would refute some of Jensen's claims.
Sources:
Barron's, MarketWatch
94 Comments on Jensen Huang Tells the Media That Moore's Law is Dead
We all know why NVLink and SLI are dead: NVIDIA didn't want people buying cheap cards and linking them via SLI or NVLink to get the performance of its most expensive card. I was laughing when they showed DLSS working at 1440p at 135 FPS on some games, really. I can push over 100 FPS at 2160p on two 1070 Tis in SLI, and with some ReShade tweaking make it look just the same.
It's simple, if we can trust the leather man (the man, not the jacket); the core-count arithmetic is sketched in code after this list:
4090 $1,600 = 2x 3090 Ti
4080 16 GB $1,200 = 2x 3080 Ti (only in the leather man's dreams; the 4080 has 40% fewer cores than the 4090, so 2x 3080 at most)
4080 12 GB $900 = 53% fewer cores than the 4090, so 3090 Ti performance at most
Now, what if there were a 4070 and a 4060:
4070 $700 - a cut-down AD104 with ~48 SMs, 20% fewer than the 4080 12 GB => 3090 Ti minus 20% = slower than a 3080, which also cost $700 two years ago; unsurprisingly, this card was not announced
4060 $500 - AD106, probably 20% slower than the 4070 => 3080 minus 20% = a 3070, which also cost $500 two years ago
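For what it's worth, here's a minimal sketch of that core-count arithmetic in Python, assuming performance scales linearly with core count within a generation (a big simplification; clocks, memory bandwidth, and architecture all matter):

```python
# Announced Ada CUDA core counts; performance treated as proportional
# to core count, which is a naive first-order estimate.
cores = {
    "RTX 4090": 16384,
    "RTX 4080 16GB": 9728,
    "RTX 4080 12GB": 7680,
}

flagship = cores["RTX 4090"]
for name, n in cores.items():
    print(f"{name}: {n / flagship:.0%} of the 4090's cores")
```

The output (100%, 59%, 47%) lines up with the "40% fewer" and "53% fewer" figures quoted above.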
1. Whether consumers can take some of that power away hugely depends on the other contenders in the GPU market. If AMD prices RDNA 3 the way Nvidia priced Ada, then we're F-ed.
2. Nvidia has a huge fan base. I know people who would never ever buy an AMD card even if it cost half as much as a similar offering from Nvidia. Nvidia can afford to rely on these people and raise prices and margins so high that they stay afloat even with only a handful of cards sold.
The ultimate power is always with the common people... the majority of whom are ignorant as hell.
And Huang has been acting exactly like Palpatine since day one: subversive, manipulative, and driven to rule.
Poor deflection, it's not competition that caused it.
As for your math, it's unfortunately not that simple. Process nodes have gotten so complicated that the traditional nanometer metric no longer applies (with stuff like FinFETs, EUV, wider metal pitches, and who knows what else; I don't even understand all of it). That's why they came up with those weird names.
TSMC's N5 has triple the density of Samsung's 8N, and N4 is even slightly denser. There is also a separate 4N process that NVIDIA is using here; I don't know how it differs from N4.
But AD102 packs 2.7x more transistors into a slightly smaller die than GA102. With die sizes this similar and the newer node's wafers costing far more, Ada chips are probably over 2x more expensive to make right now (a back-of-the-envelope sketch follows).
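Here's that back-of-the-envelope math in Python. The die areas are published figures, but the wafer prices are rough public estimates (assumptions), not NVIDIA's actual contract prices, and yield is ignored entirely:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation for usable dies on a round wafer (yield ignored)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Wafer prices below are assumed ballpark figures, not disclosed numbers.
for name, area, wafer_cost in [("GA102 (Samsung 8N)", 628, 5000),
                               ("AD102 (TSMC 4N)", 608, 17000)]:
    dies = dies_per_wafer(area)
    print(f"{name}: {dies} dies/wafer, ~${wafer_cost / dies:.0f} per die")
```

With those assumed wafer prices, the per-die cost roughly triples (~$58 vs. ~$191), which is in the same ballpark as the "over 2x" estimate above.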
The cost increase is there, but their margins are another thing. As I said in another thread, maybe it is time to go back to making smaller and less power hungry dies.
Give up performance increases for one generation and instead use the new process's efficiency to make the flagship consume 250 W, sell it for $600-700, and call it a new generation of environmentally friendly technology. Wouldn't that be a win in a world struggling for resources?
I think this is what AMD realized. I think they will let NVIDIA keep the pointless performance crown while they sell extremely efficient and affordable cards. I cannot wait to find out.
- Radeon RX 6950 XT - $949 (Was $1099 US)
- Radeon RX 6900 XT - $699 (Was $999 US)
- Radeon RX 6800 XT - $599 (Was $649 US)
- Radeon RX 6800 - $549 (Was $579 US)
- Radeon RX 6750 XT - $419 (Was $549 US)
- Radeon RX 6700 XT - $379 (Was $479 US)
- Radeon RX 6650 XT - $299 (Was $399 US)
- Radeon RX 6600 XT - $239 (Was $329 US)
- Radeon RX 6500 XT - $169 (Was $199 US)
- Radeon RX 6400 - $149 (Was $159 US)
Observations:
1) Price cuts across the board = making space for the new generation at the old price slots (plus some inflation)?
2) No absurd price cuts at the high end = AMD supply chain likely healthy?
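For reference, a quick sketch computing how deep each cut is, using only the prices listed above:

```python
# (old price, new price) pairs taken straight from the list above.
cuts = {
    "RX 6950 XT": (1099, 949), "RX 6900 XT": (999, 699),
    "RX 6800 XT": (649, 599),  "RX 6800":    (579, 549),
    "RX 6750 XT": (549, 419),  "RX 6700 XT": (479, 379),
    "RX 6650 XT": (399, 299),  "RX 6600 XT": (329, 239),
    "RX 6500 XT": (199, 169),  "RX 6400":    (159, 149),
}
for card, (old, new) in cuts.items():
    print(f"{card}: ${old} -> ${new} ({(old - new) / old:.0%} off)")
```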
If new processes drive up costs and make GPUs extremely expensive, then where are the small, efficient, and relatively affordable chips? In a world where everybody is constantly told by governments and companies to downsize their personal needs (buy blankets instead of paying for heating, pff), why is a company like Nvidia still hell-bent on making large, power-hungry GPUs that no one can afford? I wouldn't count on that changing. If the news about the 7900 XT being a multi-chiplet design with 10k+ shaders is true, it will be just as much of a niche product as the 4090. But we'll see.
The only thing I'm happy about in the current situation is that game technologies don't evolve fast enough to force me to buy a new graphics card right now. Even my 6500 XT can run everything I want, and if a new game I want comes out with bigger hardware requirements, I'll just dust off my noisy 2070 and call it a day.
But the point of my ramble was: show us your invoices, Nvidia, so we know what "considerable" actually is. :)
Or just say 2x or something. Considerable to me would be 5x or more.
Either way, we're all still waiting for prices to return to pre-pandemic levels for pretty much everything.
Intel had the same thing. They used to sell Extreme Edition CPUs for $1,000 on mainstream sockets, then limited that to their HEDT platforms. And then AMD killed Intel's HEDT, as Intel could barely keep up in the mainstream segment.
But whatever you might want to say about Intel, they have kept their entry-level and mainstream prices on the same level for ages. Ever since Core 2 Duo, you could always buy a CPU based on the newest architecture under $200, even when AMD had nothing.
And you can't say the same about AMD. It took 1.5 years for Zen 3 to go under $200, and it looks like Zen 4 will repeat that.
16 years of inflation and $200 still buys you a CPU that is suitable for high-framerate gaming and decent productivity.
But with Ada Lovelace, the entire initial line-up has to be considered niche and out of reach.
FFS, we have three main resolutions to game at: 1080p, 1440p, and 4K. Why the hell does NVidia need so many different tiers/models? The most I can see a need for is five, with two of them as intermediates between 1080p and 1440p, and between 1440p and 4K. It's as if these graphics card companies have some kind of fetish for bigger, heavier, moar tiers/models.
Jensen Huang tells the media (https://www.techpowerup.com/299159/jensen-huang-tells-the-media-that-moores-law-is-dead) that he couldn't give two fucks about EVGA leaving the fold, and doesn't really care about consumers that much either; in fact, he'd much rather we all fucked off and left him alone to play in his own little 3D realm.