Thursday, September 22nd 2022

Jensen Huang Tells the Media That Moore's Law is Dead

NVIDIA's CEO went out on a limb during a video call with the media, claiming that Moore's Law is dead in response to questions about the high asking prices of the company's latest graphics cards. For those not familiar with it, Moore's Law is an observation by Intel co-founder Gordon Moore that the transistor density of integrated circuits doubles roughly every two years, while at the same time the cost of computers is halved. A follow-on to this observation is that performance at a given cost also doubles every two years. This part no longer quite holds true, as all major foundries have raised the price of their cutting-edge nodes, and we're reaching a point where it's getting increasingly difficult to shrink process nodes further. However, Jensen Huang's statement has nothing to do with the actual node shrinks, which makes it somewhat flawed.

Jensen's focus seems to be on the latter half of Moore's Law, the part about semiconductors, and thus computers, getting cheaper. That hasn't been true for some time now, and Jensen's argument is that NVIDIA's cost of making semiconductors has gone up. He is quoted as saying: "A 12-inch wafer is a lot more expensive today than it was yesterday, and it's not a little bit more expensive, it is a ton more expensive," and "Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past." What he presumably meant is that we shouldn't expect semiconductors to be as cheap as they've been in the past, although part of NVIDIA's problem is that its products have to be produced on cutting-edge nodes, which cost significantly more than mature ones. It will be interesting to see whether AMD can deliver graphics chips and cards at a more competitive price point than NVIDIA, as that would refute some of Jensen's claims.
Sources: Barron's, MarketWatch

94 Comments on Jensen Huang Tells the Media That Moore's Law is Dead

#51
steen
This is what restricting sell-through & layering on top looks like.
Posted on Reply
#52
RedBear
BSim500Either nVidia get it together and refocus on significantly improving 4050 / 4060 core rasterization performance & efficiency at a sane price and correct their current "direction", or they can GTFO.
You just need to look at the Steam numbers to understand why Nvidia can actually ignore that segment: the RX 6600, released in 2021, has a 0.27% share, while the RTX 3050 has a respectable 1.62% and sits in 17th place in the overall chart. Not bad for a (bad) GPU released this year...
Posted on Reply
#53
N3M3515
RedBearYou just need to look at the numbers on Steam in order to understand why Nvidia can actually ignore that segment, the RX 6600 was released in 2021 and it gets a 0.27% share, the RTX 3050 meanwhile gets a respectable 1.62% and it's at the 17th place in the overall chart, not bad for a (bad) GPU released this year...
That's crazy, the 6600 is considerably cheaper and way faster than the 3050
Posted on Reply
#54
mplayerMuPDF
hatHey, I'm all for optimized and minimalistic software. Imagine how well things would run without a bunch of unnecessary bloat? That would be so much better, and not just for performance reasons...
I know how well things run without unnecessary bloat because I run a very lightweight Linux setup on all my computers (and that includes uBlock Origin to lay waste to the scourge of unnecessary and privacy invading JavaScript scripts) :) . In fact, my username is a reference to this (mplayer is a lightweight video player and MuPDF is a minimalistic PDF viewer; both are cross platform, by the way). My dual core Richland (1 module, so really more like 1.5-1.75 core) ProBook would be unusable without this setup.
Posted on Reply
#55
mama
RedBearYou just need to look at the numbers on Steam in order to understand why Nvidia can actually ignore that segment, the RX 6600 was released in 2021 and it gets a 0.27% share, the RTX 3050 meanwhile gets a respectable 1.62% and it's at the 17th place in the overall chart, not bad for a (bad) GPU released this year...
:banghead:
Posted on Reply
#56
Lycanwolfen
MentalAcetylideJensen Huang is just trying to cloud the facts with BS. He wants bigger & heavier cards. I'm sure if he could make them out of something like osmium metal or depleted uranium, he would be doing it. The only thing dead is the cow that made his leather jacket.
Yep, and what happened the last time they went with big and heavy cards? People lost interest in Nvidia, so they changed direction and made smaller but faster cards. Now we're back to big and bulky cards again. Freaking joke.

We all know why NVLink and SLI are dead: Nvidia did not want people buying cheap cards, linking them with SLI or NVLink, and getting the performance of the most expensive card. I was laughing when they showed DLSS running at 1440p at 135 FPS on some games. Really? I can push over 100 FPS at 2160p on two 1070 Tis in SLI, and with ReShade tweaks make it look just the same.
Posted on Reply
#57
ixi
Sitting here waiting for RX 7000 GPUs. I didn't like their idea in the showcase: a 4080 for $899 while the RTX 3060 is $329... comparing a next-gen GPU against a previous-gen low-end GPU. Is there hidden fine print, like the 4060 only coming out after a year?
Posted on Reply
#58
Hyderz
RTX4090 beyond fast! beyond expensive!
Posted on Reply
#59
usiname
ixiSitting here and waiting for rx 7000 gpu's. Didnt like their idea in showcase. 4080 for 899 while rtx 3060 for 329... comparing next gen gpu against previous gen low end gpu. Is there hidden text? Like 4060 will come out after 1 year?
The hole in the new gen line-up exists because it would overlap in performance/$ with the last gen.
It's simple, if we can trust the leather man (not the jacket):
4090 $1600 = 2x 3090 Ti
4080 16 GB $1200 = 2x 3080 Ti (only in the leather man's dreams; the 4080 has 40% fewer cores than the 4090, so 2x 3080 at most)
4080 12 GB $900 = 53% fewer cores than the 4090, so 3090 Ti performance at most
Now, what if there were a 4070 and a 4060:
4070 $700 - a cut-down AD104 with ~48 SMs, 20% less than the 4080 12 GB => 3090 Ti - 20% = slower than a 3080, which also cost $700 two years ago; not surprisingly, this card was not released
4060 $500 - AD106, probably 20% slower than the 4070 => 3080 - 20% = a 3070, which also cost $500 two years ago
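The core-count arithmetic above can be sketched as a quick back-of-the-envelope check (a rough sketch: the core counts are NVIDIA's announced figures for the three Ada SKUs, but the assumption that performance scales linearly with core count is the poster's, not a measured result):

```python
# Back-of-the-envelope SKU comparison, assuming performance scales
# linearly with CUDA core count (an assumption, not a benchmark).
cores = {
    "RTX 4090": 16384,
    "RTX 4080 16GB": 9728,
    "RTX 4080 12GB": 7680,
}

flagship = cores["RTX 4090"]
for name, count in cores.items():
    deficit = 1 - count / flagship
    print(f"{name}: {count} cores, {deficit:.0%} fewer than the 4090")
```

This reproduces the roughly 40% and 53% core deficits cited above for the two 4080 variants.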
Posted on Reply
#60
AusWolf
If this claim were true, we would be seeing a never-before-seen blooming of the low-end graphics card market. Instead, the only truly low-end cards we have are the Radeon RX 6400 and 6500 XT, and some parts of the world have the Intel Arc A380. Where's the rest? Does the "Moore's law is dead" statement also imply that only high-end graphics cards can be sold? And what about architectural improvements? All I can see in Ada is a slightly improved Tensor core; the rest of the design is basically Turing 3.0 with more shaders.
Posted on Reply
#61
Bomby569
I like Nvidia, but their market share is doing to them what the i3/i5/i7 era did to Intel for a long time. They've become an evil empire, a money-sucking leech.
Posted on Reply
#62
mama
Bomby569I like Nvidia but their market share is doing to them what the i3/i5/i7 did for Intel for a long time. They became evil empire, a money sucking leach
They're only an "empire" because consumers empowered them. That power can be taken from them. If that happens, they will be humble and hungry to please again.
Posted on Reply
#63
TheoneandonlyMrK
Bomby569I like Nvidia but their market share is doing to them what the i3/i5/i7 did for Intel for a long time. They became evil empire, a money sucking leach
Became!!, They usurped Intel recently but they're old masters for sure.
Posted on Reply
#64
AusWolf
mamaThey're only an "empire" because consumers empowered them. That power can be taken from them. If that happens, they will be humble and hungry to please again.
The problems with that are...
1. Whether consumers can take some of that power away hugely depends on the other contenders in the GPU market. If AMD prices RDNA 3 in a similar fashion to Nvidia's Ada, then we're F-ed.
2. Nvidia has a huge fan base. I know people who would never, ever buy an AMD card, even if it cost half as much as a similar offering from Nvidia. Nvidia can afford to rely on these people and to raise prices and margins so high that they stay afloat even with only a handful of cards sold.

The ultimate power always rests with the common people... the majority of whom are ignorant as hell.
Posted on Reply
#65
Bomby569
TheoneandonlyMrKBecame!!, They usurped Intel recently but they're old masters for sure.
They had power in the past too; the 10 series had no competition, and things didn't look this close to the dark side.
Posted on Reply
#66
TheoneandonlyMrK
Bomby569they had power in the past, the 10 series had no competition and things didn't look this close to the dark side
No, I disagree; I don't see the relevance of the 10 series.
Huang has been acting exactly like Palpatine since day one: subversive, manipulative and driven to rule.
It's a poor deflection; it's not competition that caused this.
Posted on Reply
#67
THU31
mechtech"A 12-inch wafer is a lot more expensive today than it was yesterday, and it's not a little bit more expensive, it is a ton more expensive,"

OK, yes... but I don't see an invoice saying how much more.

Also, as an oversimplification with simple math: say you go from an 8 nm process to a 4 nm one. 8x8 = 64 nm² and 4x4 = 16 nm², a factor of 4, so if the same chips were fabbed (neglecting yields, etc.), a 4 nm wafer should yield 4x the chips of an 8 nm wafer. So if the 4 nm wafer costs 4x (400%) more than the 8 nm wafer, the end price per chip should be roughly the same. Now of course, if the transistors double...
TSMC stated that the N5/N4 cost is about 2x higher than N7. And Samsung's 8N was cheaper than that (that was effectively a 10 nm class process).

As for your math, unfortunately it is not that simple. Process nodes have gotten so complicated that the traditional nanometer metric does not apply anymore (with stuff like FinFETs, EUV, wider metal pitch and who knows what else; I do not even understand all of it). That is why they came up with those weird names.
TSMC's N5 has triple the density of Samsung's 8N, and N4 is even slightly denser. There is also a separate 4N process that NVIDIA is using here, and I do not know how it differs from N4.

But AD102 has 2.7x more transistors in a slightly smaller die compared to GA102. With the die sizes being so similar, Ada chips are currently about 2x more expensive to make.
The cost increase is there, but their margins are another thing. As I said in another thread, maybe it is time to go back to making smaller and less power hungry dies.

Give up performance increases for one generation; instead, utilize the efficiency of the new process, make the flagship model consume 250 W, sell it for $600-700 and call it a new generation of environmentally friendly technology. Would that not be a win in a world struggling for resources?
I think this is what AMD realized. I think they will let NVIDIA keep the pointless performance crown while they sell extremely efficient and affordable cards. I cannot wait to find out.
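The die-size and wafer-cost argument above can be turned into a rough cost-per-die sketch (illustrative only: the wafer prices below are placeholder assumptions, not published TSMC or Samsung figures, and defect yield is ignored; only the ~2x ratio between the two nodes matters here):

```python
import math

WAFER_DIAMETER_MM = 300  # a "12-inch" wafer


def dies_per_wafer(die_area_mm2: float) -> int:
    """Standard die-per-wafer estimate, ignoring defect yield."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    # Second term approximates candidate dies lost at the wafer edge.
    edge_loss = math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)


def cost_per_die(wafer_cost: float, die_area_mm2: float) -> float:
    return wafer_cost / dies_per_wafer(die_area_mm2)


# GA102 (~628 mm^2, Samsung 8N) vs. AD102 (~608 mm^2, TSMC 4N),
# assuming a 4N wafer costs about 2x an 8N wafer, per the post above.
ga102 = cost_per_die(wafer_cost=6000, die_area_mm2=628)   # assumed price
ad102 = cost_per_die(wafer_cost=12000, die_area_mm2=608)  # assumed 2x price
print(f"GA102 ~${ga102:.0f}/die, AD102 ~${ad102:.0f}/die, "
      f"ratio {ad102 / ga102:.1f}x")
```

With die sizes this similar, the per-die cost ratio tracks the wafer-cost ratio almost exactly, which is the point being made above.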
Posted on Reply
#68
Dimitriman
Official AMD price cuts:
  • Radeon RX 6950 XT - $949 (Was $1099 US)
  • Radeon RX 6900 XT - $699 (Was $999 US)
  • Radeon RX 6800 XT - $599 (Was $649 US)
  • Radeon RX 6800 - $549 (Was $579 US)
  • Radeon RX 6750 XT - $419 (Was $549 US)
  • Radeon RX 6700 XT - $379 (Was $479 US)
  • Radeon RX 6650 XT - $299 (Was $399 US)
  • Radeon RX 6600 XT - $239 (Was $329 US)
  • Radeon RX 6500 XT - $169 (Was $199 US)
  • Radeon RX 6400 - $149 (Was $159 US)
Observations:
1) Price cuts across the board = Making space for new generation at the old price slots (+some inflation)?
2) No absurd price cuts at the high end = AMD supply chain likely healthy?
Posted on Reply
#69
AusWolf
THU31As I said in another thread, maybe it is time to go back to making smaller and less power hungry dies.
This!
If new processes drive up costs and make GPUs extremely expensive, then where are the small, efficient and relatively affordable chips? I mean, in a world where everybody is constantly being told by governments and companies to downsize their personal needs (buy blankets instead of paying for heating, pff), why is a company like Nvidia still hell-bent on making large, power-hungry GPUs that no one can afford?
THU31Give up performance increases for one generation, instead utilize the efficiency of the new process and make the flagship model consume 250 W, sell it for $600-700 and call it a new generation of environment friendly technology. Would that not be a win in a world struggling for resources?
I think this is what AMD realized. I think they will let NVIDIA keep the pointless performance crown while they sell extremely efficient and affordable cards. I cannot wait to find out.
I wouldn't count on it. If the news about the 7900 XT being a multi-chip design with 10k+ shaders is true, then it will be just as much of a niche product as the 4090. But we'll see.

The only thing I'm happy about in the current situation is that game technologies aren't evolving fast enough to force me to buy a new graphics card right now. Even my 6500 XT can run everything I want, and if a new game I want comes out with bigger hardware requirements, I'll just dust off my noisy 2070 and call it a day.
Posted on Reply
#70
mechtech
THU31TSMC stated that the N5/N4 cost is about 2x higher than N7. And Samsung's 8N was cheaper than that (that was effectively a 10 nm class process).

As for your math, unfortunately it is not that simple. Process nodes have gotten so complicated, that the traditional nanometer metric does not apply anymore (with stuff like finfets, EUV, wider metal pitch and who knows what else, I do not even understand any of it). That is why they came up with those weird names.
TSMC's N5 has triple the density over Samsung's 8N. The N4 even slightly better than that. There is also a separate 4N process that NVIDIA is using here, and I do not know what the difference is compared to N4.

But AD102 has 2.7x more transistors with a slightly smaller die compared to GA102. The die size being so similar means that Ada chips are over 2x more expensive to make currently.
That’s why I said it was an oversimplification. :)
But the point of my ramble was: show us your invoices, Nvidia, so we know what “considerable” actually means. :)
Or just say 2x or something. “Considerable” to me would be 5x or more.

Either way we’re all still waiting for prices to return to pre-pandemic levels for pretty much everything.
Posted on Reply
#71
THU31
AusWolfI wouldn't count on it. If news about the 7900 XT being a multi-chip die with 10k+ shaders are true, then it will be just as much of a niche product as the 4090. But we'll see.
I never had a problem with niche top-end parts existing. You had cards with two GPUs, then you had Titans. But those cards were completely disconnected from the main line-up. The x80 was always considered the flagship and it was always within reach for enthusiasts.

Intel had the same thing. They used to have Extreme Edition CPUs for $1000 on mainstream sockets, then limited this to their HEDT platforms. And then AMD killed Intel's HEDT, as Intel could barely keep up even in the mainstream segment.

But whatever you might want to say about Intel, they have kept their entry-level and mainstream prices on the same level for ages. Ever since Core 2 Duo, you could always buy a CPU based on the newest architecture under $200, even when AMD had nothing.
And you cannot say this about AMD. It took 1.5 years for Zen 3 to go under $200. And it looks like Zen 4 will be repeating this.
16 years of inflation and $200 still buys you a CPU that is suitable for high-framerate gaming and decent productivity.

But with Ada Lovelace, the entire initial line-up has to be considered niche and out of reach.
Posted on Reply
#72
MentalAcetylide
LycanwolfenYep and what happened when they went that way before big and heavy cards people lost interest in Nvidia, Then they went into a new direction of smaller but faster cards. Then now we back to big and bulky cards again. Freaking joke.

We all know why nvlink and SLI is dead because Nvidia did not want people to buy any cheap cards to SLI them or use nvlink and get the same performance of the most expensive card. I was laughing when they showed the DLSS working at 1440p at 135 fps on some games really. I can push over 100 fps at 2160p on two 1070ti's in SLI and use reshade me tweaker and make it look just the same.
Yeah, that's what happens when their video card line-up consists of almost a dozen different models. When people can buy two of the same video card and use them together to get equal or better performance at a lower price than a single card one or two tiers above it, there's something wrong with that business model. I could be wrong, but I think having so many different models, each targeting users with different budgets, makes all of the cards more expensive; never mind the fact that we're also paying an exorbitant NVidia tax. I say this because I assume it costs more to manufacture a greater variety of cards than a smaller one.
FFS, we have three resolutions to game at: 1080p, 1440p and 4K. Why the hell does NVidia need so many different tiers/models? The most I can see a need for is five tiers, with two of them being intermediates between 1080p and 1440p, and between 1440p and 4K. It's as if these graphics card companies have some kind of fetish for bigger, heavier, moar tiers/models.
Posted on Reply
#73
Athlonite

Jensen Huang Tells the Media that he couldn't give two fucks about EVGA leaving the fold, and doesn't really care about consumers that much either. In fact, he'd much rather we all fucked off and left him alone to play in his own little 3D realm.

Posted on Reply
#74
Aquinus
Resident Wat-man
Well that's a silly justification for higher prices. He might as well have just said nothing.
Posted on Reply
#75
AnotherReader
THU31TSMC stated that the N5/N4 cost is about 2x higher than N7. And Samsung's 8N was cheaper than that (that was effectively a 10 nm class process).

As for your math, unfortunately it is not that simple. Process nodes have gotten so complicated, that the traditional nanometer metric does not apply anymore (with stuff like finfets, EUV, wider metal pitch and who knows what else, I do not even understand any of it). That is why they came up with those weird names.
TSMC's N5 has triple the density over Samsung's 8N. The N4 even slightly better than that. There is also a separate 4N process that NVIDIA is using here, and I do not know what the difference is compared to N4.

But AD102 has 2.7x more transistors with a slightly smaller die compared to GA102. The die size being so similar means that Ada chips are over 2x more expensive to make currently.
The cost increase is there, but their margins are another thing. As I said in another thread, maybe it is time to go back to making smaller and less power hungry dies.

Give up performance increases for one generation, instead utilize the efficiency of the new process and make the flagship model consume 250 W, sell it for $600-700 and call it a new generation of environment friendly technology. Would that not be a win in a world struggling for resources?
I think this is what AMD realized. I think they will let NVIDIA keep the pointless performance crown while they sell extremely efficient and affordable cards. I cannot wait to find out.
Higher node costs are an important factor, but Nvidia's rapacity isn't solely due to that. AD103 is 295 mm², and in 2020 the similar N5 process was 82% more expensive than the N7 process used for Big Navi, whose Navi 21 die is 520 mm². This means TSMC would charge almost the same amount to manufacture AD103 as Navi 21. So the 4080 16 GB (the AD103 card), even with good margins, could have been priced similarly to the 2020 MSRP of the 6800 XT.
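The comparison above can be checked with simple arithmetic (a sketch using only the figures quoted in the post; real wafer pricing is confidential, so the 82% premium is the cited 2020 estimate, and yield differences between the nodes are ignored):

```python
# Relative silicon cost scales roughly with die area x wafer price per
# unit area. Figures are those cited in the post, not exact pricing.
N5_PREMIUM = 1.82   # N5 wafers ~82% pricier than N7 (2020 estimate)
AD103_MM2 = 295     # AD103 on TSMC 4N (5 nm class)
NAVI21_MM2 = 520    # Navi 21 on TSMC N7

ad103_cost = AD103_MM2 * N5_PREMIUM   # in "N7 mm^2 equivalents"
navi21_cost = NAVI21_MM2 * 1.0
print(f"AD103 {ad103_cost:.0f} vs Navi 21 {navi21_cost:.0f} "
      f"-> ratio {ad103_cost / navi21_cost:.2f}")
```

The ratio comes out within a few percent of 1, supporting the "almost the same amount" claim.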
Posted on Reply