
Jensen Huang Tells the Media That Moore's Law is Dead

TheLostSwede

News Editor
NVIDIA's CEO went out on a limb during a video call with the media, claiming that Moore's Law is dead in response to questions about the high asking prices of the company's latest graphics cards. For those unfamiliar with Moore's Law, it's an observation by Intel co-founder Gordon Moore that the number of transistors in a dense integrated circuit doubles roughly every two years, while at the same time the cost of computing is halved. A corollary of this observation is that performance also doubles every two years at the same cost. That part no longer quite holds true, as all the major foundries have raised the prices of their cutting-edge nodes, and we're reaching a point where it's getting increasingly difficult to shrink process nodes in semiconductor fabs. However, Jensen Huang's statement has nothing to do with the actual node shrinks, which makes it a bit flawed.
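The doubling half of the observation is easy to sketch in a few lines. The function name and starting figures below are illustrative, not taken from the article:

```python
def projected_transistors(start_count, years, doubling_period=2):
    """Project a transistor count forward, assuming Moore's Law-style
    doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Example: the Intel 4004 (1971) had about 2,300 transistors.
# Projected 40 years ahead, the doubling model predicts:
print(projected_transistors(2_300, 40))  # ~2.4 billion
```

That projection lands remarkably close to the transistor counts of actual high-end chips from the early 2010s, which is why the observation held the status of a "law" for so long; it's the cost half of the claim, not the density half, that Jensen is declaring dead.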

Jensen's focus seems to be on the latter half of Moore's Law, the part about semiconductors getting cheaper, which in turn makes computers cheaper. However, this hasn't been true for some time now, and Jensen's argument is that NVIDIA's cost of making semiconductors has gone up. He is quoted as saying: "A 12-inch wafer is a lot more expensive today than it was yesterday, and it's not a little bit more expensive, it is a ton more expensive. Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past." What he actually meant is that we shouldn't expect semiconductors to be as cheap as they've been in the past, although part of NVIDIA's problem is that its products have to be produced on cutting-edge nodes, which cost significantly more than mature ones. It'll be interesting to see if AMD can deliver graphics chips and cards at a more competitive price point than NVIDIA, as that would refute some of Jensen's claims.



View at TechPowerUp Main Site | Source
 
That's why AMD is using chiplet design on RDNA3. GET SKILL LEATHER JACKET BOYS WOOOOO welcome to the thunderdome Nvidia, get skill yeeee bois

and I bet RDNA3 matches if not beats rtx 4xxx series in raw fps and undercuts costs at same time. we will find out soon. Nvidia. lol.
 
So Jensen is saying that the 4090 doesn't have the 2.5x performance of the 3090 they bragged about days ago.
The insane profits of the crypto boom and shortage have clearly warped this man's grip on reality.
 
Hey, remember all those people laughing at and bashing Intel's GPU development, saying they should just throw in the towel... probably want Intel to keep at it now, huh?
 
Saying Moore's Law is dead is just a way to justify the expensive 4000 series prices. Nothing more. In a few months, we'll see what AMD's response will be, and whether Moore's Law only applies to next-gen Nvidia cards or not.
 
Jensen you can suck my ... :kookoo: AMD has just slashed RX 6000 prices. I was really tempted to get the 4080, but priced at $1200 = 1500€ in Europe, it's pure insanity. Huang, you can keep your expensive toys, I'm gonna wait for Navi 3. I hope Navi 33 and 32 get more down-to-Earth pricing. If not, I'll get a used 3070 or 3080 and call it a day for two years.
 
Seeing Jen playing with the dinos as a tiny action figure at the bottom of your article is surprisingly amusing...
 
Translation: a 12-inch wafer is a lot more expensive, so they're making more money, and we're gonna need to make even more money, which is why RTX 40 is so expensive! And consumers, shut up and give me your money!
LOLLLLLL
Btw, I think the best chance in years for AMD to flip the game has come. Every condition has been set, and I hope AMD doesn't waste this opportunity. I think it's pretty easy: just price RX 7000 reasonably and it'll be a win.
 
Nvidia trying to clear excess 3000 series inventory. Regardless, it's down to the consumer if they want to pay these prices. If people don't want to or can't, Nvidia will have poor sales and be forced to adapt.
 
Nvidia trying to clear excess 3000 series inventory. Regardless, it's down to the consumer if they want to pay these prices. If people don't want to or can't, Nvidia will have poor sales and be forced to adapt.
Consumers, be strong!
Bastard Huang is now bullying consumers after he's hurt nv partners and everybody else.
 
It's partly true (smaller process nodes are more expensive), but it's also partly nonsense and a whole lot of excuse-making for the lack of decent lower-end GPUs. Let's take the most popular sub-£300 market:

GTX 1660S (2019) = £199. 120w. 284mm2 die size (12nm), 6.6bn transistors.
RTX 3050 (2022) = £299. 130w. 276mm2 die size (8nm), 12bn transistors.
RX 6600 (2021) = £270. 120w. 237mm2 die size (7nm), 11bn transistors.

Basically, the £270 AMD 2021 GPU shows a 4x larger performance gain (+28%) over the £300 NVidia 2022 GPU than that £300 NVidia 2022 GPU does (+7%) over the £200 NVidia 2019 GPU, and with a smaller die, 1bn fewer transistors and a lower price. NVidia's problem there isn't "Moore's Law", it's an attitude of "no, we won't make £300 2022 GPUs any faster than £200 2019 GPUs, why on Earth would you want that? Here, have this 64-bit GTX 1630 that's slower than our 2016 1050 Ti, made from leftover dies we didn't sell to miners..." That's simply an intelligence-insulting ripoff, and blaming 'the laws of physics' as the "limitation" behind having no decent low-end GPUs to show after 3 years of sitting on their behinds doing nothing is not very believable. Personally, I'm waiting to see what the low-to-mid tiers of the RTX 4050-4060 / RX 7000 series will bring. Either nVidia gets it together, refocuses on significantly improving 4050 / 4060 core rasterization performance and efficiency at a sane price and corrects its current "direction", or they can GTFO.
 
Sounds like an excuse to me. There seems to be a lot of reliance on foundries delivering better processes so they can just cram in more transistors, rather than improving the architecture itself. This is why we had about 57 *lake generations. Even with the increased density from better manufacturing processes, they just keep making bigger and bigger chips. Guess they're having issues obtaining the next set of design specs from the aliens or something...
 
The sad part is that some thought Moore's Law was a scientific law like gravity. That's how Intel wanted us to think. Instead, it's just a marketing and sales goal that no one should have repeated outside of meetings with people like 'Jan the Man.'
 
Sounds like an excuse to me. There seems to be a lot of reliance on foundries delivering better processes so they can just cram in more transistors, rather than improving the architecture itself. This is why we had about 57 *lake generations. Even with the increased density from better manufacturing processes, they just keep making bigger and bigger chips. Guess they're having issues obtaining the next set of design specs from the aliens or something...
Well, it seems like it's hard to innovate on the chip design side, so all these companies rely more and more on node shrinks, which in turn makes for more expensive chips, as the foundries want to recoup their investments within a set period of time.

The sad part is that some thought Moore's Law was a scientific law like gravity. That's how Intel wanted us to think. Instead, it's just a marketing and sales goal that no one should have repeated outside of meetings with people like 'Jan the Man.'
Well, it remained true for a very long time, but we already reached something of an end to it a few years ago.
That said, it's just something "we" made up; it's no universal law by any means.
 
Everyone keeps saying "AMD has a chance to grab market share", but that's not going to happen... because even when AMD offers better value, everyone still buys Nvidia (say what you will about AdoredTV, but a year or so ago he did a lot of research into this phenomenon and used the late 2000s and early 2010s as an example in a multi-part video on the subject: AMD offered faster cards at better value, and Nvidia still outsold them by an order of magnitude). What people really mean is that they want AMD to drive prices down so they can buy an Nvidia card cheaper, but that'll never happen unless people actually buy the AMD cards!

People keep complaining about Nvidia and their prices (which will unfortunately force AMD to follow suit to please their shareholders), but as long as those same people keep buying Nvidia products, not only will prices not go down, they'll keep going up.
 
I think I will just avoid the Christmas Rush and start hating Nvidia right now.
 
Everyone keeps saying "AMD has a chance to grab market share", but that's not going to happen... because even when AMD offers better value, everyone still buys Nvidia (say what you will about AdoredTV, but a year or so ago he did a lot of research into this phenomenon and used the late 2000s and early 2010s as an example in a multi-part video on the subject: AMD offered faster cards at better value, and Nvidia still outsold them by an order of magnitude). What people really mean is that they want AMD to drive prices down so they can buy an Nvidia card cheaper, but that'll never happen unless people actually buy the AMD cards!

People keep complaining about Nvidia and their prices (which will unfortunately force AMD to follow suit to please their shareholders), but as long as those same people keep buying Nvidia products, not only will prices not go down, they'll keep going up.
I'd be more than happy to jump ship. I've owned plenty of ATI cards over the years, but I don't think I've actually owned an AMD card.
I regret not getting a 6800 XT when they were selling at MSRP, but I didn't really have the money at the time.
 
Some of the craziest shit he's said.
As if this stuff can't evolve any further; we only just got EUV and haven't seen its limits, or exotic substrates, yet.
 
Well, one more reason to be disappointed.
 
He simply treats us badly because he is insanely wealthy (a very rich capitalist) and thinks that only God is above him.
He has lost touch with reality. He doesn't know that we are in multiple crises; he doesn't know that high inflation is causing many people to stop consuming.
He simply lives in his own virtual reality.
 
Everyone keeps saying "AMD has a chance to grab market share", but that's not going to happen... because even when AMD offers better value, everyone still buys Nvidia (say what you will about AdoredTV, but a year or so ago he did a lot of research into this phenomenon and used the late 2000s and early 2010s as an example in a multi-part video on the subject: AMD offered faster cards at better value, and Nvidia still outsold them by an order of magnitude). What people really mean is that they want AMD to drive prices down so they can buy an Nvidia card cheaper, but that'll never happen unless people actually buy the AMD cards!

People keep complaining about Nvidia and their prices (which will unfortunately force AMD to follow suit to please their shareholders), but as long as those same people keep buying Nvidia products, not only will prices not go down, they'll keep going up.
Truth has been spoken.

But I'm a happy owner of a 10GB 6700, which I bought two months ago for 375 bucks.

Waiting for competition from AMD!
 