Monday, December 5th 2022
AMD Still Believes in Moore's Law, Unlike NVIDIA
Back in September, NVIDIA's Jensen Huang said that Moore's Law is dead, but it seems AMD disagrees, at least for now. According to an interview with AMD CTO Mark Papermaster, AMD still believes Moore's Law has another six to eight years of life left. However, AMD no longer believes that transistor density can be doubled every 18 to 24 months while remaining in the same cost envelope. "I can see exciting new transistor technology for the next - as far as you can really plot these things out - about six to eight years, and it's very, very clear to me the advances that we're going to make to keep improving the transistor technology, but they're more expensive," Papermaster said.
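To put rough numbers on that framing (an illustration, not figures from the interview): at the classic cadence, density compounds as

\[ N(t) = N_0 \cdot 2^{t/T}, \qquad T \approx 1.5\text{--}2~\text{years}, \]

so a six-to-eight-year horizon at T = 2 years still yields three to four doublings, i.e. 8x to 16x the transistors. Papermaster's caveat concerns the other half of the bargain: cost per transistor, roughly wafer cost divided by good transistors per wafer, no longer halves with each node because wafer cost now climbs almost as fast as density.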
AMD believes we'll see a change in how chips are designed and put together, with chiplets being the future of semiconductors. Papermaster calls this "a Moore's Law equivalent, meaning that you continue to really double that capability every 18 to 24 months," although it's not exactly Moore's Law in the traditional sense. AMD also appears to be betting heavily on FPGA technology in some of its market segments, for something the company calls adaptive computing. As to how things will play out, time will tell, but with both AMD and Intel going down the chiplet route, albeit in slightly different ways, we should continue to see new innovations from both companies, with or without Moore's Law.
Source: The Register
Comments
Just seems weird to me.
I think the future will be a break from silicon or from the traditional processes anyway, like using a metric besides nm. Quantum is too far off. We are more likely to see slotted CPUs or a breakthrough in the current process/elements before that kind of thing is common, even if it's the logical evolution.
www.cerebras.net/
Chiplets are a last resort, and definitely only a temporary solution.
Going further, there are two solutions:
1. Faster software
2. The cloud, but it will require faster internet connection speeds
However, it's no longer what it once was.
Imagine running a 64-bit Windows 95 on a Ryzen 9 5950X. How fast would it be?
Many of the slowdowns are artificially added planned-obsolescence shenanigans by M$, which asks you to pay more and helps sell new hardware.
It's all a conspiracy.
Silicon has come to the end of its useful life as a semiconductor, and we don't yet have something to replace it (at least, not something as cheap or simple).
We have been stuck at the same socket size for how long now? In the meantime, parts of the whole system have moved into the CPU, and motherboards clearly have the capacity to shrink to lots of sizes for lots of use cases. There's a lot of free space there.
Also, larger chips have more wiggle room in terms of temperature. Temperature-regulated activation or load regulation of core complexes? Sure... AMD is already moving to a fixed peak temperature with the current gen, so they know how to extract performance while adhering to a strict temperature limit; it's like a perfected form of throttling (a toy sketch follows this post).
Larger chips do require changes in the economics of the whole fabrication/production process and its cost. The first thing that comes to mind is longer product lifetimes. Pay more, use longer, because honestly, the e-waste we create now is absurd if you look at what the chip is capable of.
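As a toy illustration of that "throttling as the steady state" idea from the post above (the numbers and the control loop are assumptions for the sketch, not AMD's actual boost algorithm):

```python
# Toy temperature-target boost controller: instead of running at a fixed
# clock and throttling in emergencies, hold the die at a temperature
# target and let the clock float. All constants are illustrative.

T_TARGET = 95.0          # fixed peak die temperature in C (assumed)
F_MIN, F_MAX = 3.0, 5.5  # allowed clock range in GHz (assumed)
GAIN = 0.05              # proportional gain, GHz per degree C (assumed)

def next_clock(clock_ghz: float, die_temp_c: float) -> float:
    """Nudge the clock so the die settles at T_TARGET.

    Below target: spend the thermal headroom on frequency.
    Above target: back off, with throttling as the steady state
    rather than an emergency stop.
    """
    adjusted = clock_ghz + GAIN * (T_TARGET - die_temp_c)
    return max(F_MIN, min(F_MAX, adjusted))

# A cool die gets boosted; a hot one gets reined in.
print(next_clock(4.5, 80.0))  # 15 C of headroom -> rises to 5.25
print(next_clock(5.5, 99.0))  # 4 C over target -> drops to 5.30
```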
As for size, I kind of wonder how big they can get away with; I almost wonder if they will run into odd latency problems with super big dies if they go that route (rough numbers sketched after this post). Maybe at that point it would just be creative positioning of the chiplets; they already do something similar within the architecture itself, so it would really just be doing it on a bigger scale.
Thinking about it more, given the already existing chiplet design, they might even be entertaining that kind of thinking already.
Interesting times ahead for sure.
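A rough back-of-envelope on that latency worry (assumed, optimistic numbers): even if a signal crossed a reticle-sized 33 mm die at half the speed of light,

\[ t = \frac{33~\text{mm}}{0.5 \times 3\times 10^{8}~\text{m/s}} \approx 0.22~\text{ns}, \]

which is already about one full cycle at 4.5 GHz; real RC-limited on-die wires are considerably slower, so multi-cycle cross-die hops on very large dies are a plausible concern.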
I think wafer cost, and obviously max reticle size, put limits on the size of individual chiplets. The substrate does too, especially if silicon becomes more important there and ends up being the limiter of package sizes.
I agree, interesting times ahead.
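For reference, the standard lithography reticle field is 26 mm by 33 mm, which caps any single die or chiplet at

\[ 26~\text{mm} \times 33~\text{mm} = 858~\text{mm}^2. \]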
It's not really a conspiracy. It's human nature to be lazier with more resources. That said, modern OSes do way, way more than 9x could even dream of (heck, OS/2 Warp 4 kills 9x there). As for the cloud, it's a buzzword for "putting something on a server somewhere else."
Killing the whole concept would kill the internet as we know it. But the buzzword status it has gained isn't helping anyone's understanding of it, that's for sure.
I'm thinking of one of those popular memes but I'm drawing a blank.
As far as costs go... TSMC did raise prices, and Nvidia no longer gets the repeat-customer discount from TSMC.
Everybody focuses on Huang's first statement: "Moore's Law's dead."
He continued with another statement: "And the ability for Moore's Law to deliver twice the performance at the same cost, or at the same performance, half the cost, every year and a half, is over."
But that second statement doesn't automatically invalidate Moore's Law, because the original law doesn't make any observations on cost:
"Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years."
Of course you can say that lowering cost is what enables this doubling every two years, but you can't just change what the "law" actually states and then moan that it doesn't hold any more.
And we all know what he was actually saying with this: that the price uplift from the RTX 3080 at $699 to the RTX 4080 at $1199 is somehow the new law, deal with it.
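Writing the two statements out side by side makes the gap explicit (a formalization for comparison, not a quote): the original observation is only

\[ N(t) = N_0 \cdot 2^{t/2~\text{years}}, \]

with no cost term at all, whereas Huang's version adds an economic clause: performance per dollar doubling every year and a half. As for the price jab,

\[ \frac{\$1199}{\$699} \approx 1.72, \]

a roughly 72% generational price increase.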
That pace has slowed down, so in its original form it has been dead for years.
If one reads "it's not dead yet" as "we are still seeing progress at a slower, but still steady pace", then, well, it's not dead.