Wednesday, March 19th 2025

Pat Gelsinger Repeats Observation that NVIDIA CEO "Got Lucky" with AI Industry Boom
Pat Gelsinger has quite bravely stepped into the belly of the beast this week. The former Intel boss was an invited guest at NVIDIA's GTC 2025 conference, currently taking place in San Jose, California. Technology news outlets have extracted key quotes from Gelsinger's musings during an in-person appearance on Acquired's "Live at GTC" video podcast. In the past, the ex-Team Blue chief has expressed the belief that NVIDIA was "extraordinarily lucky" to attain its market-leading position. Yesterday's panel discussion saw Gelsinger repeat that long-held opinion: "the CPU was the king of the hill, and I applaud Jensen for his tenacity in just saying, 'No, I am not trying to build one of those; I am trying to deliver against the workload starting in graphics.' You know, it became this broader view. And then he got lucky with AI, and one time I was debating with him, he said: 'No, I got really lucky with AI workload because it just demanded that type of architecture.' That is where the center of application development is (right now)."
The American businessman and electrical engineer reckons that AI hardware costs are climbing to unreasonable levels: "today, if we think about the training workload, okay, but you have to give away something much more optimized for inferencing. You know a GPU is way too expensive; I argue it is 10,000 times too expensive to fully realize what we want to do with the deployment of inferencing for AI and then, of course, what's beyond that." Despite the "failure" of a much older Intel design, Gelsinger delved into some rose-tinted nostalgia: "I had a project that was well known in the industry called Larrabee and which was trying to bridge the programmability of the CPU with a throughput oriented architecture (of a GPU), and I think had Intel stay on that path, you know, the future could have been different...I give Jensen a lot of credit (as) he just stayed true to that throughput computing or accelerated (vision)." Following the recent cancellation of the "Falcon Shores" chip design, Intel's AI GPU division is likely regrouping around its next-generation "Jaguar Shores" project; industry watchdogs reckon that this rack-scale platform will arrive in 2026. Pat Gelsinger turns up at the 38-minute mark on the "Live at NVIDIA GTC with Acquired" videocast.
Sources:
NVIDIA YouTube Channel, Tom's Hardware, Wccftech, DigiTimes Asia
Comments
They lost that position.
I hope they manage to recover their former glory, but that hope is very, very slim.
I mean, the numbers don't lie anyway. He's got bigger issues to worry about.
Yeah, Intel leadership back in the 2010s was just terrible! They completely wasted their market dominance.
It happens every time someone installs a salesman as CEO of a tech company: by the time they realize their mistake and try to pass the ball to someone with an actual understanding of the products they make, it is usually too late, and the competition is so far ahead that it's not even fair to say they're "competing".
Intel moves too slowly. Larrabee took ages to develop and never saw the light of day. A-series Arc was extremely late, and came with too many problems that should have been ironed out before release. I've heard B-series is much better, but it was still late, with only a single lower-midrange card available, while even the delayed 9070 (XT) has been out for half a month now, and the 9060 and 5060 are coming soon.
So sorry, but it's not just Nvidia's luck, it's also Intel's snail's pace in the GPU space.
All of this was in the early 2010s, well before crypto or AI. Even if Intel had never made a GPU and missed the whole crypto/AI bubble entirely, they would have far better fab utilization rates today if they had been "hungry" for new business.
They were not hungry, and to make the situation even worse, they took all those immense profits at their peak and wasted them on boondoggles like McAfee, which went absolutely nowhere. They also squandered opportunities like Altera and Mobileye, which could have been promising, but larger management issues prevented Intel from ever harvesting the fruits of such expensive acquisitions.
Intel was at the cutting edge of SSDs for a while, with great early-era innovations, but they abandoned that field because they were not OK with low margins (are you noticing a pattern here?).
All of this is separate from the massive issues they had with the infamous 10 nm node(s). The latter is a technical problem, but everything else is the result of bad business management.