
Pat Gelsinger Repeats Observation that NVIDIA CEO "Got Lucky" with AI Industry Boom

T0@st

News Editor
Pat Gelsinger has quite bravely stepped into the belly of the beast this week. The former Intel boss was an invited guest at NVIDIA's GTC 2025 conference, currently taking place in San Jose, California. Technology news outlets have extracted key quotes from Gelsinger's musings during an in-person appearance on Acquired's "Live at GTC" video podcast. The ex-Team Blue chief has long held that NVIDIA was "extraordinarily lucky" to land its market-leading position, and yesterday's panel discussion saw him repeat that opinion: "the CPU was the king of the hill, and I applaud Jensen for his tenacity in just saying, 'No, I am not trying to build one of those; I am trying to deliver against the workload starting in graphics.' You know, it became this broader view. And then he got lucky with AI. One time I was debating with him, he said: 'No, I got really lucky with AI workload because it just demanded that type of architecture.' That is where the center of application development is (right now)."

The American businessman and electrical engineer reckons that AI hardware costs are climbing to unreasonable levels: "today, if we think about the training workload, okay, but you have to give away something much more optimized for inferencing. You know a GPU is way too expensive; I argue it is 10,000 times too expensive to fully realize what we want to do with the deployment of inferencing for AI and then, of course, what's beyond that." Despite the "failure" of a much older Intel design, Gelsinger delved into some rose-tinted nostalgia: "I had a project that was well known in the industry called Larrabee, which was trying to bridge the programmability of the CPU with a throughput-oriented architecture (of a GPU), and I think had Intel stayed on that path, you know, the future could have been different...I give Jensen a lot of credit (as) he just stayed true to that throughput computing or accelerated (vision)." With the semi-recent cancellation of its "Falcon Shores" chip design, Intel's AI GPU division is likely regrouping around its next-generation "Jaguar Shores" project; industry watchers reckon that this rack-scale platform will arrive in 2026.

Pat Gelsinger turns up at the 38-minute mark on the "Live at NVIDIA GTC with Acquired" videocast.

View at TechPowerUp Main Site | Source
 
Joined Jan 8, 2024
It felt like Intel was reacting to CUDA and the rise of GPGPU, but without any long-term plans to push the field forward. Even 10 years ago, running machine learning on GPUs was a hot topic. It just wasn't all over the place like AI is nowadays. Intel could have and should have captured a slice of the pie. Better late than never, I guess.
 
Joined Oct 5, 2024
Come on Pat, don't be a sore loser. I thought your corporate strategy was the right move for Intel, one that just came too late, but this just makes you look pathetic. Love or hate Jensen, he was planning for AI long before it became popular to use AI as a marketing move.
 
Joined Jan 11, 2022
He's correct, and Intel was in a luxury position to make a heck of a lot of mistakes.
They lost that position.

I hope they manage to recover their former glory, but that hope is very, very slim.
 
Joined Sep 17, 2014
I think it's commendable that he points out that Jensen was on the right path as well. It's not just being sore here; more of an observation, I'd say.

I mean, the numbers don't lie anyway. He's got bigger issues to worry about.
 
Joined Sep 13, 2022
It felt like Intel was reacting to CUDA and the rise of GPGPU, but without any long-term plans to push the field forward. Even 10 years ago, running machine learning on GPUs was a hot topic. It just wasn't all over the place like AI is nowadays. Intel could have and should have captured a slice of the pie. Better late than never, I guess.
Intel did the exact same thing IBM did before them: sit on their behinds thinking nothing could possibly shake them from market leadership.

It happens every time a salesman is installed as CEO of a tech company: by the time they realize their mistake and try to pass the ball to someone with an actual understanding of the products they make, it is usually too late, and the competition is so far ahead that it's not even fair to say they're "competing".
 
Joined Dec 12, 2016
Sorry, Patty Cakes, but Nvidia Tesla data center COMPUTE units were around way before AI. Your precious company stopped competing a long time ago and only ‘delivered against the workload' through anti-competitive behavior.
 
Joined Mar 16, 2017
Well of course success comes with being in the right place at the right time, but that’s not the end of it. You also have to be in a position to do something about it. Intel was also in the right place at the right time and did something about it, and then the empire faded. They are still important, but no longer the dominant industry leader.
 
Joined Aug 22, 2007
Sorry, Patty Cakes, but Nvidia Tesla data center COMPUTE units were around way before AI.
AI has been around way before NVidia existed.... ;)
 
Joined Apr 12, 2013
Right, so Intel got lucky with *dozer & the OEM bribes ~ no wait, maybe that doesn't count :slap:
If they hadn't run the company like an Amway for a decade, maybe they'd be in a better position today.
They're still doing it today! How come you need to change your mobo every 2nd gen? Yeah, and for the Intel apologists: 12th-14th gen is virtually the same uarch.
 
Joined Sep 1, 2020
Well, Pat is just crying about his huge salary, bonuses, and benefits that he was getting from Intel. He could blame himself if he wasn't too narcissistic to do so.
 
Joined Jan 14, 2019
I think he's right in his comment, but... big but...

Intel moves too slowly. Larrabee took ages to develop and never saw the light of day. A-series Arc was extremely late and came with too many problems that should have been ironed out before release. I've heard the B-series is much better, but it was still late, with only a single lower-midrange card available, while even the delayed 9070 (XT) has been out for half a month now and the 9060 and 5060 are coming soon.

So sorry, but it's not just Nvidia's luck; it's also Intel's snail's pace in the GPU space.
 
Joined Jan 12, 2023
I think he's right in his comment, but... big but...

Intel moves too slowly. Larrabee took ages to develop and never saw the light of day. A-series Arc was extremely late and came with too many problems that should have been ironed out before release. I've heard the B-series is much better, but it was still late, with only a single lower-midrange card available, while even the delayed 9070 (XT) has been out for half a month now and the 9060 and 5060 are coming soon.

So sorry, but it's not just Nvidia's luck; it's also Intel's snail's pace in the GPU space.
I think some credit is due to Nvidia for their development speed and hardware as well. Intel might've been slow on the uptake, but Nvidia went all in FAST, and it shows in the quality of their AI stack. Everyone else is playing catch-up, after all.
 
Joined Oct 5, 2024
For the last decade Intel has traded innovation for cheap H1-B labor and lost the race in the process.
I don't think Intel's lack of innovation is because they had an excess proportion of H-1B labor. It is more that Intel chose to keep profit margins high instead of reinvesting them in new lines of business: handhelds (the NUC chips were a start, but Intel never took them seriously), phones (the Atoms were a start, but Intel was never OK with tiny profit margins and rapid development efforts), consoles (Intel was never willing to accept the low margins and work intensively with Sony and Microsoft to make custom hardware), SSDs (a product type they dominated, but they were not willing to lower margins and prices to compete), etc.

All of this was in the early 2010s, well before crypto or AI. Even if Intel never made a GPU and missed the whole crypto/AI bubbles, they would have far better fab utilization rates if they had been "hungry" for new business.

They were not hungry, and to make the situation even worse, they took all those immense profits at their peak and wasted them on boondoggles like McAfee, which went absolutely nowhere, and on squandered opportunities like Altera and Mobileye, which could have been promising, but larger management issues prevented Intel from harvesting the fruits of such expensive acquisitions.

Intel was at the cutting edge of SSDs for a while, with great early-era innovations, but they abandoned the field because they were not OK with low margins. (Are you noticing a pattern here?)

All of this is separate from the massive issues they had with the infamous 10 nm node(s). That latter part is a technical issue, but everything else is the result of bad business management.
 
Joined Sep 13, 2022
I think some credit is due to Nvidia for their development speed and hardware as well. Intel might've been slow on the uptake, but Nvidia went all in FAST, and it shows in the quality of their AI stack. Everyone else is playing catch-up, after all.
nVidia has been ahead of the curve the whole time; it's just that people don't know or don't make the connection, because after so many years and repetitions it's easy to forget that the DL in DLSS stands for Deep Learning and that the Super Sampling work is done by a neural network. Now look at the dates: DLSS was announced with the RTX 20 series in September 2018, just two months after the release of GPT-1, and DLSS 1.0 was released in February 2019, five months later. What's more, nVidia invests a ton in AI research. If you watch Two Minute Papers, you'll know that about half of the papers published every week on graphics processing, light transport, etc. come from them. Here is a recent example:

 
Joined Oct 18, 2017
Sore loser. This is not luck. Jensen has been in the game for over 30 years. He understood from Moore's Law that hardware can only take you so far, and that as hardware approached its physical limitations, software would take on greater and greater importance. Jensen is as much a visionary as Steve Jobs or Bill Gates, something that Pat never was while CEO of Intel.
 