Thursday, December 21st 2023

Intel Should be Leading the AI Hardware Market: Pat Gelsinger on NVIDIA Getting "Extraordinarily Lucky"

Intel CEO Pat Gelsinger considers NVIDIA "extraordinarily lucky" to be leading the AI hardware industry. In a recent public discussion with students of MIT's engineering school on the state of the semiconductor industry, Gelsinger said that Intel should be the one leading AI, but that NVIDIA got lucky instead. We respectfully disagree. What Gelsinger glosses over with this train of thought is how NVIDIA got here. What NVIDIA has in 2023 is the distinction of being one of the hottest tech stocks behind Apple, the highest market share in a crucial hardware resource driving the AI revolution, and of course the little things, like leadership of the gaming GPU market. What it doesn't have is access to the x86 processor IP.

NVIDIA has long aspired to be a CPU company, from its rumored attempt to merge with AMD in the early/mid 2000s, to its stint with smartphone application processors with Tegra, an assortment of Arm-based products along the way, and most recently, its spectacularly unsuccessful attempt to acquire Arm from SoftBank. Despite limited success in the CPU industry, never quite leveling up to Intel, AMD, or even Qualcomm and MediaTek, NVIDIA never lost sight of its goal to be a compute hardware superpower, which is why, in our opinion, it owns the AI hardware market. NVIDIA isn't lucky; it spent 16 years getting here.
NVIDIA's journey to AI hardware leadership began back in the late 2000s, when it saw the potential for the GPU to be a general-purpose processor, since programmable shaders essentially made the GPU a many-core processor with a small amount of fixed-function raster hardware on the side. The vast majority of an NVIDIA GPU's die area is made up of streaming multiprocessors—the GPU's programmable SIMD muscle.
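To give a sense of what that means in practice, here is a minimal sketch of our own (not NVIDIA sample code) of the programming model those programmable shaders unlocked: one scalar kernel, replicated across thousands of threads that the hardware schedules onto the streaming multiprocessors. The kernel and launch parameters are purely illustrative.

#include <cstdio>
#include <cuda_runtime.h>

// One scalar kernel, replicated across thousands of threads; the hardware
// schedules them in 32-thread warps onto the streaming multiprocessors.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x; // unique thread index
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x, *y;
    // Unified memory keeps the sketch short; real code might stage copies.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Cover all n elements with 256-thread blocks.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %.1f\n", y[0]); // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}

The fixed-function raster hardware sits off to the side; everything above runs on the same SIMD arrays that once only shaded pixels.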

NVIDIA's primordial attempts to break into the HPC market with its GPUs bore fruit with its "Tesla" GPU and the Compute Unified Device Architecture, or CUDA. NVIDIA's unique software stack that lets developers build and accelerate applications on its hardware dates all the way back to 2007. CUDA set in motion a long and exhausting journey leading up to NVIDIA's first bets on accelerated AI on its GPUs a decade later, beginning with "Volta." NVIDIA realized that despite the vast number of CUDA cores on its GPUs and HPC processors, it needed some fixed-function hardware to speed up building, training, and running inference on deep neural networks, and so it developed the Tensor core.
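For a sense of what that fixed-function hardware changed for programmers, here is another minimal, illustrative sketch of ours, using the warp-level matrix-multiply API (nvcuda::wmma) that CUDA exposed alongside "Volta": one warp offloads an entire 16x16x16 matrix multiply-accumulate, the primitive at the heart of neural-network training and inference, to a Tensor core.

#include <cuda_fp16.h>
#include <mma.h>
using namespace nvcuda;

// One warp computes D = A*B + C on 16x16 half-precision tiles using the
// Tensor cores (requires Volta or newer, e.g. nvcc -arch=sm_70).
__global__ void tensor_core_tile(const half *a, const half *b, float *d)
{
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc;

    wmma::fill_fragment(acc, 0.0f);            // start with C = 0
    wmma::load_matrix_sync(a_frag, a, 16);     // load a 16x16 tile of A
    wmma::load_matrix_sync(b_frag, b, 16);     // load a 16x16 tile of B
    wmma::mma_sync(acc, a_frag, b_frag, acc);  // one Tensor-core MMA op
    wmma::store_matrix_sync(d, acc, 16, wmma::mem_row_major);
}

// Launched with a single warp: tensor_core_tile<<<1, 32>>>(a, b, d);

The point of the dedicated unit is that the multiply-accumulate above executes as a single hardware operation per warp, instead of being emulated across hundreds of individual CUDA-core instructions.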

In all this time, Intel continued to behave like a CPU company and not a compute company—the majority of its revenue came from client CPUs, followed by server CPUs, and it consistently held accelerators at a lower priority. Even as Tesla and CUDA took off in 2007, Intel had its first blueprints for a SIMD accelerator, codenamed "Larrabee," as early as 2008. The company never accorded Larrabee the focus it needed as a nascent hardware technology. But that's on Intel. AMD has been a CPU + GPU company since its acquisition of ATI in 2006, and has tried to play catch-up with NVIDIA by combining its Stream compute architecture with open compute software technologies. The reason AMD's Instinct CDNA processors aren't as successful as NVIDIA's A100 and H100 processors is the same reason Intel never stood a chance in this market with its "Ponte Vecchio"—both were slow to market, and neither company nurtured an ecosystem around its silicon quite like NVIDIA did.

Hardware is only a fraction of NVIDIA's growth story—the company has an enormous, top-down software stack, including its own programming language, APIs, and prebuilt compute and AI models, and a thriving ecosystem of independent developers and ISVs that it has nurtured over the years. So by the time AI took off at scale as a revolution in computing, NVIDIA was ready with the fastest hardware and the largest community of developers that could put it to use. We began this editorial by noting that NVIDIA never got access to the x86 processor IP, and that turned out to be a good thing. With no CPU business to fall back on, it could switch gears and look inward at the one thing it was already making that could crunch numbers at scale—GPUs with programmable shaders. What NVIDIA is extraordinarily lucky about is that it didn't get stuck with an x86 license.

You can watch Pat Gelsinger's interview over at MIT's YouTube channel, here.
Source: ExtremeTech

55 Comments on Intel Should be Leading the AI Hardware Market: Pat Gelsinger on NVIDIA Getting "Extraordinarily Lucky"

#26
ThrashZone
Hi,
Yeah, one got a lot of bucks off miners "a few times"; you can guess which one got that easy money to blow on AI :laugh:
Intel was too busy on ++++++ refreshes every 6 months.
Posted on Reply
#27
JohH
Another hit piece against Gelsinger?
If you want to see what he actually has to say about how Intel messed it up, watch from 13:00 to 20:50 here:

Summarized: Intel was working on massively parallel compute when he left. After he left, Intel more or less killed that project (only Xeon Phi continued). Gelsinger considers that the mistake. It would have been an expensive 10-year project, and not doing it was Intel's undoing. Now Intel is trying to catch up, but with far more limited resources.
Posted on Reply
#28
Gooigi's Ex
Pat, just shut the f$&k up and make good products that are competitive instead of s!$t posting. It seems like that's all you can do.
"AMD is using glue for their chips. Snake oil," "WE SHOULD BE THE LEADER OF AI, Nvidia is extraordinarily LUCKY." LUCKY!?!? Pfffffffahhahahahahahahahaha. As much as I give s$&t to Nvidia, luck is definitely not it LOL
Posted on Reply
#29
Onasi
Intel is starting to look increasingly unhinged while the competition is ramping up. This is an embarrassing look for such an old and respected company. Just say that missteps were made and that you are committing your vast resources to doing better, catching up, and overtaking other players instead of… whatever this is. You are looking like that guy who gets repeatedly farmed in a game and, instead of sucking it up and improving, starts ranting about lag, poor balance, cheaters, a cat on the keyboard, a mouse running out of battery despite being wired, and so on.
Posted on Reply
#30
oxidized
And why exactly should I listen to the guy who etched his initials on processors?
Posted on Reply
#31
Squared
john_"Nvidia is a software company that also happens to build the hardware where that software will run"
I did hear a long time ago that Intel does employ more software engineers than hardware engineers. I think all the major chipmakers invest a lot into software, but obviously Nvidia has funneled a lot into AI.

Intel and AMD software also tends to be free, open source, and easy to port to other platforms. I wonder if Nvidia's accountants are a lot more willing to spend on software knowing that everyone who uses it will have to buy Nvidia hardware. We criticize Nvidia a lot for proprietary software, but Nvidia created adaptive-sync displays, AI upscaling, and other cool things. Then again, AMD gave us Mantle, which became Vulkan and led to DirectX 12.
Posted on Reply
#32
sephiroth117
Nvidia invested in areas way before other companies; we had DLSS when ChatGPT and Midjourney were non-existent and AI was really not that widespread.
Nvidia switched to being a service/software company in addition to hardware, too. Clairvoyance.
Same for cloud and HPC: Nvidia invested heavily in those areas, and early on.

Intel is in very good financial shape and its foundry business is really strategic. The USA and EU are very interested in getting as many domestic foundries as possible.
Now they are investing at the same time in foundries, GPUs, cloud, HPC, AI/ML... and catching up on CPU efficiency. That's the issue, I feel: while Intel really had the monopoly and supremacy, it under-invested and lacked clairvoyance; others caught up, and it left Intel weakened in many strategic areas where it could have had a much bigger footprint, true.
Posted on Reply
#33
Darmok N Jalad
Honestly, it is a bit of luck for any one of these companies to get where they are. The rest, the overwhelming majority of it, is having a solid plan and executing extremely well. We've seen it before where AMD, Intel, or NVIDIA aimed a future product at a market but failed miserably due to (1) missing design targets, (2) underestimating the competition, or (3) simply producing a sucktacular design. Look at Arc. Had it arrived remotely on time and had decent driver support on day one, we'd be taking Intel way more seriously in the GPU segment. One could say Intel was "unlucky" at several steps in product development, but the reality is, its competitors were in a far better position to deliver a better product and thoroughly pummeled Arc before it even launched. NVIDIA is simply beating Intel at its own game through really good product designs, but, more than that, by cornering the market through "partnering" with those making the software. They did it in gaming, and they're doing it again in AI.

Spin it however you want, Pat, but NVIDIA got your playbook and ran the plays way better than you did.
Posted on Reply
#34
samum
Intel has only one product: its stock price. They have been dismembering and selling the company piece by piece to make the next quarterly/annual report look good.

Nvidia executes Jensen's vision with hardware and software, and the stock price is a byproduct.
Posted on Reply
#35
Chaitanya
Ferrum Master: Each article I see him in, I cannot decide whether he is delusional, mad, a clown, or just plain blind.
Or all of the above.

Posted on Reply
#36
c2DDragon
I guess he means they are "lucky" because they sent tons of hardware to China (a long time before the US restrictions, hey, remember the time we couldn't get anything, guys? covid my ass)... but fanboys see what they want to see x)
Posted on Reply
#37
Minus Infinity
The amount of sheer nonsense that Gelsinger utters is frightening. Hyperbole like "AMD is in our rearview" and "Nvidia were just lucky" is insulting to himself, his company, and consumers. It sounds like sour grapes over his sacking, and the idea that Larrabee would have been the beginning of Intel world domination is laughable.
Posted on Reply
#38
_Flare
Sad, how Intel plays down its own mistakes and hubris and dares to call Nvidia "lucky".
Intel itself had, and still has, a wide backbone of kinda pro-Intel ISVs, media, business and end-user suppliers, and ecosystems, but those leftovers are shrinking.
Those were heavy weapons to sandbag AMD in the x86 space, and some are still in play today, like a shield for the remaining pro-Intel ecosystems, if I may call them that.
That behavior was 100% the Intel mindset for many years... and whoosh... there is CUDA and Ryzen... Intel must have been thinking they were invincible, untouchable or something.
But no, sorry... the others were only lucky. :kookoo:
Posted on Reply
#40
brian111
JohH: Another hit piece against Gelsinger?
If you want to see what he actually has to say about how Intel messed it up, watch from 13:00 to 20:50 here:

Summarized: Intel was working on massively parallel compute when he left. After he left, Intel more or less killed that project (only Xeon Phi continued). Gelsinger considers that the mistake. It would have been an expensive 10-year project, and not doing it was Intel's undoing. Now Intel is trying to catch up, but with far more limited resources.

Nvidia's luck, as it is being called, was what Jensen Huang said to Gelsinger himself.
Yes, at least in the section you point out, Gelsinger is a bit more measured than this piece makes him out to be. While he speaks with a lot of confidence in general, he does give credit to Jensen and Nvidia for sticking with their plan to reach the position they're in.
Posted on Reply
#41
Dr. Dro
IMHO: the thing with NVIDIA is that as a company, it has never lost sight of what it stands for, what it does and what its objective is. This allows them to create high-quality products that will give excellent experiences to their customers of each and every segment that the company operates within, and allows them to raise their ASP accordingly. More recently, this has also elevated the status of their brand, not entirely unlike Apple did and for the same reasons.

Both Intel and AMD, its greatest rivals, routinely lose sight of this. Intel relies on its tick-tock cadence, and if that fails, the entire business goes belly up (see 14nm++++ and, more recently, the quote-unquote "14th gen" BS they have come up with). AMD has an honest-to-God corporate culture problem, which is why no matter how great a product it seems to create, it will always be the second option: either because the software falters (and would you believe me if I told you their devs are not idiots? They just have their hands bound by the suits; this is why I am so fed up with AMD!) or because marketing oversold it, and at the wrong time as well.
Posted on Reply
#42
Patriot
phanbuey: Yeah they did, repeatedly and purposefully, over 20 years of development... They have really good luck.

www.techpowerup.com/61702/university-of-antwerp-makes-4k-eur-supercomputer-with-four-geforce-9800-gx2-cards

All of that lucky BrookGPU -> CUDA -> cuDNN roadmap that took decades to execute on.

They even have the marketing slides.
They had a vision for hardware acceleration, just not for AI; they stole that from Google, who launched the TPU in 2016, and Nvidia added Tensor cores to its GPUs in 2017.
They hadn't been doing acceleration targeted at neural nets before Google made TensorFlow.

It's pretty basic to be trying to create demand for your accelerators.
Posted on Reply
#43
remixedcat
ThrashZone: Hi,
Yeah, one got a lot of bucks off miners "a few times"; you can guess which one got that easy money to blow on AI :laugh:
Intel was too busy on ++++++ refreshes every 6 months.
Ivy Bridge refresh coming Q2 '25
Posted on Reply
#44
wolf
Performance Enthusiast
Good editorial and an interesting read.
Posted on Reply
#45
Pepamami
How so? It feels like Intel has done nothing special since the Sandy Bridge release. In the next few years I bet they're gonna lose the SSD market, and a few years later they'll probably shut down their Arc GPUs.
Posted on Reply
#46
Onasi
Pepamami: How so? It feels like Intel has done nothing special since the Sandy Bridge release. In the next few years I bet they're gonna lose the SSD market, and a few years later they'll probably shut down their Arc GPUs.
Uh, they sold their SSD business a couple years back already. And then shut down Optane (which they initially kept) last year too.
Posted on Reply
#47
Sabotaged_Enigma
Despite how much I hate Nvidia, reading those words from a CEO I'm like, "has he lost his mind, gone crazy, or what?"
Should I say that Intel "was lucky enough to lead the fab process before 14 nm"???
Posted on Reply
#48
Hyderz
Okay, come on Intel, show us what you can do in the AI segment.
Posted on Reply
#49
mama
I agree. Nvidia has all the luck.
Posted on Reply
#50
KLMR
Intel had bad luck when it launched its GPU lineup.
Posted on Reply