
Editorial Intel Should be Leading the AI Hardware Market: Pat Gelsinger on NVIDIA Getting "Extraordinarily Lucky"

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,217 (7.55/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Intel CEO Pat Gelsinger considers NVIDIA "extraordinarily lucky" to be leading the AI hardware industry. In a recent public discussion with students of MIT's engineering school on the state of the semiconductor industry, Gelsinger said that Intel should be the one leading AI, but that NVIDIA got lucky instead. We respectfully disagree. What Gelsinger glosses over with this train of thought is how NVIDIA got here. What NVIDIA has in 2023 is the distinction of being one of the hottest tech stocks behind Apple, the highest market share in a crucial hardware resource driving the AI revolution, and of course the little things, like leadership of the gaming GPU market. What it doesn't have is access to the x86 processor IP.

NVIDIA has long aspired to be a CPU company, from its rumored attempt to merge with AMD in the early-to-mid 2000s, to its stint with smartphone application processors with Tegra, an assortment of Arm-based products along the way, and most recently, its spectacularly unsuccessful attempt to acquire Arm from SoftBank. Despite limited success in its attempts to level up to Intel, AMD, or even Qualcomm and MediaTek in the CPU industry, NVIDIA never lost sight of its goal to be a compute hardware superpower, which is why, in our opinion, it owns the AI hardware market. NVIDIA isn't lucky; it spent 16 years getting here.



NVIDIA's journey to AI hardware leadership began in the late 2000s, when it saw the potential for the GPU to be a general-purpose processor: programmable shaders had essentially made the GPU a many-core processor with a small amount of fixed-function raster hardware on the side. The vast majority of an NVIDIA GPU's die area is made up of streaming multiprocessors—the GPU's programmable SIMD muscle.

NVIDIA's primordial attempts to break into the HPC market with its GPUs bore fruit with its "Tesla" GPU and the Compute Unified Device Architecture, or CUDA. This software stack, which lets developers build and accelerate applications on NVIDIA hardware, dates all the way back to 2007. CUDA set in motion a long and exhaustive journey leading up to NVIDIA's first bets on accelerated AI a decade later, beginning with "Volta." NVIDIA realized that despite the vast number of CUDA cores on its GPUs and HPC processors, it needed some fixed-function hardware to speed up deep-learning neural network building, training, and inference, and so it developed the Tensor core.
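For context, the fixed-function operation a Tensor core implements is a small matrix multiply-accumulate, D = A×B + C, performed in mixed precision: the multiply in low precision (FP16 on "Volta"), the accumulation in FP32. A minimal NumPy sketch of that arithmetic (just the math, not NVIDIA's actual hardware path):

```python
import numpy as np

rng = np.random.default_rng(0)

# Volta's Tensor cores operate on small 4x4 tiles; larger matrices are
# decomposed into many such tile operations.
A = rng.standard_normal((4, 4)).astype(np.float16)  # low-precision input
B = rng.standard_normal((4, 4)).astype(np.float16)  # low-precision input
C = rng.standard_normal((4, 4)).astype(np.float32)  # FP32 accumulator

# Multiply the FP16 inputs, accumulate in FP32 -- the "mixed precision" trick
# that keeps training numerically stable while halving memory traffic.
D = A.astype(np.float32) @ B.astype(np.float32) + C

assert D.shape == (4, 4) and D.dtype == np.float32
```

Doing many of these tiny tile operations per clock, as dedicated hardware rather than as a sequence of CUDA-core instructions, is what gives Tensor cores their throughput advantage for deep learning.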

In all this time, Intel continued to behave like a CPU company and not a compute company—the majority of its revenue came from client CPUs, followed by server CPUs, and it has consistently held accelerators at a lower priority. Even as Tesla and CUDA took off in 2007, Intel had its first blueprints for a SIMD accelerator, codenamed "Larrabee," as early as 2008. The company never accorded Larrabee the focus it needed as a nascent hardware technology. But that's on Intel. AMD has been a CPU + GPU company since its acquisition of ATI in 2006, and has tried to play catch-up with NVIDIA by combining its Stream compute architecture with open compute software technologies. The reason AMD's Instinct CDNA processors aren't as successful as NVIDIA's A100 and H100 processors is the same reason Intel never stood a chance in this market with its "Ponte Vecchio": both were slow to market, and neither company nurtured an ecosystem around its silicon quite like NVIDIA did.

Hardware is only a fraction of NVIDIA's growth story—the company has an enormous, top-down software stack, including its own programming language, APIs, and prebuilt compute and AI models, and a thriving ecosystem of independent developers and ISVs that it has nurtured over the years. So by the time AI took off at scale as a revolution in computing, NVIDIA was ready with the fastest hardware and the largest community of developers that could put it to use. We began this editorial by noting that NVIDIA doesn't have access to the x86 processor IP, and that turned out to be a good thing: it forced the company to look inward at the one thing it was already making that could crunch numbers at scale—GPUs with programmable shaders. What NVIDIA is extraordinarily lucky about is that it never got stuck with an x86 license.

You can watch Pat Gelsinger's interview over at MIT's YouTube channel, here.

View at TechPowerUp Main Site | Source
 
Joined
Aug 13, 2010
Messages
5,471 (1.05/day)
I really hope Pat does not actually believe the words he said about NVIDIA being lucky.
NVIDIA may very well be the reason hardware-accelerated machine learning got to where it is today. I remember attending a GTC in 2011 where it was already in the air.
 
Joined
Nov 11, 2016
Messages
3,399 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Well, Nvidia has one single leader whose sole focus is to innovate, whereas Intel has had CEOs who are more bean counters than anything else; that's why Intel cancelled projects at first sight of difficulty...
 
Joined
Apr 26, 2023
Messages
128 (0.22/day)
I really hope Pat does not actually believe the words he said about NVIDIA being lucky.
NVIDIA may very well be the reason hardware-accelerated machine learning got to where it is today. I remember attending a GTC in 2011 where it was already in the air.
Basically, since 22nm Intel has had problems with its fabs. While 22nm was on time, it was initially no better than 32nm; only two years later, with the Haswell refresh, was it proper. It was even worse with 14nm, which took them four years to get right, and 10nm was a disaster. Because of this they were unable to produce Larrabee and Atoms for phones. Later they were unable to produce the next-gen Phi, and now they must use TSMC's fabs.
While AMD was struggling to survive and NVIDIA was investing in innovation, Intel was paying big dividends.
 
Joined
Nov 13, 2007
Messages
10,737 (1.73/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 CL30 / 2133 fclk
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Yeah they did, repeatedly and purposefully over 20 years of development... They have really good luck.




All of that lucky BrookGPU -> CUDA -> cuDNN roadmap that took decades to execute on.

They even have the marketing slides.
 
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Well, the article puts things in their right place, and many of us were saying the same things for years. I really don't understand who Pat is trying to fool, especially at MIT. Maybe next time he should try a Micro Center and talk to customers who need help sending an email. They might believe him.

Just adding a couple of things here.
Nvidia tried to get an x86 license. Intel said no.
Huang once said (I think; I read it many years ago), around 2005 or even earlier: "Nvidia is a software company that also happens to build the hardware that software will run on."

And one more thought.
Creative was the king in audio 20+ years ago. With onboard audio solutions becoming the norm, Creative just became an old name to remember.
I think Huang saw that, and making the GPU a powerful co-processor wasn't only a great vision, but also a necessary transformation of the simple graphics chip, which could otherwise have been replaced in the future by cheap onboard solutions good enough for the majority of consumers.
 
Joined
Jan 11, 2022
Messages
866 (0.83/day)
Nvidia wasn't big enough to be able to ignore competition, and had to step up a couple of times where Intel didn't have to.
 
Joined
Aug 13, 2010
Messages
5,471 (1.05/day)
Basically, since 22nm Intel has had problems with its fabs. While 22nm was on time, it was initially no better than 32nm; only two years later, with the Haswell refresh, was it proper. It was even worse with 14nm, which took them four years to get right, and 10nm was a disaster. Because of this they were unable to produce Larrabee and Atoms for phones. Later they were unable to produce the next-gen Phi, and now they must use TSMC's fabs.
While AMD was struggling to survive and NVIDIA was investing in innovation, Intel was paying big dividends.
Fabrication isn't why hardware-accelerated machine learning turned NVIDIA into a successful company.
Development, tools, and community are why. NVIDIA actively ran workshops and courses, and put the tools within reach of hardware anyone could buy (that means even your local computer store's gaming GPU) to learn and perform ML tasks. No server-grade or expensive hardware needed.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Intel's inability to be honest about their own failings is the company's biggest failing, and will destroy it if not rectified.
 
Joined
Nov 26, 2021
Messages
1,642 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Intel's inability to be honest about their own failings is the company's biggest failing, and will destroy it if not rectified.
I hope they are more honest in internal discussions. This was, after all, a public venue.
 
Joined
Oct 28, 2012
Messages
1,190 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
People initially thought that getting an engineer as CEO would be a good thing for Intel, but as time passed, I was under the impression that they hired the biggest delusional fanboy. Pat is making a lot of unnecessary wild claims.
 
Joined
Jul 10, 2018
Messages
38 (0.02/day)
I think PG is extraordinarily stupid! Everything he says is utter nonsense !
 
Joined
Dec 26, 2006
Messages
3,820 (0.58/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
maybe intel shouldn't have thrown in the towel on larrabee then??
 
Joined
Dec 29, 2010
Messages
3,807 (0.75/day)
Processor AMD 5900x
Motherboard Asus x570 Strix-E
Cooling Hardware Labs
Memory G.Skill 4000c17 2x16gb
Video Card(s) RTX 3090
Storage Sabrent
Display(s) Samsung G9
Case Phanteks 719
Audio Device(s) Fiio K5 Pro
Power Supply EVGA 1000 P2
Mouse Logitech G600
Keyboard Corsair K95
People initially thought that getting an engineer as CEO would be a good thing for Intel, but as time passed, I was under the impression that they hired the biggest delusional fanboy. Pat is making a lot of unnecessary wild claims.
Well he does come from the Netburst era when they were colluding behind closed doors. BS is not out of his repertoire.
 
Joined
Nov 6, 2016
Messages
1,749 (0.60/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
ALL success, and I mean ALL, has some degree of luck involved and lucky timing, to say otherwise is to imply that you can control every aspect of reality...EVERY success story involves luck.

*We should ALL be praying somebody knocks Nvidia off their high horse. Monopoly and AI alone are issues that represent many dangers, but together....that could be truly dangerous.
 
Joined
May 30, 2015
Messages
1,928 (0.56/day)
Location
Seattle, WA
maybe intel shouldn't have thrown in the towel on larrabee then??

They didn't. Larrabee became Xeon Phi. By all accounts Xeon Phi was a success. It ran multiple generations, saw exponential improvements in performance and design, introduced the super-wide SIMD (AVX-512) to consumer markets, and pushed development forward on GPGPU at Intel where none was happening prior. Intel's Xe graphics chips owe their heritage in part to the Larrabee team with the dual-issue Vector/Matrix compute blocks.

Well he does come from the Netburst era

He comes from the "Old" Intel. He exited after the Netburst years but Pat Gelsinger's signature is literally printed on the i386. Twice, actually. He also co-wrote the book on modern x86 as we know it. He's a well accomplished engineer, but also an incredibly brash and out of touch CEO.

In all this time, Intel continued to behave like a CPU company and not a compute company—the majority of its revenue came from client CPUs, followed by server CPUs, and it has consistently held accelerators at a lower priority.

I would argue against this, if only for the recent years. Intel purchased Habana Labs in 2019, dedicating millions to AI accelerator development (Gaudi). Then in 2021 they bought up the entirety of Centaur's x86 design team, who were working on AI vision technology for VIA, as well as designing the first x86 CPU with an integrated ultra-wide VLIW neural processing unit.

Going back to the formation of the "Gen" graphics division (separate from Larrabee; those were two distinct teams), Intel invested heavily in the development of fixed-function accelerators for video with Quick Sync. They also began work on AMX way back in 2017, but made the poor decision to incorporate that hardware only in Sapphire Rapids, which saw a historic 30+ months of delays before finally crawling over the finish line. So it's not that they sat back and did nothing; they simply weren't strategic enough in the use of their technologies.
 
Joined
May 23, 2022
Messages
30 (0.03/day)
People initially thought that getting an engineer as CEO would be a good thing for Intel, but as time passed, I was under the impression that they hired the biggest delusional fanboy. Pat is making a lot of unnecessary wild claims.
It's funny because Lisa is also an engineer, and they're almost opposites as CEOs.
 

stickleback123

New Member
Joined
Dec 21, 2023
Messages
2 (0.01/day)
I hope they are more honest in internal discussions. This was, after all, a public venue.
I hope for their sake that they are, although from what I gather their internal culture may not allow that.

In any case, it makes him look like a fool. If a competitor has done well through decades of careful planning and competent execution, you acknowledge that; to do otherwise makes you look graceless, petty, and small.
 
Joined
May 13, 2010
Messages
6,062 (1.14/day)
System Name RemixedBeast-NX
Processor Intel Xeon E5-2690 @ 2.9Ghz (8C/16T)
Motherboard Dell Inc. 08HPGT (CPU 1)
Cooling Dell Standard
Memory 24GB ECC
Video Card(s) Gigabyte Nvidia RTX2060 6GB
Storage 2TB Samsung 860 EVO SSD//2TB WD Black HDD
Display(s) Samsung SyncMaster P2350 23in @ 1920x1080 + Dell E2013H 20 in @1600x900
Case Dell Precision T3600 Chassis
Audio Device(s) Beyerdynamic DT770 Pro 80 // Fiio E7 Amp/DAC
Power Supply 630w Dell T3600 PSU
Mouse Logitech G700s/G502
Keyboard Logitech K740
Software Linux Mint 20
Benchmark Scores Network: APs: Cisco Meraki MR32, Ubiquiti Unifi AP-AC-LR and Lite Router/Sw:Meraki MX64 MS220-8P
Pat Gelsinger digging for apples

reminds him of what he also lost... Apple... and now they are outperforming Intel with their own silicon, which does better with creative tasks like music production.
 
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Fabrication isn't why hardware-accelerated machine learning turned NVIDIA into a successful company.
But it was the reason for Intel's success in the past. It was the reason Intel could make an x86 SoC look as efficient as an ARM SoC. When Intel lost that fabrication advantage, it stopped dreaming of winning market share in tablets against ARM. When Intel fell behind, AMD, which was the worst company on the planet for efficiency but was using TSMC's better manufacturing nodes, beat them.
People initially thought that getting an engineer as CEO would be a good thing for Intel, but as time passed, I was under the impression that they hired the biggest delusional fanboy. Pat is making a lot of unnecessary wild claims.
Pat is right about one thing: trying to make Intel the top manufacturer in the world, trying to beat TSMC and Samsung. He knows that having a manufacturing advantage is as big an advantage as having the best architecture.
 
Joined
Jun 22, 2014
Messages
446 (0.12/day)
System Name Desktop / "Console"
Processor Ryzen 5950X / Ryzen 5800X
Motherboard Asus X570 Hero / Asus X570-i
Cooling EK AIO Elite 280 / Cryorig C1
Memory 32GB Gskill Trident DDR4-3600 CL16 / 16GB Crucial Ballistix DDR4-3600 CL16
Video Card(s) RTX 4090 FE / RTX 2080ti FE
Storage 1TB Samsung 980 Pro, 1TB Sabrent Rocket 4 Plus NVME / 1TB Sabrent Rocket 4 NVME, 1TB Intel 660P
Display(s) Alienware AW3423DW / LG 65CX Oled
Case Lian Li O11 Mini / Sliger CL530 Conswole
Audio Device(s) Sony AVR, SVS speakers & subs / Marantz AVR, SVS speakers & subs
Power Supply ROG Loki 1000 / Silverstone SX800
VR HMD Quest 3
Apparently, Salty Pat feels that his marketing team didn't make them look like complete fools when firing shots at AMD a couple of weeks ago, so he has taken it upon himself to finish the job.
He's clearly upset that he's riding on leather coattails into the AI market, while simultaneously being shown how to engineer a great CPU with fewer resources, by a team a third of Intel's size. It is Intel that is "lucky" to have this new market created by others, where they can now dump their me-too "AI" products.
 
Joined
May 13, 2010
Messages
6,062 (1.14/day)
System Name RemixedBeast-NX
Processor Intel Xeon E5-2690 @ 2.9Ghz (8C/16T)
Motherboard Dell Inc. 08HPGT (CPU 1)
Cooling Dell Standard
Memory 24GB ECC
Video Card(s) Gigabyte Nvidia RTX2060 6GB
Storage 2TB Samsung 860 EVO SSD//2TB WD Black HDD
Display(s) Samsung SyncMaster P2350 23in @ 1920x1080 + Dell E2013H 20 in @1600x900
Case Dell Precision T3600 Chassis
Audio Device(s) Beyerdynamic DT770 Pro 80 // Fiio E7 Amp/DAC
Power Supply 630w Dell T3600 PSU
Mouse Logitech G700s/G502
Keyboard Logitech K740
Software Linux Mint 20
Benchmark Scores Network: APs: Cisco Meraki MR32, Ubiquiti Unifi AP-AC-LR and Lite Router/Sw:Meraki MX64 MS220-8P
But it was the reason for Intel's success in the past. It was the reason Intel could make an x86 SoC look as efficient as an ARM SoC. When Intel lost that fabrication advantage, it stopped dreaming of winning market share in tablets against ARM. When Intel fell behind, AMD, which was the worst company on the planet for efficiency but was using TSMC's better manufacturing nodes, beat them.

Pat is right about one thing: trying to make Intel the top manufacturer in the world, trying to beat TSMC and Samsung. He knows that having a manufacturing advantage is as big an advantage as having the best architecture.
Bay Trail and Cherry Trail tablets were such a pain... they were designed for Windows, but Windows ran like a dog on tranquilizers... I had to put Linux on three of mine... and the worst part is all three are technically 64-bit, but they stuck a 32-bit bootloader/EFI on them, so it was hard af to put Linux on them. They did that to lock Windows in; when I got Linux working it was massively better.

Intel failed so hard on tablets. So much waste.
 