
NVIDIA Grace CPU Specs Remind Us Why Intel Never Shared x86 with the Green Team

Joined
Sep 6, 2013
Messages
3,291 (0.81/day)
The article I linked has lists of products using their chips. Not only from lesser-known OEMs, but from HTC, Motorola, LG, Samsung, Acer, Sony, Dell, Toshiba, ASUS, Microsoft, Google, Xiaomi, Lenovo and Tesla. Huge ranges of products as well.

Give me a list of mass-produced products from those companies in the last 5 years using Nvidia SoCs.
(not the whole list obviously, just a few examples)
 
Joined
Jun 29, 2018
Messages
524 (0.23/day)
Give me a list of mass-produced products from those companies in the last 5 years using Nvidia SoCs.
(not the whole list obviously, just a few examples)
At which point you will shift the goalposts again. I will simply not play this game, and agree to disagree :)
 
Joined
Sep 6, 2013
Messages
3,291 (0.81/day)
At which point you will shift the goalposts again. I will simply not play this game, and agree to disagree :)
So, you have nothing to say.

I was very clear in my first post. I was very clear in my reply. Instead of acknowledging what I meant, you chose to play dumb and throw me a list of companies that maybe produced some products 10 years ago, when Nvidia did try to get into smartphones.
Now that you can't continue that charade, you run away saying that I am the one shifting the goalposts.

Go and play your games elsewhere.
 
Joined
Jan 24, 2016
Messages
17 (0.01/day)
“NVIDIA also shared that it expects the new 72-core CPU to hit around SPEC CPU2017 Integer Rate scores of around 370 per full 72-core Grace CPU using GCC. For some context, a 64-core AMD chip will have official scores in the 390-450 range (e.g. an AMD EPYC 7773X.)”

Source.

For all that terabyte per second of bandwidth and the juvenile “SuPErCHiP” naming, it’s merely “competitive” (read: slower, but not a whole lot slower) with current offerings. Imagine designing your processor with an exotic memory interface and giving it a silly grandiose name, only for it to be slower than year-old processors using bog-standard DDR4-3200. IPC uplift leaks suggest 96-core Genoa will be faster than the whole dual-processor “sUpERchIp” package :laugh:

I agree with the article, maybe if Nvidia had an x86 license they could have come up with something more competitive. This is a great (although very costly) demonstration that the reports of x86 being dead are greatly exaggerated :sleep:
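Doing the per-core arithmetic on those quoted numbers makes the gap concrete. Here is a quick back-of-the-envelope sketch, using only the estimates quoted above (neither figure is an official SPEC submission):

```cpp
// Rough per-core SPEC CPU2017 Integer Rate comparison from the quoted figures.
#include <cstdio>

int main() {
    const double grace_score = 370.0;  // NVIDIA's expected int rate, 72 cores, GCC
    const int    grace_cores = 72;
    const double epyc_low = 390.0, epyc_high = 450.0;  // quoted range for a 64-core EPYC 7773X
    const int    epyc_cores = 64;

    const double grace_pc   = grace_score / grace_cores;  // ~5.1 per core
    const double epyc_lo_pc = epyc_low    / epyc_cores;   // ~6.1 per core
    const double epyc_hi_pc = epyc_high   / epyc_cores;   // ~7.0 per core

    printf("Grace:      %.2f/core\n", grace_pc);
    printf("EPYC 7773X: %.2f-%.2f/core\n", epyc_lo_pc, epyc_hi_pc);
    printf("EPYC per-core lead: %.0f%%-%.0f%%\n",
           (epyc_lo_pc / grace_pc - 1.0) * 100.0,
           (epyc_hi_pc / grace_pc - 1.0) * 100.0);  // roughly 19%-37%
    return 0;
}
```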
 
Joined
Jun 29, 2018
Messages
524 (0.23/day)
So, you have nothing to say.

I was very clear in my first post. I was very clear in my reply. Instead of acknowledging what I meant, you chose to play dumb and throw me a list of companies that maybe produced some products 10 years ago, when Nvidia did try to get into smartphones.
Now that you can't continue that charade, you run away saying that I am the one shifting the goalposts.

Go and play your games elsewhere.
Fine, I'll bite and quote your first post directly:
Nvidia made a mistake not REALLY concentrating on ARM sooner and producing products like Grace long ago. Maybe not huge server processors like this from the beginning, but SoCs for laptops and desktops, or if not desktops, at least mini PCs, running Windows on ARM, or Linux, or Android, or all of them. Qualcomm is a sleeping, boring failure in that area.
Let me also remind you that you used "sooner" there, which I understood as wishing that NVIDIA had done ARM stuff before now. When faced with the entire history of NVIDIA on ARM, in your last post you suddenly want only products from the last 5 years. Why?
but SOCs for laptops and desktops
ASUS Eee Pad Transformer series, Microsoft Surface RT and Surface 2, Lenovo IdeaPad Yoga 11, Toshiba AC100, HP Chromebook, Acer Chromebook.
at least mini PCs
HP Slate series, Shield TV, arguably Jetson.
running Windows on ARM
Microsoft Surface RT and Surface 2.
or Linux, or Android
Jetson shipped with Linux and almost everything else was running Android. On most products Linux could be installed in some form.

There's also a myriad of smartphones and the gaming series with Ouya, Shield Portable and Nintendo Switch series.

All in all, NVIDIA has a long history with ARM and has scored multiple products with them from the biggest OEMs. Were they unsuccessful? Arguably mostly yes, but rarely for lack of technical merit.

“NVIDIA also shared that it expects the new 72-core CPU to hit around SPEC CPU2017 Integer Rate scores of around 370 per full 72-core Grace CPU using GCC. For some context, a 64-core AMD chip will have official scores in the 390-450 range (e.g. an AMD EPYC 7773X.)”

Source.

For all that terabyte per second of bandwidth and the juvenile “SuPErCHiP” naming, it’s merely “competitive” (read: slower, but not a whole lot slower) with current offerings. Imagine designing your processor with an exotic memory interface and giving it a silly grandiose name, only for it to be slower than year-old processors using bog-standard DDR4-3200. IPC uplift leaks suggest 96-core Genoa will be faster than the whole dual-processor “sUpERchIp” package :laugh:

I agree with the article, maybe if Nvidia had an x86 license they could have come up with something more competitive. This is a great (although very costly) demonstration that the reports of x86 being dead are greatly exaggerated :sleep:
To be honest the STH article you linked explains it pretty well. Grace is not supposed to compete with x86 in raw compute, but is a platform for NVIDIA's GPU efforts. The CPU-GPU coherency along with NVLink networking is a big deal for scaling their AI/ML solutions.
Historically NVIDIA went from IBM POWER to Intel x86 to AMD x86 as the basis of their GPU clusters, so it makes sense for them to utilize their previous ARM expertise and complete the walled garden with an in-house CPU platform ;)
With the Mellanox acquisition they are pretty well vertically integrated now.
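To make the coherency point concrete, here's a minimal CUDA sketch of the programming model a coherent CPU-GPU address space enables: a single allocation visible to both sides, with no explicit staging copies. This is plain managed memory, not NVIDIA's Grace software stack; it only illustrates the idea, since hardware coherency (over NVLink C2C in Grace's case) is what makes this pattern cheap at scale.

```cpp
// One allocation, touched by both CPU and GPU, no cudaMemcpy staging.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float* x, float a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= a;  // GPU writes the same memory the CPU reads below
}

int main() {
    const int n = 1 << 20;
    float* x = nullptr;
    cudaMallocManaged(&x, n * sizeof(float));  // single allocation shared by CPU and GPU

    for (int i = 0; i < n; ++i) x[i] = 1.0f;   // CPU initializes in place

    scale<<<(n + 255) / 256, 256>>>(x, 2.0f, n);
    cudaDeviceSynchronize();                   // wait for the GPU, then read on the CPU

    printf("x[0] = %.1f\n", x[0]);             // prints 2.0; no copies anywhere
    cudaFree(x);
    return 0;
}
```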
 
Joined
Jan 24, 2016
Messages
17 (0.01/day)
To be honest the STH article you linked explains it pretty well. Grace is not supposed to compete with x86 in raw compute, but is a platform for NVIDIA's GPU efforts. The CPU-GPU coherency along with NVLink networking is a big deal for scaling their AI/ML solutions.
Historically NVIDIA went from IBM POWER to Intel x86 to AMD x86 as the basis of their GPU clusters, so it makes sense for them to utilize their previous ARM expertise and complete the walled garden with an in-house CPU platform ;)
With the Mellanox acquisition they are pretty well vertically integrated now.
Yes, my point was they are vertically integrated but virtually uncompetitive. Locked into a walled garden where the prices are high and the hardware is slow; surely they must aspire to be the new Apple? This is even funnier when you consider NVIDIA has been developing their "ARM expertise™" since 2008 at least — all that "expertise™" and such an unflattering result makes you wonder about ARM as a platform :laugh:
 
Joined
Sep 6, 2013
Messages
3,291 (0.81/day)
Fine, I'll bite and quote your first post directly:

Let me also remind you that you used "sooner" there, which I understood as wishing that NVIDIA had done ARM stuff before now. When faced with the entire history of NVIDIA on ARM, in your last post you suddenly want only products from the last 5 years. Why?

ASUS Eee Pad Transformer series, Microsoft Surface RT and Surface 2, Lenovo IdeaPad Yoga 11, Toshiba AC100, HP Chromebook, Acer Chromebook.

HP Slate series, Shield TV, arguably Jetson.

Microsoft Surface RT and Surface 2.

Jetson shipped with Linux and almost everything else was running Android. On most products Linux could be installed in some form.

There's also a myriad of smartphones and the gaming series with Ouya, Shield Portable and Nintendo Switch series.

All in all, NVIDIA has a long history with ARM and has scored multiple products with them from the biggest OEMs. Were they unsuccessful? Arguably mostly yes, but rarely for lack of technical merit.
Oh, come on. I have a "REALLY" in my first post with ALL the letters capitalized. I mean, you missed that? There is nothing to bite here. Just READ what I wrote. Not parts of what I wrote, but ALL the words.
I also explained in that first post what I meant by "sooner". It's there. Just READ it. Don't skip parts of my posts just to create nonexistent excuses and give a different meaning to what I wrote. You misread what I write, and then accuse me based on what you misunderstood, or chose to explain in your own way. You think I don't know Nvidia's history, or that I haven't seen their first efforts on the ARM platform? But they just gave up instead of doing something REALLY serious in the retail market. I think their excuse was Qualcomm's anticompetitive tactics back then. Give me a break, my first Nvidia product was a GeForce 2 MX.
The products you show are 10 years old. I haven't seen what Nvidia was doing in the Chromebook market, to be honest, but for the last 5-10 years they were mostly making specialized boards and nothing else, with the exception of Shield and Switch. Do you understand that "REALLY" in my first post, or are you determined to NOT understand what I am saying, even from that first post? When we have so many companies succeeding in the ARM market, companies (much) smaller than Nvidia, and knowing Nvidia's potential, it's easy to assume that they just lost interest and only now are coming back. I hope not just to support their server vision, but for more.
As for Nvidia's long history, did I say they just got a license? What? I am shifting the goalposts again? No. You just DON'T READ what I write.
 
Joined
Sep 17, 2014
Messages
22,009 (6.01/day)
I think this was on the mind of everyone who knew that Intel rejected giving a license to Nvidia.

Nvidia made a mistake not REALLY concentrating on ARM sooner and producing products like Grace long ago. Maybe not huge server processors like this from the beginning, but SoCs for laptops and desktops, or if not desktops, at least mini PCs, running Windows on ARM, or Linux, or Android, or all of them. Qualcomm is a sleeping, boring failure in that area.

They'll probably start accelerating on the ARM platform now. They lost time waiting to see if they could first gain absolute control of ARM. No one wanted them, so it's good to see that their pride and arrogance - which is part of their business mentality; sometimes it helps them, most times it doesn't - is not becoming an obstacle to their plans to start developing CPUs as well.

As much as Intel needs GPUs for its future, Nvidia needs CPUs for its future just the same. We all saw what happened to Nvidia's financials this quarter, because they only stand on one foot: GPUs. Hit that foot and the whole company trembles.
They did focus early though: Tegra.

The first dual-core SoC in a phone, even.
People read your posts, but quite often something is missing from your version of history.


They also pushed the GPU angle, but (graphics-intensive) gaming on phones never really took off.

And this was their niche, the unique selling point Nvidia had and now expands on with Grace as well: GPU acceleration.

Overall I don't think Nvidia can be blamed for lack of trying to get into this market... the ultimate push cost them dearly.

 
Joined
Sep 6, 2013
Messages
3,291 (0.81/day)
They did focus early though: Tegra.

The first dual-core SoC in a phone, even.
People read your posts, but quite often something is missing from your version of history.


They also pushed the GPU angle, but (graphics-intensive) gaming on phones never really took off.

And this was their niche, the unique selling point Nvidia had and now expands on with Grace as well: GPU acceleration.

Overall I don't think Nvidia can be blamed for lack of trying to get into this market... the ultimate push cost them dearly.

Or maybe I was expecting more from Nvidia than others did. Others look at Wikipedia, see Nvidia's non-stop support of the ARM platform and say "There, Nvidia never stopped innovating on the ARM platform". But that's not the case. As the products listed above show, 10 years ago Nvidia was trying to get into smartphones and tablets, maybe some ultraportables and netbooks too. But those efforts just stopped, meaning Nvidia took steps backwards from there. Maybe it was Qualcomm's tactics, maybe it was the low profit margins, maybe a combination. Project Denver was looking promising back then. Then nothing. I mean, Nvidia did use it in some specialized products, it even got into Google's Nexus, but not much after that. We hear about a SoC from Mediatek or Qualcomm, for example, and 6-12 months later we get 10-20-50 devices using it. That never happened with Nvidia, at least not for many years now, because Nvidia stopped targeting smartphones, tablets, laptops etc.
Gaming on phones doesn't really need heavy graphics. At least most games on smartphones don't; they are very simple graphically. I believe 3D-intensive gaming on smartphones is a minority, but I could be wrong here.
I think Nvidia wasn't really investing in the ARM platform because it could not control the platform and what direction it would go. That's why they tried to buy ARM. They had a license, they even had a license to build custom cores, but they probably also had reasons not to invest heavily in the platform. If they had bought ARM, no Qualcomm could sabotage their negotiations with big smartphone companies, and they could also turn the ship more toward servers than the smartphone market. They couldn't buy ARM, and there's nothing they can do about that now. They need CPUs for their server aspirations, at least.
 
Joined
Sep 17, 2014
Messages
22,009 (6.01/day)
Or maybe I was expecting more from Nvidia than others did. Others look at Wikipedia, see Nvidia's non-stop support of the ARM platform and say "There, Nvidia never stopped innovating on the ARM platform". But that's not the case. As the products listed above show, 10 years ago Nvidia was trying to get into smartphones and tablets, maybe some ultraportables and netbooks too. But those efforts just stopped, meaning Nvidia took steps backwards from there. Maybe it was Qualcomm's tactics, maybe it was the low profit margins, maybe a combination. Project Denver was looking promising back then. Then nothing. I mean, Nvidia did use it in some specialized products, it even got into Google's Nexus, but not much after that. We hear about a SoC from Mediatek or Qualcomm, for example, and 6-12 months later we get 10-20-50 devices using it. That never happened with Nvidia, at least not for many years now, because Nvidia stopped targeting smartphones, tablets, laptops etc.
Gaming on phones doesn't really need heavy graphics. At least most games on smartphones don't; they are very simple graphically. I believe 3D-intensive gaming on smartphones is a minority, but I could be wrong here.
I think Nvidia wasn't really investing in the ARM platform because it could not control the platform and what direction it would go. That's why they tried to buy ARM. They had a license, they even had a license to build custom cores, but they probably also had reasons not to invest heavily in the platform. If they had bought ARM, no Qualcomm could sabotage their negotiations with big smartphone companies, and they could also turn the ship more toward servers than the smartphone market. They couldn't buy ARM, and there's nothing they can do about that now. They need CPUs for their server aspirations, at least.
They simply didn't have design wins. Cortex was always more diverse. Tegra was power hungry. Cortex had a core in the stack for every segment, from budget to flagship phone.

Nvidia never got to an efficient enough design to compete. They sold their SoC to LG and their dual-core phone was fast, but not efficient at that. (I owned one.) Tegra K1 similarly guzzles power. And Tegra STILL shines in one device: Shield TV. Why? It doesn't run on a battery ;) And it's still arguably the best Android TV device you can hook onto a TV.

The ARM market is open. At the server end, many-core CPUs like ThunderX trumped their design too.

You can spin it however you feel, but the reality is that Nvidia certainly did try; their advantage in tech just couldn't be translated into something meaningful while competitors already had some iterations and refinements under the hood. The whole reason Grace is the subject now, again, is because it helps their GPU division. Not because they built the most efficient CPU. They have shareholders too; these adventures can't last forever.

You even answered the issue yourself by stating smartphones don't really need graphics in a big way. It's not entirely realistic to then expect them to push further on stuff they are behind on anyway...

As for the olde idea of Nvidia wanting market control and not just share... you might be right about that. But the company really did try the fair-play approach: let the design/tech speak for itself. It's all Nvidia really does and did at its core. Their GPU leadership is not the result of market control. It's the result of consecutive design wins. An approach they tried just the same with Tegra.
 

bug

Joined
May 22, 2015
Messages
13,622 (3.99/day)
Of course they had a business case, it's called making money. Even though Arm CPUs are in everything and the kitchen sink nowadays, I'd wager the x86 market cap still exceeds every other CPU architecture type combined.
That's a very simplistic view. Wishing does not make it so; I gave examples of companies that tried to build x86 CPUs and failed. And even if you can make money, there's also the opportunity cost. My feeling is Nvidia simply chose to invest in compute, which is a higher-margin market. Now that they have a big chunk of that market, they can afford to design custom CPUs. But they only do that to support their existing business.

On another note, cracking into the x86 CPU business is probably impossible today. AMD is very lucky to have x86_64 in their corner, as that forces Intel to cross-license the rest of the instruction set. A third player would have to get a license from both Intel and AMD, and would probably never be able to compete on price because of that.
 
Joined
May 3, 2018
Messages
2,802 (1.20/day)
I would like to see Nvidia make ARM-based desktop CPUs to rival Apple's Mx series. If Apple weren't so arrogant and self-absorbed they could make a ton of money licensing their chips to anyone. Just like Sony's sensor division, which is separate from their camera division: they will sell sensors to fierce competitors, because they're in the business of making money for Sony, and boy do they make a lot more money than the camera division. So given Apple's arrogance, a third player in the desktop CPU market with the might of Nvidia would be good IMO. Intel really isn't going to have an answer to Apple until Arrow Lake, and I'll bet it will still be way behind on efficiency even if they deliver the forecast improvements in that area.

Still, come 2025-27 I think we are in for a treat. AMD is not ignoring Apple either.
 
Joined
Sep 6, 2013
Messages
3,291 (0.81/day)
They simply didn't have design wins. Cortex was always more diverse. Tegra was power hungry. Cortex had a core in the stack for every segment, from budget to flagship phone.

Nvidia never got to an efficient enough design to compete. They sold their SoC to LG and their dual-core phone was fast, but not efficient at that. (I owned one.) Tegra K1 similarly guzzles power. And Tegra STILL shines in one device: Shield TV. Why? It doesn't run on a battery ;) And it's still arguably the best Android TV device you can hook onto a TV.

The ARM market is open. At the server end, many-core CPUs like ThunderX trumped their design too.

You can spin it however you feel, but the reality is that Nvidia certainly did try; their advantage in tech just couldn't be translated into something meaningful while competitors already had some iterations and refinements under the hood. The whole reason Grace is the subject now, again, is because it helps their GPU division. Not because they built the most efficient CPU. They have shareholders too; these adventures can't last forever.

You even answered the issue yourself by stating smartphones don't really need graphics in a big way. It's not entirely realistic to then expect them to push further on stuff they are behind on anyway...

As for the olde idea of Nvidia wanting market control and not just share... you might be right about that. But the company really did try the fair-play approach: let the design/tech speak for itself. It's all Nvidia really does and did at its core. Their GPU leadership is not the result of market control. It's the result of consecutive design wins. An approach they tried just the same with Tegra.
Well, they had to start from somewhere. Tegra having bad efficiency was known, and the companion core in Tegra 3's design was an effort to lower power consumption. But Qualcomm, and probably profit margins, made them not take that market seriously. They could have invested in building ARM SoCs. They could do what Mediatek is doing: offer a cheaper alternative to the market. Or do what AMD is trying to do with Samsung: integrate their graphics tech into an ARM SoC and try to make it as efficient as possible. They could have had a thriving business in the ARM SoC market by now, with Nvidia's graphics being an alternative to Adreno and Mali, or the top option, or one of a number of options, for high-end or mid-range phones. But I am not sure they see this even now. Grace is for servers. Are they going to do something in the other areas where ARM is the main option, or just stay in servers, cloud and the auto business?
Nvidia was always trying to gain market control. That's what forces them to innovate and makes them look like they are one or two steps ahead of the competition. CUDA, PhysX (they bought the tech, but who else had a chance to promote it more strongly?), G-Sync, DLSS, ray tracing (it's not theirs, but the same as PhysX). Fair play and Nvidia don't usually go together, and when I was reading Nvidia's protests about Qualcomm's anticompetitive tactics in the SoC market 10 or so years ago, I was laughing. Nvidia wasn't lying. Qualcomm had to face fines from the courts, if I remember correctly, and agree to play nice in the future.
I would like to see Nvidia make ARM-based desktop CPUs to rival Apple's Mx series. If Apple weren't so arrogant and self-absorbed they could make a ton of money licensing their chips to anyone. Just like Sony's sensor division, which is separate from their camera division: they will sell sensors to fierce competitors, because they're in the business of making money for Sony, and boy do they make a lot more money than the camera division. So given Apple's arrogance, a third player in the desktop CPU market with the might of Nvidia would be good IMO. Intel really isn't going to have an answer to Apple until Arrow Lake, and I'll bet it will still be way behind on efficiency even if they deliver the forecast improvements in that area.

Still, come 2025-27 I think we are in for a treat. AMD is not ignoring Apple either.
Nvidia is as arrogant as Apple. But they don't have the huge customer base Apple does. Apple sells devices; Nvidia sells parts, not whole devices. It's not easy to convince loyal customers who buy your parts to buy whole devices just because of your logo on them. But Nvidia could start offering ARM SoCs for smartphones, laptops and (mini) desktop PCs, and build reference, all-Nvidia-hardware devices for others to build from.
Apple will never sell its stuff to others, because if everyone could build iPhones and Macs, those devices would lose their premium image. Many people buy iPhones and Macs with the same mentality they buy jewelry.
 
Joined
Jul 9, 2015
Messages
3,413 (1.01/day)
So the "news" is essentially a hidden team green advertisement.

Because honestly, saying that yet another ARM clone has been rolled out today, this time by NV, is not flashy enough.

Nvidia always had better vision than AMD, it was more ambitious

Ah, so that is why both Microsoft and Apple told it to go have solo kamasutra: "better vision".
 

bug

Joined
May 22, 2015
Messages
13,622 (3.99/day)
I would like to see Nvidia make ARM-based desktop CPUs to rival Apple's Mx series. If Apple weren't so arrogant and self-absorbed they could make a ton of money licensing their chips to anyone. Just like Sony's sensor division, which is separate from their camera division: they will sell sensors to fierce competitors, because they're in the business of making money for Sony, and boy do they make a lot more money than the camera division. So given Apple's arrogance, a third player in the desktop CPU market with the might of Nvidia would be good IMO. Intel really isn't going to have an answer to Apple until Arrow Lake, and I'll bet it will still be way behind on efficiency even if they deliver the forecast improvements in that area.

Still, come 2025-27 I think we are in for a treat. AMD is not ignoring Apple either.
That's not Apple's business model. Apple is all about (apparent) scarcity to justify higher mark-ups and keeping the number of SKUs in the wild low, to cut back on support expenses. Offering their CPU to third parties would fly in the face of all that.
 
Joined
Jul 9, 2015
Messages
3,413 (1.01/day)
I would like to see Nvidia make ARM-based desktop CPUs to rival Apple's Mx series.
Where is the market for that?

Apple has paywalled its users into buying everything from Apple anyway.

Why would the rest of the world bother?
 
Joined
Sep 6, 2013
Messages
3,291 (0.81/day)
Ah, so that is why both Microsoft and Apple told it to go have solo kamasutra: "better vision".
Nvidia took a dumb graphics chip and transformed it into the most important compute chip of today. They pushed physics (in the wrong way, but at least they made gamers focus on its importance), built CUDA, introduced G-Sync, DLSS, pushed ray tracing into the market, making it the de facto feature more gamers are expecting from their new hardware, and probably so many things I forget now or don't know about. AMD on the other hand has only Mantle to show. Other techs from AMD are reactions to Nvidia. Versions of Nvidia's visions as they should be brought to the market to be consumer friendly, like FreeSync and FSR.
In the case of consoles, Nvidia either didn't have the hardware, a SoC that would combine both a CPU and a GPU and be competitively priced compared to AMD's APUs, or the low profit margins made them ignore that market (and I thank them for that choice, if that was the case, because otherwise I could have been forced to buy quad-core Intels for the rest of my life).
 
Joined
Jul 9, 2015
Messages
3,413 (1.01/day)
Nvidia took a dumb graphics chip and transformed it into the most important compute chip of today.
The only truth to it is that number-crunching GPUs are indeed very dumb.

The need for massively parallel processing has exactly 0 to do with NV.
They pushed physics (in the wrong way,
That is one way to refer to "they bought PhysX".

built CUDA
Proprietary crap.

introduced G-Sync
That is dead, and good riddance.

DLSS
Used that thing to milk more money off the customer. The tech itself will likely follow the path of G-Sync.

pushed ray tracing into the market
And at the moment, the most noticeable hint that "RT is on" is the massive drop in framerates. At the same time, this thing does not use NV's RT:


Other techs from AMD are reactions to Nvidia
That's a lovely myth to push in 2022.
 

bug

Joined
May 22, 2015
Messages
13,622 (3.99/day)
Nvidia took a dumb graphics chip and transformed it into the most important compute chip of today. They pushed physics (in the wrong way, but at least they made gamers focus on its importance), built CUDA, introduced G-Sync, DLSS, pushed ray tracing into the market, making it the de facto feature more gamers are expecting from their new hardware, and probably so many things I forget now or don't know about. AMD on the other hand has only Mantle to show. Other techs from AMD are reactions to Nvidia. Versions of Nvidia's visions as they should be brought to the market to be consumer friendly, like FreeSync and FSR.
In the case of consoles, Nvidia either didn't have the hardware, a SoC that would combine both a CPU and a GPU and be competitively priced compared to AMD's APUs, or the low profit margins made them ignore that market (and I thank them for that choice, if that was the case, because otherwise I could have been forced to buy quad-core Intels for the rest of my life).
To be fair, AMD/ATI had TruForm years before tessellation was a thing. I don't know whether that's good or bad, since they weren't able to get any traction for it.

If memory serves me well Nvidia pioneered programmable shaders.
 
Joined
Sep 6, 2013
Messages
3,291 (0.81/day)
To be fair, AMD/ATI had TruForm years before tessellation was a thing. I don't know whether that's good or bad, since they weren't able to get any traction for it.

If memory serves me well Nvidia pioneered programmable shaders.
Can't say I remember or know everything from that era, but Nvidia was always better at promoting things; AMD usually failed at promoting anything. When they bought Ageia I was truly happy, thinking that only they could push hardware physics in games. Then they locked it down and tried to use it to gain an unfair advantage over the competition. Nvidia is a company to both cheer and curse at the same time.
 
Joined
Feb 18, 2005
Messages
5,701 (0.80/day)
I am not. I am talking about mass production targeting markets with a huge user base. Other than Switch, which probably just happened, Nvidia was using ARM for very specific applications.
I haven't missed anything.

All in on ARM? Really? Tell me some products that were made for mass production and availability to the general public. Except the obvious mention of Switch, what else is there? Shield tablet?
The reason that NVIDIA never has been and never will be a mass-market seller of Arm CPUs is alluded to in my previous comment:

Even though Arm CPUs are in everything and the kitchen sink nowadays, I'd wager the x86 market cap still exceeds every other CPU architecture type combined.

For Arm, the profit margins simply aren't there unless you're putting your CPUs in everything, or building highly specialised CPUs for highly specialised niches. The latter is where NVIDIA has, rightly, focused their attention by augmenting their primary area of expertise (graphics) with CPUs that can help feed those graphics. They don't want to be a CPU company because CPUs are ancillary to their core focus.

Which once again brings us back to their attempted acquisition of Arm; I still struggle to see the reasoning behind it. The argument that it was to integrate NVIDIA graphics into Arm CPUs doesn't wash because NVIDIA's focus has always been high-performance high-power graphics, not low-end low-power ones as found in typical Arm applications, so they would essentially have to build an entirely new product. The thing is though, that doesn't require them to buy Arm; if NVIDIA already has a low-power GPU capable of competing with what's typically found in smartphones, there's absolutely nothing stopping them from just licensing or selling it as a standalone product.

The cynical take is that it's simply so NVIDIA could increase Arm licensing fees and reap the profits, but I really don't see that panning out well for them; it would almost certainly have pushed a lot of Arm licensees towards the royalty-free RISC-V, which makes it a self-defeating proposition.
 

bug

Joined
May 22, 2015
Messages
13,622 (3.99/day)
Can't say I remember or know everything from that era, but Nvidia was always better at promoting things; AMD usually failed at promoting anything. When they bought Ageia I was truly happy, thinking that only they could push hardware physics in games. Then they locked it down and tried to use it to gain an unfair advantage over the competition. Nvidia is a company to both cheer and curse at the same time.
Well, it's really more nuanced than that. People (myself included) prefer open approaches.
But, besides the fact that developing something new in the open takes a lot more time, look at the compute situation. Closed approach: CUDA, doing its job just fine and now nearly ubiquitous. Open approach: OpenCL, a spec everyone hates to use, thus almost no one uses it. It doesn't always pan out like that, but what I'm trying to say is that going the open route is not an automatic win, and I won't fault anyone for going the closed route.
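To illustrate that ergonomics gap (a sketch, not a rigorous comparison): below is the complete CUDA host-plus-device code for a trivial GPU operation, with a comment listing the host-side setup calls the equivalent OpenCL program has to spell out by hand.

```cpp
// Complete CUDA code to double an array on the GPU.
// The equivalent OpenCL program must additionally walk through, by hand:
//   clGetPlatformIDs -> clGetDeviceIDs -> clCreateContext ->
//   clCreateCommandQueue -> clCreateProgramWithSource -> clBuildProgram ->
//   clCreateKernel -> clCreateBuffer -> clSetKernelArg -> clEnqueueNDRangeKernel
// with error handling at each step; that boilerplate is a big part of why
// the open spec struggled for adoption.
#include <cuda_runtime.h>

__global__ void double_all(float* x, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= 2.0f;
}

void double_on_gpu(float* host, int n) {
    float* d = nullptr;
    cudaMalloc(&d, n * sizeof(float));                                  // device buffer
    cudaMemcpy(d, host, n * sizeof(float), cudaMemcpyHostToDevice);     // upload
    double_all<<<(n + 255) / 256, 256>>>(d, n);                         // launch
    cudaMemcpy(host, d, n * sizeof(float), cudaMemcpyDeviceToHost);     // download
    cudaFree(d);
}
```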
 
Joined
Dec 25, 2020
Messages
6,193 (4.51/day)
This is probably why NVIDIA wanted ARM so badly; they could have made a stronger takeover of the general ARM space with biased deals offering their designs over other competitors'. Instead, they now have to innovate to compete alongside other ARM licensees. As an aside, their desire to homogenize their HPC systems reminds me of Apple and their walled garden; they are looking to push Intel out of their systems too (if they haven't already), although they still use Radeon GPUs here and there.

That said, it looks like the real competition will be between NVIDIA and AMD. AMD added Xilinx to their portfolio, also owns an ARM license, and is jointly working with Samsung to integrate RDNA with elements of ARM (via Exynos), which would help them combat NVIDIA across all platforms too. This is assuming NVIDIA also ports elements of this CPU down into their next-gen gaming tablets (and the next-gen Switch, assuming Nintendo sticks with NVIDIA), and even some gaming laptops running either SteamOS or Windows on ARM.

Meanwhile Intel, despite all their recent acquisitions, haven't really gotten anything to show for it aside from Foveros, and it'll be a while longer before their own compute and gaming GPUs can prove reliable enough in the high-value markets. Kind of wild to see such a dramatic shift in the last 5 years.

However, you forgot the best part of it: Intel, with its advanced foundry services now available to third-party customers, might as well end up fabbing both aforementioned companies' products themselves! ;)

A Ryzen on Intel 20A? I want to see it, to be honest with you.
 
Joined
Sep 6, 2013
Messages
3,291 (0.81/day)
The reason that NVIDIA never has been and never will be a mass-market seller of Arm CPUs is alluded to in my previous comment:



For Arm, the profit margins simply aren't there unless you're putting your CPUs in everything, or building highly specialised CPUs for highly specialised niches. The latter is where NVIDIA has, rightly, focused their attention by augmenting their primary area of expertise (graphics) with CPUs that can help feed those graphics. They don't want to be a CPU company because CPUs are ancillary to their core focus.

Which once again brings us back to their attempted acquisition of Arm; I still struggle to see the reasoning behind it. The argument that it was to integrate NVIDIA graphics into Arm CPUs doesn't wash because NVIDIA's focus has always been high-performance high-power graphics, not low-end low-power ones as found in typical Arm applications, so they would essentially have to build an entirely new product. The thing is though, that doesn't require them to buy Arm; if NVIDIA already has a low-power GPU capable of competing with what's typically found in smartphones, there's absolutely nothing stopping them from just licensing or selling it as a standalone product.

The cynical take is that it's simply so NVIDIA could increase Arm licensing fees and reap the profits, but I really don't see that panning out well for them; it would almost certainly have pushed a lot of Arm licensees towards the royalty-free RISC-V, which makes it a self-defeating proposition.
I agree with most of what you write. I don't think I wrote anything different from that.

I think Nvidia having a... dramatic past was one of the reasons for them trying to buy ARM. Intel refusing to give them a license for its then-new CPUs drove Nvidia's chipset department to an end. Qualcomm's anticompetitive tactics probably played a major role in Nvidia not pushing into the smartphone market. They probably thought that, in the case of ARM, another company could try to buy it today or in the future, and that they should move first. Think of Apple buying ARM. They have the money, the power and the customer base to just terminate all licenses and tell the other companies "Go and try your luck with RISC-V, we just don't care!". A simplified thought, that last one, but who knows? I think Nvidia didn't want to invest heavily in something they couldn't control. Also, buying ARM, as I said, could give them not just an advantage over the competition by keeping the best designs for themselves, but also the power to choose where the ARM platform would focus. Now they get what ARM makes and then change it to suit their needs. With control over ARM, they would create cores for their own needs and then let their licensees spend the time and money to make those cores suit theirs.
Anyway, just speculating here.
 
Joined
May 3, 2018
Messages
2,802 (1.20/day)
That's not Apple's business model. Apple is all about (apparent) scarcity to justify higher mark-ups and keeping the number of SKUs in the wild low, to cut back on support expenses. Offering their CPU to third parties would fly in the face of all that.
I know it's not their business model, but it's stupid IMO, and the more fool them.
 