Wednesday, August 24th 2022

NVIDIA Grace CPU Specs Remind Us Why Intel Never Shared x86 with the Green Team

NVIDIA designed the Grace CPU, a processor in the classical sense, to replace the Intel Xeon and AMD EPYC processors it has had to cram into its pre-built HPC compute servers for serial-processing roles, mainly because the half-a-dozen GPU HPC processors in each node need a CPU to interconnect them. The company studied the CPU-level limitations and bottlenecks, not just in I/O but also in machine architecture, and realized its compute servers need a CPU purpose-built for the role, with an architecture heavily optimized for NVIDIA's APIs. Thus, the NVIDIA Grace CPU was born.

This is NVIDIA's first outing with a CPU whose processing footprint rivals server processors from Intel and AMD. Built on the TSMC N4 (4 nm EUV) silicon fabrication process, it is a monolithic chip that's deployed in pairs, or with an H100 HPC processor, on a single board that NVIDIA calls a "Superchip." A board with a Grace and an H100 makes up a "Grace Hopper" Superchip; a board with two Grace CPUs makes a Grace CPU Superchip. Each Grace CPU contains a 900 GB/s coherent switching fabric (NVLink-C2C), with seven times the bandwidth of PCI-Express 5.0 x16. This is key to connecting the companion H100 processor, or neighboring Superchips on the node, with coherent memory access.
Serial-processing muscle on the NVIDIA Grace CPU comes courtesy of 72 Arm v9 64-bit cores, so a Superchip contains 144 cores. The main memory interface is LPDDR5X with ECC, with each "socket" having a maximum memory bandwidth of 1 TB/s (rivaling that of over 24 channels of DDR5). The key serial I/O interface is PCI-Express Gen 5, with 68 lanes on offer, mainly to wire out NVMe storage devices. The chip has a peak TDP rating of 500 W.
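
For context, here is a quick back-of-the-envelope check of those two bandwidth figures. This is just a sketch: it assumes PCIe 5.0's 128b/130b encoding and DDR5-4800 as the comparison channel, which may not match NVIDIA's own baselines.

```python
# Sanity-check the two bandwidth claims above (assumed baselines, not
# NVIDIA's official comparison methodology).

# PCIe 5.0: 32 GT/s per lane, 128b/130b encoding -> ~63 GB/s per direction at x16
pcie5_x16_gbps = 32e9 * 16 / 8 * (128 / 130) / 1e9

# NVLink-C2C: 900 GB/s total; compare against PCIe 5.0 x16 in both directions
ratio = 900 / (2 * pcie5_x16_gbps)
print(f"900 GB/s fabric vs PCIe 5.0 x16: ~{ratio:.1f}x")  # ~7.1x

# LPDDR5X: 1 TB/s per socket vs DDR5-4800 channels at 38.4 GB/s each
ddr5_channel_gbps = 4800e6 * 8 / 1e9
print(f"DDR5-4800 channel equivalent: ~{1000 / ddr5_channel_gbps:.0f} channels")  # ~26
```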

The Grace CPU demonstrates NVIDIA's engineering muscle at designing large multi-core processors for enterprise and HPC applications. With Arm achieving near-parity with x86-64 in performance, efficiency, and IPC, we're beginning to understand why Intel never let NVIDIA become an x86 licensee: it would have achieved a winning enterprise processor rivaling Intel's much sooner. Future generations of NVIDIA's DGX compute nodes, as well as pre-built workstations and servers spanning a multitude of applications, could see NVIDIA wean itself off x86-based CPUs, replacing them with Grace and its successors.
Source: Wccftech

54 Comments on NVIDIA Grace CPU Specs Remind Us Why Intel Never Shared x86 with the Green Team

#26
ncrs
john_Give me a list of mass-produced products from those companies in the last 5 years using Nvidia SOCs.
(not the whole list obviously, just a few examples)
At which point you will shift the goalposts again. I will simply not play this game, and agree to disagree :)
Posted on Reply
#27
john_
ncrsAt which point you will shift the goalposts again. I will simply not play this game, and agree to disagree :)
So, you have nothing to say.

I was very clear in my first post. I was very clear in my reply. Instead of acknowledging what I meant, you chose to play dumb and throw me a list of companies that maybe produced some products 10 years ago, when Nvidia did try to get into smartphones.
Now that you can't continue that charade, you run away saying that I am the one shifting the goalposts.

Go and play your games elsewhere.
Posted on Reply
#28
bruhmoment01
“NVIDIA also shared that it expects the new 72-core CPU to hit SPEC CPU2017 Integer Rate scores of around 370 per full 72-core Grace CPU using GCC. For some context, a 64-core AMD chip will have official scores in the 390-450 range (e.g. an AMD EPYC 7773X.)”

Source.

For all that terabyte of bandwidth and juvenile “SuPErCHiP” naming, it’s merely “competitive” (read: slower, but not a whole lot slower) with current offerings. Imagine designing your processor with an exotic memory interface and giving it a silly grandiose name, only for it to be slower than year-old processors using bog-standard DDR4-3200. IPC uplift leaks suggest 96-core Genoa will be faster than the whole dual-processor “sUpERchIp” package :laugh:
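
To put the quoted figures side by side, here is a rough per-core comparison. This is only a sketch based on the numbers above; the Grace score is NVIDIA's estimate, not an official SPEC submission.

```python
# Per-core SPEC CPU2017 Integer Rate comparison, using the figures quoted
# above (NVIDIA's ~370 estimate for Grace vs. ~450 for a 64-core EPYC 7773X).
grace_score, grace_cores = 370, 72
epyc_score, epyc_cores = 450, 64

grace_per_core = grace_score / grace_cores   # ~5.14
epyc_per_core = epyc_score / epyc_cores      # ~7.03

print(f"Grace per-core: {grace_per_core:.2f}")
print(f"EPYC per-core:  {epyc_per_core:.2f}")
print(f"EPYC per-core advantage: ~{epyc_per_core / grace_per_core - 1:.0%}")  # ~37%
```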

I agree with the article, maybe if Nvidia had an x86 license they could have come up with something more competitive. This is a great (although very costly) demonstration that the reports of x86 being dead are greatly exaggerated :sleep:
Posted on Reply
#29
ncrs
john_So, you have nothing to say.

I was very clear in my first post. I was very clear in my reply. Instead of acknowledging what I meant, you chose to play dumb and throw me a list of companies that maybe produced some products 10 years ago, when Nvidia did try to get into smartphones.
Now that you can't continue that charade, you run away saying that I am the one shifting the goalposts.

Go and play your games elsewhere.
Fine, I'll bite and quote your first post directly:
john_Nvidia made a mistake by not REALLY concentrating on ARM sooner and producing products like Grace long ago. Not these kinds of huge server processors from the beginning, maybe, but SOCs for laptops and desktops, or if not desktops, at least mini PCs, running Windows on ARM, or Linux, or Android, or all of them. Qualcomm is a sleeping, boring failure in that area.
Let me also remind you that you used "sooner" there, which I understood as wishing that NVIDIA had done ARM stuff before now. When faced with the entire history of NVIDIA's ARM efforts in your last post, you suddenly want only products from the last 5 years. Why?
john_but SOCs for laptops and desktops
ASUS Eee Pad Transformer series, Microsoft Surface, Surface 2 and Surface RT, Lenovo IdeaPad Yoga 11, Toshiba AC100, HP Chromebook, Acer Chromebook.
john_at least mini PCs
HP Slate series, Shield TV, arguably Jetson.
john_running Windows on ARM
Microsoft Surface, Surface 2 and RT.
john_or Linux, or Android
Jetson shipped with Linux and almost everything else was running Android. On most products Linux could be installed in some form.

There's also a myriad of smartphones, and the gaming devices: Ouya, Shield Portable, and the Nintendo Switch series.

All in all, NVIDIA has a long history with ARM and has scored multiple products with them from the biggest OEMs. Were they unsuccessful? Arguably mostly yes, but usually not because of a lack of technical merit.
rrubberr“NVIDIA also shared that it expects the new 72-core CPU to hit SPEC CPU2017 Integer Rate scores of around 370 per full 72-core Grace CPU using GCC. For some context, a 64-core AMD chip will have official scores in the 390-450 range (e.g. an AMD EPYC 7773X.)”

Source.

For all that terabyte of bandwidth and juvenile “SuPErCHiP” naming, it’s merely “competitive” (read: slower, but not a whole lot slower) with current offerings. Imagine designing your processor with an exotic memory interface and giving it a silly grandiose name, only for it to be slower than year-old processors using bog-standard DDR4-3200. IPC uplift leaks suggest 96-core Genoa will be faster than the whole dual-processor “sUpERchIp” package :laugh:

I agree with the article, maybe if Nvidia had an x86 license they could have come up with something more competitive. This is a great (although very costly) demonstration that the reports of x86 being dead are greatly exaggerated :sleep:
To be honest the STH article you linked explains it pretty well. Grace is not supposed to compete with x86 in raw compute, but is a platform for NVIDIA's GPU efforts. The CPU-GPU coherency along with NVLink networking is a big deal for scaling their AI/ML solutions.
Historically NVIDIA went from IBM POWER to Intel x86 to AMD x86 as the basis of their GPU clusters, so it makes sense for them to utilize previous ARM expertise and complete the walled garden with an in-house CPU platform ;)
With the Mellanox acquisition they are pretty well vertically integrated now.
Posted on Reply
#30
bruhmoment01
ncrsTo be honest the STH article you linked explains it pretty well. Grace is not supposed to compete with x86 in raw compute, but is a platform for NVIDIA's GPU efforts. The CPU-GPU coherency along with NVLink networking is a big deal for scaling their AI/ML solutions.
Historically NVIDIA went from IBM POWER to Intel x86 to AMD x86 as the basis of their GPU clusters, so it makes sense for them to utilize previous ARM expertise and complete the walled garden with an in-house CPU platform ;)
With the Mellanox acquisition they are pretty well vertically integrated now.
Yes, my point was that they are vertically integrated but virtually uncompetitive. Locked into a walled garden where the prices are high and the hardware is slow; surely they must aspire to be the new Apple? This is even funnier when you consider NVIDIA has been developing their "ARM expertise™" since at least 2008; all that "expertise™" and such an unflattering result makes you wonder about ARM as a platform :laugh:
Posted on Reply
#31
john_
ncrsFine, I'll bite and quote your first post directly:

Let me also remind you that you used "sooner" there, which I understood as wishing that NVIDIA had done ARM stuff before now. When faced with the entire history of NVIDIA's ARM efforts in your last post, you suddenly want only products from the last 5 years. Why?

ASUS Eee Pad Transformer series, Microsoft Surface, Surface 2 and Surface RT, Lenovo IdeaPad Yoga 11, Toshiba AC100, HP Chromebook, Acer Chromebook.

HP Slate series, Shield TV, arguably Jetson.

Microsoft Surface, Surface 2 and RT.

Jetson shipped with Linux and almost everything else was running Android. On most products Linux could be installed in some form.

There's also a myriad of smartphones, and the gaming devices: Ouya, Shield Portable, and the Nintendo Switch series.

All in all, NVIDIA has a long history with ARM and has scored multiple products with them from the biggest OEMs. Were they unsuccessful? Arguably mostly yes, but usually not because of a lack of technical merit.
Oh, come on. I have a "REALLY" in my first post with ALL the letters capitalized. I mean, you missed that? There is nothing to bite here. Just READ what I wrote. Not parts of what I wrote, but ALL the words.
I also explained in that first post what I meant by "sooner". It's there. Just READ it. Don't avoid parts of my posts just to create nonexistent excuses and give a different meaning to what I wrote. You misread what I write, and then accuse me of what you misunderstood, or chose to give a different explanation, an explanation of your own. You think I don't know Nvidia's history, or that I haven't seen their first efforts on the ARM platform? But they just gave up before doing something REALLY serious in the retail market. I think their excuse was Qualcomm's anticompetitive tactics back then. Give me a break, my first Nvidia product was a GeForce 2 MX.
The products you show are 10 years old. I haven't seen what Nvidia was doing in the Chromebook market, to be honest, but for the last 5-10 years they were mostly making specialized boards and nothing else, with the exception of Shield and Switch. Do you understand that "REALLY" in my first post, or are you determined to NOT understand what I am saying even from that first post? When we have so many companies succeeding in the ARM market, companies (much) smaller than Nvidia, and knowing Nvidia's potential, it's easy to assume that they just lost interest and only now are coming back. I hope not just to support their server vision, but for more.
As for Nvidia's long history, did I say they just got a license? What? I am shifting the goalposts again? No. You just DON'T READ what I write.
Posted on Reply
#32
Vayra86
john_I think this was on the mind of everyone who knew that Intel refused to give a license to Nvidia.

Nvidia made a mistake by not REALLY concentrating on ARM sooner and producing products like Grace long ago. Not these kinds of huge server processors from the beginning, maybe, but SOCs for laptops and desktops, or if not desktops, at least mini PCs, running Windows on ARM, or Linux, or Android, or all of them. Qualcomm is a sleeping, boring failure in that area.

They'll probably start accelerating on the ARM platform now. They lost time waiting to see if they could first gain absolute control of ARM. No one wanted them to, so it's good to see that their pride and arrogance - which is part of their business mentality; sometimes it helps them, most times it doesn't - is not becoming an obstacle to their plans to start developing CPUs as well.

As much as Intel needs GPUs for its future, Nvidia needs CPUs for its future just the same. We all saw what happened to Nvidia's financials this quarter, because they only stand on one foot: GPUs. Hit that foot and the whole company trembles.
They did focus early though. Tegra

first dual-core SoC in a phone, even
People read your posts, but quite often something is missing from your version of history

en.m.wikipedia.org/wiki/LG_Optimus_2X

They also pushed on the GPU angle, but (graphics-intensive) gaming on phones never really took off;
developer.nvidia.com/embedded/buy/tegra-k1-processor

And this was their niche, the unique selling point Nvidia had and now expands on with Grace as well: GPU acceleration.

Overall I don't think Nvidia can be blamed for lack of trying to get into this market... the ultimate push cost them dearly;

nvidianews.nvidia.com/news/nvidia-to-acquire-arm-for-40-billion-creating-worlds-premier-computing-company-for-the-age-of-ai
Posted on Reply
#33
john_
Vayra86They did focus early though. Tegra

first dual-core SoC in a phone, even
People read your posts, but quite often something is missing from your version of history

en.m.wikipedia.org/wiki/LG_Optimus_2X

They also pushed on the GPU angle, but (graphics-intensive) gaming on phones never really took off;
developer.nvidia.com/embedded/buy/tegra-k1-processor

And this was their niche, the unique selling point Nvidia had and now expands on with Grace as well: GPU acceleration.

Overall I don't think Nvidia can be blamed for lack of trying to get into this market... the ultimate push cost them dearly;

nvidianews.nvidia.com/news/nvidia-to-acquire-arm-for-40-billion-creating-worlds-premier-computing-company-for-the-age-of-ai
Or maybe I was expecting more from Nvidia than others were. Others look at Wikipedia, see Nvidia's non-stop support of the ARM platform, and say "There, Nvidia never stopped innovating on the ARM platform". But that's not the case. As seen from the products listed above, 10 years ago Nvidia was trying to get into smartphones and tablets, maybe some ultra-portables and netbooks too. But those efforts just stopped, meaning Nvidia took steps backwards from there. Maybe it was Qualcomm's tactics, maybe it was the low profit margins, maybe a combination. Project Denver looked promising back then. Then nothing. I mean, Nvidia did use it in some specialized products, it even got into Google's Nexus, but not much after that. We hear about a SOC from Mediatek or Qualcomm, for example, and 6-12 months later we get 10-20-50 devices using it. That never happened with Nvidia, at least not for many years, because Nvidia stopped targeting smartphones, tablets, laptops etc.
Gaming on phones doesn't really need graphics. At least most games on smartphones don't. They are very simple graphically. I believe 3D-intensive gaming on smartphones is a minority, but I could be wrong here.
I think Nvidia wasn't really investing in the ARM platform because it could not control the platform and what direction it would go. That's why they tried to buy ARM. They had a license, they even had a license to build custom cores, but they probably also had reasons not to invest heavily in the platform. If they had bought ARM, no Qualcomm could sabotage their negotiations with big smartphone companies, and they could also turn the ship more in the server direction than the smartphone market. They couldn't buy ARM, but they can't do anything about that either. They need CPUs for their server aspirations at least.
Posted on Reply
#34
Vayra86
john_Or maybe I was expecting more from Nvidia than others were. Others look at Wikipedia, see Nvidia's non-stop support of the ARM platform, and say "There, Nvidia never stopped innovating on the ARM platform". But that's not the case. As seen from the products listed above, 10 years ago Nvidia was trying to get into smartphones and tablets, maybe some ultra-portables and netbooks too. But those efforts just stopped, meaning Nvidia took steps backwards from there. Maybe it was Qualcomm's tactics, maybe it was the low profit margins, maybe a combination. Project Denver looked promising back then. Then nothing. I mean, Nvidia did use it in some specialized products, it even got into Google's Nexus, but not much after that. We hear about a SOC from Mediatek or Qualcomm, for example, and 6-12 months later we get 10-20-50 devices using it. That never happened with Nvidia, at least not for many years, because Nvidia stopped targeting smartphones, tablets, laptops etc.
Gaming on phones doesn't really need graphics. At least most games on smartphones don't. They are very simple graphically. I believe 3D-intensive gaming on smartphones is a minority, but I could be wrong here.
I think Nvidia wasn't really investing in the ARM platform because it could not control the platform and what direction it would go. That's why they tried to buy ARM. They had a license, they even had a license to build custom cores, but they probably also had reasons not to invest heavily in the platform. If they had bought ARM, no Qualcomm could sabotage their negotiations with big smartphone companies, and they could also turn the ship more in the server direction than the smartphone market. They couldn't buy ARM, but they can't do anything about that either. They need CPUs for their server aspirations at least.
They simply didn't have design wins. Cortex was always more diverse. Tegra was power-hungry. Cortex had a core in the stack for every segment, from budget to flagship phone.

Nvidia never got to an efficient enough design to compete. They sold their SoC to LG, and their dual-core phone was fast but not efficient at that (I owned one..). Tegra K1 similarly guzzles power. And Tegra STILL shines in one device: Shield TV. Why? It doesn't run on a battery ;) And it's still arguably the best Android TV device you can hook onto a TV.

The ARM market is open. At the server end, many-core CPUs like ThunderX trumped their designs for servers too.

You can spin it however you feel, but the reality is, Nvidia certainly did try; their advantage in tech just couldn't get translated into something meaningful while competitors already had some iterations and refinements under the hood. The whole reason Grace is the subject now, again, is because it helps their GPU division. Not because they built the most efficient CPU. They have shareholders too; these adventures can't last forever.

You even answered the issue yourself by stating smartphones don't really need graphics in a big way. It's not entirely realistic to then expect them to push further on stuff they are behind on anyway...

As for the olde idea of Nvidia wanting market control and not just share... you might be right about that. But the company really did try the fair-play approach: let the design/tech speak for itself. It's all Nvidia really does and did at its core. Their GPU leadership is not the result of market control. It's the result of consecutive design wins. An approach they tried just the same with Tegra.
Posted on Reply
#35
bug
AssimilatorOf course they had a business case, it's called making money. Even though Arm CPUs are in everything and the kitchen sink nowadays, I'd wager the x86 market cap still exceeds every other CPU architecture type combined.
That's a very simplistic view. Wishing does not make it so; I gave examples of companies that tried to build x86 CPUs and failed. And even if you can make money, there's also the opportunity cost. My feeling is Nvidia simply chose to invest in compute, which is a higher-margin market. Now that they have a big chunk of that market, they can afford to design custom CPUs. But they only do that to support their existing business.

On another note, cracking into the x86 CPU business is probably impossible today. AMD is very lucky to have x86_64 in their backyard, which forces Intel to cross-license the rest of the instruction set. A third player would have to get a license from both Intel and AMD, and would probably never be able to compete on price because of that.
Posted on Reply
#36
Minus Infinity
I'd like to see Nvidia make ARM-based desktop CPUs to rival Apple's Mx series. If Apple weren't so arrogant and self-absorbed, they could make a ton of money licensing their chips to anyone. Just like Sony's sensor division, as opposed to their camera division: they will sell sensors to fierce competitors, because they're in the business of making money for Sony, and boy do they make a lot more money than the camera division. So given Apple's arrogance, a third player in the desktop CPU market with the might of Nvidia would be good IMO. Intel really isn't going to have an answer to Apple until Arrow Lake, and I'll bet it will still be way behind on efficiency even if they deliver the forecast improvements in that area.

Still, come 2025-27, I think we are in for a treat. AMD is not ignoring Apple either.
Posted on Reply
#37
john_
Vayra86They simply didn't have design wins. Cortex was always more diverse. Tegra was power-hungry. Cortex had a core in the stack for every segment, from budget to flagship phone.

Nvidia never got to an efficient enough design to compete. They sold their SoC to LG, and their dual-core phone was fast but not efficient at that (I owned one..). Tegra K1 similarly guzzles power. And Tegra STILL shines in one device: Shield TV. Why? It doesn't run on a battery ;) And it's still arguably the best Android TV device you can hook onto a TV.

The ARM market is open. At the server end, many-core CPUs like ThunderX trumped their designs for servers too.

You can spin it however you feel, but the reality is, Nvidia certainly did try; their advantage in tech just couldn't get translated into something meaningful while competitors already had some iterations and refinements under the hood. The whole reason Grace is the subject now, again, is because it helps their GPU division. Not because they built the most efficient CPU. They have shareholders too; these adventures can't last forever.

You even answered the issue yourself by stating smartphones don't really need graphics in a big way. It's not entirely realistic to then expect them to push further on stuff they are behind on anyway...

As for the olde idea of Nvidia wanting market control and not just share... you might be right about that. But the company really did try the fair-play approach: let the design/tech speak for itself. It's all Nvidia really does and did at its core. Their GPU leadership is not the result of market control. It's the result of consecutive design wins. An approach they tried just the same with Tegra.
Well, they had to start from somewhere. Tegra having bad efficiency was known, and the companion core in Tegra 3's design was an effort to lower power consumption. But Qualcomm, and probably profit margins, made them not take that market seriously. They could have invested in building ARM SOCs. They could do what Mediatek is doing: offer a cheaper alternative to the market. Or do what AMD is trying to do with Samsung: integrate their graphics tech into an ARM SOC and try to make it as efficient as possible. They could have had a thriving business in the ARM SOC market by now, with Nvidia's graphics being an alternative to Adreno and Mali, or the top option, or one of a number of options for high-end or mid-range phones. But I am not sure they see this even now. Grace is for servers. Are they going to do something in the other areas where ARM is the main option, or just stay in servers, cloud, and the auto business?
Nvidia was always trying to gain market control. That's what forces them to innovate and makes them look like they are one or two steps ahead of the competition. CUDA, PhysX (they bought the tech, but who else had a chance to promote it more strongly?), G-Sync, DLSS, RayTracing (it's not theirs, but same as with PhysX). Fair play and Nvidia don't usually go together, and when I was reading Nvidia's protests about Qualcomm's anticompetitive tactics in the SOC market 10 or so years ago, I was laughing. Nvidia wasn't lying. Qualcomm had to face fines from the courts, if I remember correctly, and agree to play nice in the future.
Minus InfinityI'd like to see Nvidia make ARM-based desktop CPUs to rival Apple's Mx series. If Apple weren't so arrogant and self-absorbed, they could make a ton of money licensing their chips to anyone. Just like Sony's sensor division, as opposed to their camera division: they will sell sensors to fierce competitors, because they're in the business of making money for Sony, and boy do they make a lot more money than the camera division. So given Apple's arrogance, a third player in the desktop CPU market with the might of Nvidia would be good IMO. Intel really isn't going to have an answer to Apple until Arrow Lake, and I'll bet it will still be way behind on efficiency even if they deliver the forecast improvements in that area.

Still, come 2025-27, I think we are in for a treat. AMD is not ignoring Apple either.
Nvidia is as arrogant as Apple. But they don't have the huge customer base Apple does. Apple sells devices; Nvidia sells parts, not whole devices. It's not easy to take loyal customers who buy your parts and convince them to buy whole devices just because of your logo on them. But Nvidia could start offering ARM SOCs for smartphones, laptops, and (mini) desktop PCs, and build reference all-Nvidia-hardware devices for others to build on.
Apple will never sell its stuff to others, because if everyone could build iPhones and Macs, those devices could lose their premium image. Many people buy iPhones and Macs with the same mentality they buy jewelry.
Posted on Reply
#38
medi01
So the "news" is essentially a hidden team green advertisement.

Because honestly, saying that yet another ARM clone has been rolled out today, this time by NV, is not flashy enough.
john_Nvidia always had better vision than AMD, it was more ambitious
Ah, so that is why both Microsoft and Apple told it to go have solo kamasutra: "better vision".
Posted on Reply
#39
bug
Minus InfinityI'd like to see Nvidia make ARM-based desktop CPUs to rival Apple's Mx series. If Apple weren't so arrogant and self-absorbed, they could make a ton of money licensing their chips to anyone. Just like Sony's sensor division, as opposed to their camera division: they will sell sensors to fierce competitors, because they're in the business of making money for Sony, and boy do they make a lot more money than the camera division. So given Apple's arrogance, a third player in the desktop CPU market with the might of Nvidia would be good IMO. Intel really isn't going to have an answer to Apple until Arrow Lake, and I'll bet it will still be way behind on efficiency even if they deliver the forecast improvements in that area.

Still, come 2025-27, I think we are in for a treat. AMD is not ignoring Apple either.
That's not Apple's business model. Apple is all about (apparent) scarcity to justify higher mark-ups and keeping the number of SKUs in the wild low, to cut back on support expenses. Offering their CPU to third parties would fly in the face of all that.
Posted on Reply
#40
medi01
Minus InfinityI'd like to see Nvidia make ARM-based desktop CPUs to rival Apple's Mx series.
Where is the market for that?

Apple has its people paywalled into everything Apple anyway.

Why would the rest of the world bother?
Posted on Reply
#41
john_
medi01Ah, so that is why both Microsoft and Apple told it to go have solo kamasutra: "better vision".
Nvidia took a dumb graphics chip and transformed it into the most important compute chip of today. They pushed Physics (in the wrong way, but at least made gamers focus on the importance of it), built CUDA, introduced GSync, DLSS, pushed RayTracing into the market making it the de facto feature more gamers are expecting from their new hardware, and probably so many things I forget now or don't know about. AMD on the other hand has only Mantle to show. Other techs from AMD are reactions to Nvidia. Versions of Nvidia's visions as they should be brought to the market to be consumer-friendly, like FreeSync and FSR.
In the case of consoles, Nvidia either didn't have the hardware, a SOC that would combine both a CPU and a GPU and be competitively priced compared to AMD's APUs, or the low profit margins made them ignore that market (and I thank them for that choice, if that was the case, because otherwise I could have been forced to buy quad-core Intels for the rest of my life).
Posted on Reply
#42
medi01
john_Nvidia took a dumb graphics chip and transformed it into the most important compute chip of today.
The only truth to it is that number-crunching GPUs are indeed very dumb.

The need for massively parallel processing has exactly 0 to do with NV.
john_They pushed Physics (in the wrong way,
That is one way to refer to "they bought PhysX".
john_built CUDA
Proprietary crap.
john_introduced GSync
That is dead, and good riddance.
john_DLSS
They used that thing to milk more money off the customers. The tech itself will likely follow the path of GSync.
john_pushed RayTracing into the market
And at the moment, the most noticeable hint that "RT is on" is the massive drop in framerates. At the same time, this thing does not use NV's RT:

john_Other techs from AMD are reactions to Nvidia
That's a lovely myth to push in 2022.
Posted on Reply
#43
bug
john_Nvidia took a dumb graphics chip and transformed it into the most important compute chip of today. They pushed Physics (in the wrong way, but at least made gamers focus on the importance of it), built CUDA, introduced GSync, DLSS, pushed RayTracing into the market making it the de facto feature more gamers are expecting from their new hardware, and probably so many things I forget now or don't know about. AMD on the other hand has only Mantle to show. Other techs from AMD are reactions to Nvidia. Versions of Nvidia's visions as they should be brought to the market to be consumer-friendly, like FreeSync and FSR.
In the case of consoles, Nvidia either didn't have the hardware, a SOC that would combine both a CPU and a GPU and be competitively priced compared to AMD's APUs, or the low profit margins made them ignore that market (and I thank them for that choice, if that was the case, because otherwise I could have been forced to buy quad-core Intels for the rest of my life).
To be fair, AMD/ATI had TruForm years before tessellation was a thing. I don't know whether that's good or bad, since they weren't able to get any traction for it.

If memory serves me well, Nvidia pioneered programmable shaders.
Posted on Reply
#44
john_
bugTo be fair, AMD/ATI had TruForm years before tessellation was a thing. I don't know whether that's good or bad, since they weren't able to get any traction for it.

If memory serves me well, Nvidia pioneered programmable shaders.
Can't say I remember or know everything from that era, but Nvidia was always better at promoting something. AMD was usually failing at promoting anything. When they bought Ageia I was truly happy, thinking that only they could push hardware physics in games. Then they locked it down and tried to use it to gain an unfair advantage over the competition. Nvidia is a company to both cheer and curse at the same time.
Posted on Reply
#45
Assimilator
john_I am not. I am talking about mass production targeting markets with a huge user base. Other than Switch, which probably just happened, Nvidia was using ARM for very specific applications.
I haven't missed anything.

All in on ARM? Really? Tell me some products that were made for mass production and availability to the general public. Except the obvious mention of Switch, what else is there? Shield tablet?
The reason that NVIDIA never has been and never will be a mass-market seller of Arm CPUs is alluded to by my previous comment:
Even though Arm CPUs are in everything and the kitchen sink nowadays, I'd wager the x86 market cap still exceeds every other CPU architecture type combined.
For Arm, the profit margins simply aren't there unless you're putting your CPUs in everything, or building highly specialised CPUs for highly specialised niches. The latter is where NVIDIA has, rightly, focused their attention by augmenting their primary area of expertise (graphics) with CPUs that can help feed those graphics. They don't want to be a CPU company because CPUs are ancillary to their core focus.

Which once again brings us back to their attempted acquisition of Arm; I still struggle to see the reasoning behind it. The argument that it was to integrate NVIDIA graphics into Arm CPUs doesn't wash because NVIDIA's focus has always been high-performance high-power graphics, not low-end low-power ones as found in typical Arm applications, so they would essentially have to build an entirely new product. The thing is though, that doesn't require them to buy Arm; if NVIDIA already has a low-power GPU capable of competing with what's typically found in smartphones, there's absolutely nothing stopping them from just licensing or selling it as a standalone product.

The cynical take is that it's simply so NVIDIA could increase Arm licensing fees and reap the profits, but I really don't see that panning out well for them; it would almost certainly have pushed a lot of Arm licensees towards the royalty-free RISC-V, which makes it a self-defeating proposition.
Posted on Reply
#46
bug
john_Can't say I remember or know everything from that era, but Nvidia was always better at promoting something. AMD was usually failing at promoting anything. When they bought Ageia I was truly happy, thinking that only they could push hardware physics in games. Then they locked it down and tried to use it to gain an unfair advantage over the competition. Nvidia is a company to both cheer and curse at the same time.
Well, it's really more nuanced than that. People (myself included) prefer open approaches.
But besides the fact that developing something new in the open takes a lot more time, look at the compute situation. Closed approach: CUDA, which does its job just fine and is now nearly ubiquitous. Open approach: OpenCL, a spec everyone hates to use, thus almost no one uses it. It doesn't always pan out like that, but what I'm trying to say is that going the open route is not an automatic win, and I will not fault anyone for going the closed route.
Posted on Reply
#47
Dr. Dro
TechLurkerThis is probably why NVIDIA wanted ARM so badly; they could have made a stronger takeover of the general ARM space with biased deals pushing their designs over other competitors'. Instead, they now have to innovate to compete alongside other ARM licensees. As an aside, their desire to homogenize their HPC systems reminds me of Apple and their walled garden; looking to also replace Intel in their systems (if they haven't already), although they still use Radeon GPUs here and there.

That said, it looks like the competition will really be between NVIDIA and AMD. AMD added Xilinx to their portfolio, also own an ARM license, and are jointly working with Samsung to integrate RDNA with elements of ARM (via Exynos), which would help them combat NVIDIA across all platforms too. This is assuming NVIDIA also ports elements of this CPU down into their next-gen gaming tablets (and the next-gen Switch, assuming Nintendo sticks with NVIDIA), and even some gaming laptops running either Steam OS or Windows on ARM.

Meanwhile Intel, despite all their recent acquisitions, haven't really gotten anything to show for it, aside from Foveros, and it'll be a while longer before their own compute and gaming GPUs can prove reliable enough in the high-value markets. Kind of wild to see such a dramatic shift over the last 5 years.
However, you forgot the best part of it: Intel and its advanced foundry services, now available to third-party customers, might as well be fabbing both aforementioned companies' products themselves! ;)

A Ryzen on Intel 20A? I want to see it, to be honest with you.
Posted on Reply
#48
john_
AssimilatorThe reason that NVIDIA never has been and never will be a mass-market seller of Arm CPUs is alluded to by my previous comment:



For Arm, the profit margins simply aren't there unless you're putting your CPUs in everything, or building highly specialised CPUs for highly specialised niches. The latter is where NVIDIA has, rightly, focused their attention by augmenting their primary area of expertise (graphics) with CPUs that can help feed those graphics. They don't want to be a CPU company because CPUs are ancillary to their core focus.

Which once again brings us back to their attempted acquisition of Arm; I still struggle to see the reasoning behind it. The argument that it was to integrate NVIDIA graphics into Arm CPUs doesn't wash because NVIDIA's focus has always been high-performance high-power graphics, not low-end low-power ones as found in typical Arm applications, so they would essentially have to build an entirely new product. The thing is though, that doesn't require them to buy Arm; if NVIDIA already has a low-power GPU capable of competing with what's typically found in smartphones, there's absolutely nothing stopping them from just licensing or selling it as a standalone product.

The cynical take is that it's simply so NVIDIA could increase Arm licensing fees and reap the profits, but I really don't see that panning out well for them; it would almost certainly have pushed a lot of Arm licensees towards the royalty-free RISC-V, which makes it a self-defeating proposition.
I agree with you on most of what you write. I don't think I wrote anything different from that.

I think Nvidia having a... dramatic past was one of the reasons for them trying to buy ARM. Intel refusing to give them a license for its then-new CPUs drove Nvidia's chipset department to an end. Qualcomm's anticompetitive tactics probably played a major role in Nvidia not pushing into the smartphone market. They probably thought that, in the case of ARM, there could be another company trying to buy them today or in the future, and that they should move first. Think of Apple buying ARM. They have the money, the power and the customer base to just terminate all licenses and say to the other companies "Go and try your luck with RISC-V, we just don't care!". A simplified thought, that last one, but who knows? I think Nvidia didn't want to invest heavily in something they couldn't control. Also, buying ARM, as I said, could have given them not just an advantage over the competition by keeping the best designs for themselves, but also the power to choose where the ARM platform would focus. Now they get what ARM is making and then change it to suit their needs. With control over ARM, they would create cores for their own needs and then let their licensees spend the time and money to make those cores suit theirs.
Anyway, just speculating here.
Posted on Reply
#49
Minus Infinity
bugThat's not Apple's business model. Apple is all about (apparent) scarcity to justify higher mark-ups and keeping the number of SKUs in the wild low, to cut back on support expenses. Offering their CPU to third parties would fly in the face of all that.
I know it's not their business model, but it's stupid IMO, and more fool them.
Posted on Reply
#50
Count von Schwalbe
AssimilatorFor Arm, the profit margins simply aren't there unless you're putting your CPUs in everything, or building highly specialised CPUs for highly specialised niches. The latter is where NVIDIA has, rightly, focused their attention by augmenting their primary area of expertise (graphics) with CPUs that can help feed those graphics. They don't want to be a CPU company because CPUs are ancillary to their core focus.

Which once again brings us back to their attempted acquisition of Arm; I still struggle to see the reasoning behind it. The argument that it was to integrate NVIDIA graphics into Arm CPUs doesn't wash because NVIDIA's focus has always been high-performance high-power graphics, not low-end low-power ones as found in typical Arm applications, so they would essentially have to build an entirely new product. The thing is though, that doesn't require them to buy Arm; if NVIDIA already has a low-power GPU capable of competing with what's typically found in smartphones, there's absolutely nothing stopping them from just licensing or selling it as a standalone product.

The cynical take is that it's simply so NVIDIA could increase Arm licensing fees and reap the profits, but I really don't see that panning out well for them; it would almost certainly have pushed a lot of Arm licensees towards the royalty-free RISC-V, which makes it a self-defeating proposition.
And Intel is a CPU company. That did not stop them from trying to build a graphics card division. Twice.
Posted on Reply