Thursday, August 24th 2023

Strong Cloud AI Server Demand Propels NVIDIA's FY2Q24 Data Center Business to Surpass 76% for the First Time

NVIDIA's latest financial report for FY2Q24 reveals that its data center business reached US$10.32 billion—QoQ growth of 141% and a YoY increase of 171%. The company remains optimistic about future growth. TrendForce believes the primary driver behind NVIDIA's robust revenue growth is its data center AI server-related solutions. Key products include AI-accelerating GPUs and the HGX AI server reference architecture, which serve as the foundational AI infrastructure for large data centers.

TrendForce further anticipates that NVIDIA will integrate its software and hardware resources. Taking a tiered approach, NVIDIA will align its high-end, mid-tier, and entry-level GPU AI accelerators with various ODMs and OEMs, establishing a collaborative system-certification model. Beyond accelerating the deployment of CSP cloud AI server infrastructure, NVIDIA is also partnering with companies like VMware on solutions such as the Private AI Foundation. This strategy extends NVIDIA's reach into the edge enterprise AI server market, underpinning steady growth in its data center business for the next two years.

NVIDIA's data center business surpasses 76% market share due to strong demand for cloud AI
In recent years, NVIDIA has been actively expanding its data center business. In FY4Q22, data center revenue accounted for approximately 42.7%, trailing its gaming segment by about 2 percentage points. However, by FY1Q23, the data center business surpassed gaming, accounting for over 45% of revenue. Starting in 2023, with major CSPs heavily investing in chatbots and various AI services for public cloud infrastructure, NVIDIA reaped significant benefits. By FY2Q24, data center revenue share skyrocketed to over 76%.

NVIDIA targets both Cloud and Edge Data Center AI markets
TrendForce observes and forecasts a shift in NVIDIA's approach to high-end GPU products in 2H23. While the company has primarily focused on top-tier AI servers equipped with the A100 and H100, given positive market demand, NVIDIA is likely to prioritize the higher-priced H100 to effectively boost its data-center-related revenue growth.

NVIDIA is currently positioning the L40S as its flagship mid-tier GPU, a move with several strategic implications. First, the high-end H100 series is constrained by the limited production capacity of current CoWoS packaging and HBM supply. In contrast, the L40S primarily uses GDDR memory; since it does not require CoWoS packaging, it can be brought to the mid-tier AI server market quickly, filling the gap left by the A100 PCIe interface in meeting the needs of enterprise customers.

Second, the L40S targets enterprise customers who don't require large-parameter models like ChatGPT, focusing instead on more compact AI training applications in specialized fields, with parameter counts ranging from tens of billions to under a hundred billion. It can also handle edge AI inference and image analysis tasks. Additionally, in light of potential geopolitical issues that might disrupt the supply of high-end H-series GPUs to Chinese customers, the L40S can serve as an alternative. For entry-level GPUs, NVIDIA highlights the L4 and T4 series, designed for real-time AI inference and image analysis in edge AI servers; these GPUs emphasize affordability while maintaining a high cost-performance ratio.

HGX and MGX AI server reference architectures are set to be NVIDIA's main weapons for AI solutions in 2H23
TrendForce notes that NVIDIA has recently not only refined the positioning of its core AI GPUs but has also actively promoted its HGX and MGX solutions. Although this approach isn't new in the server industry, NVIDIA has the opportunity to solidify its leading position with this strategy. The key is NVIDIA's absolute leadership, stemming from the extensive integration of its GPUs with the CUDA platform into a comprehensive AI ecosystem. As a result, NVIDIA has considerable negotiating power with existing server supply chains. Consequently, ODMs like Inventec, Quanta, FII, Wistron, and Wiwynn, as well as brands such as Dell, Supermicro, and Gigabyte, are encouraged to follow NVIDIA's HGX or MGX reference designs, though they must undergo NVIDIA's hardware and software certification process for these AI server reference architectures. Leveraging this, NVIDIA can bundle and offer integrated solutions such as its Arm-based Grace CPU, NPUs, and AI Cloud Foundation.

It's worth noting that for ODMs and OEMs, NVIDIA's expected gains in the CSP AI server market from 2023 to 2024 will likely boost overall AI server shipment volumes and revenue growth. However, with NVIDIA's strategic introduction of standardized AI server architectures like HGX and MGX, the core product architecture for AI servers among ODMs and others will become more homogenized, intensifying competition as they vie for orders from CSPs. Furthermore, large CSPs such as Google and AWS are leaning toward adopting in-house ASIC AI accelerator chips in the future, posing a potential threat to a portion of NVIDIA's GPU market. This is likely one of the reasons NVIDIA continues to roll out GPUs with varied positioning and comprehensive solutions, aiming to expand its AI business aggressively into Tier-2 data centers (like CoreWeave) and edge enterprise clients.
Source: TrendForce

18 Comments on Strong Cloud AI Server Demand Propels NVIDIA's FY2Q24 Data Center Business to Surpass 76% for the First Time

#1
bug
I've been saying one of the reasons AMD can't put pressure on Nvidia is that AMD only competes in gaming, which isn't Nvidia's main concern (and now source of revenue) anymore. People said I was wrong. And in a way, I wish I was.
Posted on Reply
#2
fancucker
bugI've been saying one of the reasons AMD can't put pressure on Nvidia is that AMD only competes in gaming, which isn't Nvidia's main concern (and now source of revenue) anymore. People said I was wrong. And in a way, I wish I was.
Nvidia has a better ecosystem, more robust and widely adopted frameworks (CUDA, cuDNN, TensorRT). They can afford to invest heavily in the segment and they're building brilliantly on their early development for AI. AMD has begun shifting priority to it, and the previous rumor about small Navi 4x dies suggests so (unless it's a multi-GCD arrangement)
Posted on Reply
#3
bug
fancuckerNvidia has a better ecosystem, more robust and widely adopted frameworks (CUDA, cuDNN, TensorRT). They can afford to invest heavily in the segment and they're building brilliantly on their early development for AI. AMD has begun shifting priority to it, and the previous rumor about small Navi 4x dies suggests so (unless it's a multi-GCD arrangement)
Well, they weren't born with that ecosystem. They had the foresight to build it and support it properly. Meanwhile, AMD was able to catch up in rastering performance and get close on perf/W. The difference between leading and hanging in there.
Posted on Reply
#4
Jism
bugI've been saying one of the reasons AMD can't put pressure on Nvidia is that AMD only competes in gaming, which isn't Nvidia's main concern (and now source of revenue) anymore. People said I was wrong. And in a way, I wish I was.
The f.

There are so many supercomputers assembled these days with AMD Epyc CPUs, MI300s, and more. AMD's compute division does better and better each year. CDNA is excellent. It's just adoption that will take some years.
Posted on Reply
#5
bug
JismThe f.

There are so many supercomputers assembled these days with AMD Epyc CPUs, MI300s, and more. AMD's compute division does better and better each year. CDNA is excellent. It's just adoption that will take some years.
Did you just miss the whole article? AMD is a drop in an ocean.
Posted on Reply
#6
cvaldes
bugI've been saying one of the reasons AMD can't put pressure on Nvidia is that AMD only competes in gaming, which isn't Nvidia's main concern (and now source of revenue) anymore. People said I was wrong. And in a way, I wish I was.
Actually, Nvidia's data center business overtook their gaming business a while ago but many people here chose to stick their heads in the sand and pretend it didn't happen. And even when presented with Nvidia's own CAGR numbers and outlook, they still stubbornly clung to the idea that consumer gaming was the most important thing for Nvidia.

They were wrong.
JismThe f.

There are so many supercomputers assembled these days with AMD Epyc CPUs, MI300s, and more. AMD's compute division does better and better each year. CDNA is excellent. It's just adoption that will take some years.
Supercomputing isn't just one discipline. Supercomputers are typically designed, assembled, and deployed for a narrow range of applications.

Remember that CPU cores aren't suited for all workloads. Heck, the crypto craze caused high demand for GPUs because GPU cores are more suited for crypto mining than CPU cores. No one has mined bitcoin on CPUs for perhaps ten years; almost all bitcoin mining is done on custom ASICs.

Hell, 3D rasterization on CPU cores is very inefficient. That's why you have that GTX 1060 in your system. Hell, Windows doesn't even support a CPU-only graphics pipeline anymore because even the wimpiest integrated graphics processor is far better than a CPU at generating graphics.

AMD's compute division does better every year. But Nvidia's growth is outpacing AMD's growth in data center. So if AMD goes from 5 to 7 and Nvidia goes from 10 to 27, that results in the latter's market dominance despite AMD's growth.
Posted on Reply
#7
bug
cvaldesActually, Nvidia's data center business overtook their gaming business a while ago but many people here chose to stick their heads in the sand and pretend it didn't happen. And even when presented with Nvidia's own CAGR numbers and outlook, they still stubbornly clung to the idea that consumer gaming was the most important thing for Nvidia.

They were wrong.
I mean, gaming is still important to them, otherwise they wouldn't bother with RT, DLSS & friends. At the same time, dropping the whole business if they can't get the desired margins is also on the table now. They did it before when they handed AMD consoles on a platter (lucky for us, otherwise there wouldn't be an AMD to speak of today).
Posted on Reply
#8
cvaldes
bugI mean, gaming is still important to them, otherwise they wouldn't bother with RT, DLSS & friends. At the same time, dropping the whole business if they can't get the desired margins is also on the table now. They did it before when they handed AMD consoles on a platter (lucky for us, otherwise there wouldn't be an AMD to speak of today).
Of course gaming is still important to Nvidia. However there are varying degrees of importance and they are subject to change.

My guess is someday, something else will overtake machine learning. It won't happen tomorrow, next week, or even next year. But it is bound to happen, and if Nvidia is alert, they will adjust to the changing conditions. If they didn't adapt, it is doubtful they would be in business today.

And people in this discussion forum need to remember that whatever Nvidia and Intel do doesn't prevent AMD from assigning some engineers to research machine learning technologies in their R&D labs.

It's up to all businesses to adapt to the changing environment and customers' wishes. If you had a limited amount of flour and oven space and people will pay way more for your pie than your cookies, well, you'd make pie. If a farmer gets more from ambrosia cantaloupe than watermelon, he's going to plant cantaloupe instead. In the summer make lemonade, in the winter make hot chocolate.

This is not rocket science people, these are basic business principles on how to run a company that will last. This is nothing new, it goes back to the first time humans started producing things.
Posted on Reply
#9
AnotherReader
bugI mean, gaming is still important to them, otherwise they wouldn't bother with RT, DLSS & friends. At the same time, dropping the whole business if they can't get the desired margins is also on the table now. They did it before when they handed AMD consoles on a platter (lucky for us, otherwise there wouldn't be an AMD to speak of today).
They didn't hand consoles to AMD on a platter. Rather, Nvidia tried to strongarm Microsoft which is still a much bigger company than Nvidia, and after that, they didn't have a snowball's chance in hell of getting into any Xbox consoles. Recall that Apple also dropped Nvidia after bumpgate.
Posted on Reply
#10
wolf
Better Than Native
bugI mean, gaming is still important to them, otherwise they wouldn't bother with RT, DLSS & friends. At the same time, dropping the whole business if they can't get the desired margins is also on the table now. They did it before when they handed AMD consoles on a platter (lucky for us, otherwise there wouldn't be an AMD to speak of today).
I almost think of it like Heisenberg in Breaking Bad, at a point in time it isn't even about the money, it's about the empire, being recognised as the best, the name in the business for graphics cards. I think that's important to the CEO personally, perhaps more so than margins and profit, it's personal to him to have the mindshare and empire and Nvidia be the best in the gaming market.

I could be totally off base and they pull the pin on the gaming segment all together, but I suspect that yeah, it's about more than just margins and shareholders, there's a heap of value in perceptions, and I think there's ego at stake there too.
Posted on Reply
#11
not_my_real_name
Market share?
TheLostSwedeNVIDIA's data center business surpasses 76% market share due to strong demand for cloud AI
Is it proper wording? Because there is not a word about market share in the paragraph. I'm happy for the company and for the industry, but this line is not very nice if it's a press release.

I hope this is only the beginning and not the end of a chapter in the history of AI. And the rest of the market parties will catch up in time.
Posted on Reply
#12
TheLostSwede
News Editor
not_my_real_nameMarket share?

Is it proper wording? Because there is not a word about market share in the paragraph. I'm happy for the company and for the industry, but this line is not very nice if it's a press release.

I hope this is only the beginning and not the end of a chapter in the history of AI. And the rest of the market parties will catch up in time.
You'll have to ask TrendForce, who wrote the press release.
Posted on Reply
#13
bug
AnotherReaderThey didn't hand consoles to AMD on a platter. Rather, Nvidia tried to strongarm Microsoft which is still a much bigger company than Nvidia, and after that, they didn't have a snowball's chance in hell of getting into any Xbox consoles. Recall that Apple also dropped Nvidia after bumpgate.
That doesn't really contradict what I said: they didn't get the margins, they bailed.
The handing on a platter part is they could have sold chips to MS at a loss for a year or two, it would have bankrupted AMD at the time. They gave AMD a lifeline instead.
Posted on Reply
#14
AnotherReader
bugThat doesn't really contradict what I said: they didn't get the margins, they bailed.
The handing on a platter part is they could have sold chips to MS at a loss for a year or two, it would have bankrupted AMD at the time. They gave AMD a lifeline instead.
Nvidia of the early 2010s wasn't as dominant as they are now; AMD's market share was around 40% rather than below 20%. Their high-margin businesses were still in their infancy at that time. Their own actions, before and after 2013, belie your claims; they were happy to supply Microsoft and Sony in the past. The PlayStation 3's GPU was supplied by Nvidia. AMD also met the requirements: they could provide a proven SoC, whereas the alternatives, such as Nvidia's Tegra 4, had even lower CPU performance than AMD's cores and didn't even provide the capability to address more than 4 GB of RAM.

Their later actions also support my interpretation: they opted to supply Nintendo a few years later, and that's hardly likely to be a high-margin business. Moreover, Nvidia of 2010-2013 wasn't Intel; they didn't have any cause to fear an anti-trust case and would have crushed AMD if they could have. That they didn't succeed in winning the Sony and Microsoft consoles was down to two things: lackluster SoCs, and more importantly, bad relationships with both Microsoft and Sony.



Nvidia Tegra 4 results below: notice that it is slower than the A4-5000 despite a nearly 30% higher clock (1.9 vs 1.5 GHz)

Posted on Reply
#15
bug
AnotherReaderNvidia of the early 2010s wasn't as dominant as they are now; AMD's market share was around 40% rather than below 20%. Their high-margin businesses were still in their infancy at that time. Their own actions, before and after 2013, belie your claims; they were happy to supply Microsoft and Sony in the past. The PlayStation 3's GPU was supplied by Nvidia. AMD also met the requirements: they could provide a proven SoC, whereas the alternatives, such as Nvidia's Tegra 4, had even lower CPU performance than AMD's cores and didn't even provide the capability to address more than 4 GB of RAM.

Their later actions also support my interpretation: they opted to supply Nintendo a few years later, and that's hardly likely to be a high-margin business. Moreover, Nvidia of 2010-2013 wasn't Intel; they didn't have any cause to fear an anti-trust case and would have crushed AMD if they could have. That they didn't succeed in winning the Sony and Microsoft consoles was down to two things: lackluster SoCs, and more importantly, bad relationships with both Microsoft and Sony.



Nvidia Tegra 4 results below: notice that it is slower than the A4-5000 despite a nearly 30% higher clock (1.9 vs 1.5 GHz)

Two things:
1. If Nvidia couldn't compete, how come AMD didn't crush them in the SoC market?
2. Without AMD, Nvidia would have become a GPU monopoly, that's what they were avoiding.
Posted on Reply
#16
AnotherReader
bugTwo things:
1. If Nvidia couldn't compete, how come AMD didn't crush them in the SoC market?
2. Without AMD, Nvidia would have become a GPU monopoly, that's what they were avoiding.
There are easy explanations for both:
  1. AMD canned its low-power microarchitecture and opted to focus its limited resources on Zen, which was the right choice for them. For low-power scenarios targeting the GPU, Maxwell and its successors had a large lead over AMD, so they were the right choice for portables like the Switch. Despite all that, in the overall SoC market, Nvidia is a minor player.
  2. They could have pointed to the smartphone market, where they have always been a failure, and avoided antitrust.
Posted on Reply
#17
bug
AnotherReaderThere are easy explanations for both:
Easy explanations, but false.
AnotherReader
  1. AMD canned its low-power microarchitecture and opted to focus its limited resources on Zen, which was the right choice for them. For low-power scenarios targeting the GPU, Maxwell and its successors had a large lead over AMD, so they were the right choice for portables like the Switch. Despite all that, in the overall SoC market, Nvidia is a minor player.
They canned them? How are they still supplying Sony and MS? (And yes, all things considered Nvidia doesn't command a large part of the market either.)
AnotherReader
  2. They could have pointed to the smartphone market, where they have always been a failure, and avoided antitrust.
Yes, if hit with desktop GPU monopoly regulations, I'm sure pointing authorities to smartphones would have done them a lot of good :wtf:
Posted on Reply
#18
AnotherReader
bugEasy explanations, but false.

They canned them? How are they still supplying Sony and MS? (And yes, all things considered Nvidia doesn't command a large part of the market either.)

Yes, if hit with desktop GPU monopoly regulations, I'm sure pointing authorities to smartphones would have done them a lot of good :wtf:
The low power architecture was Bobcat and its successors. They haven't done any further development on that architecture in a long while; for R&D purposes, it's canned.

We are talking about 2013 not 2023. How would Nvidia have been investigated for anti-trust when AMD was still at 40% market share in discrete GPUs?
Posted on Reply