Tuesday, November 21st 2023

NVIDIA Announces Q3-FY24 Results; Earns 5 Times More Revenue from AI Chips than Gaming GPUs

NVIDIA (NASDAQ: NVDA) today reported revenue for the third quarter ended October 29, 2023, of $18.12 billion, up 206% from a year ago and up 34% from the previous quarter. GAAP earnings per diluted share for the quarter were $3.71, up more than 12x from a year ago and up 50% from the previous quarter. Non-GAAP earnings per diluted share were $4.02, up nearly 6x from a year ago and up 49% from the previous quarter.

"Our strong growth reflects the broad industry platform transition from general-purpose to accelerated computing and generative AI," said Jensen Huang, founder and CEO of NVIDIA. "Large language model startups, consumer internet companies and global cloud service providers were the first movers, and the next waves are starting to build. Nations and regional CSPs are investing in AI clouds to serve local demand, enterprise software companies are adding AI copilots and assistants to their platforms, and enterprises are creating custom AI to automate the world's largest industries.
"NVIDIA GPUs, CPUs, networking, AI foundry services and NVIDIA AI Enterprise software are all growth engines in full throttle. The era of generative AI is taking off," he said.

NVIDIA will pay its next quarterly cash dividend of $0.04 per share on December 28, 2023, to all shareholders of record on December 6, 2023.

Outlook
NVIDIA's outlook for the fourth quarter of fiscal 2024 is as follows:
  • Revenue is expected to be $20.00 billion, plus or minus 2%.
  • GAAP and non-GAAP gross margins are expected to be 74.5% and 75.5%, respectively, plus or minus 50 basis points.
  • GAAP and non-GAAP operating expenses are expected to be approximately $3.17 billion and $2.20 billion, respectively.
  • GAAP and non-GAAP other income and expense are expected to be an income of approximately $200 million, excluding gains and losses from non-affiliated investments.
  • GAAP and non-GAAP tax rates are expected to be 15.0%, plus or minus 1%, excluding any discrete items.
Highlights
NVIDIA achieved progress since its previous earnings announcement in these areas:
Data Center
  • Third-quarter revenue was a record $14.51 billion, up 41% from the previous quarter and up 279% from a year ago.
  • Announced NVIDIA HGX H200 with the new NVIDIA H200 Tensor Core GPU, the first GPU with HBM3e memory, with systems expected to be available in the second quarter of next year.
  • Introduced an AI foundry service—with NVIDIA AI Foundation Models, NVIDIA NeMo framework and NVIDIA DGX Cloud AI supercomputing — to accelerate the development and tuning of custom generative AI applications, first available on Microsoft Azure, with SAP and Amdocs among the first customers.
  • Announced that the NVIDIA Spectrum-X Ethernet networking platform for AI will be integrated into servers from Dell Technologies, Hewlett Packard Enterprise and Lenovo in the first quarter of next year.
  • Announced that NVIDIA GH200 Grace Hopper Superchips, including a new quad configuration, will power more than 40 new supercomputers, including the JUPITER system at Jülich Supercomputing Centre and Isambard-AI at the University of Bristol.
  • Made advances with global cloud service providers:
    • Google Cloud Platform made generally available new A3 instances powered by NVIDIA H100 Tensor Core GPUs and NVIDIA AI Enterprise software in Google Cloud Marketplace.
    • Microsoft Azure will be offering customers access to NVIDIA Omniverse Cloud Services for accelerating automotive digitalization, as well as new instances featuring NVL H100 Tensor Core GPUs and H100 with confidential computing, with H200 GPUs coming next year.
    • Oracle Cloud Infrastructure made NVIDIA DGX Cloud and NVIDIA AI Enterprise software available in Oracle Cloud Marketplace.
  • Partnered with a range of leading companies on AI initiatives, including Amdocs, Dropbox, Foxconn, Genentech (member of Roche Group), Infosys, Lenovo, Reliance Industries, Scaleway and TATA Group.
  • Announced record-setting performance in the latest two sets of MLPerf benchmarks for inference and training, with the NVIDIA Eos AI supercomputer training a GPT-3 model 3x faster than the previous record.
  • Announced growing worldwide support for the NVIDIA CUDA Quantum platform, including new efforts in Israel, the Netherlands, the U.K. and the U.S.
Gaming
  • Third-quarter revenue was $2.86 billion, up 15% from the previous quarter and up 81% from a year ago.
  • Launched DLSS 3.5 Ray Reconstruction, which creates high-quality ray-traced images for intensive ray-traced games and apps, including Alan Wake 2 and Cyberpunk 2077.
  • Released TensorRT-LLM for Windows, speeding on-device LLM inference by up to 4x.
  • Added 56 DLSS games and over 15 Reflex games, bringing the total number of RTX games and applications to over 475.
  • Surpassed 1,700 games on GeForce NOW, including launches of Alan Wake 2, Baldur's Gate 3, Cyberpunk 2077: Phantom Liberty, Forza Motorsport and Starfield.
Professional Visualization
  • Third-quarter revenue was $416 million, up 10% from the previous quarter and up 108% from a year ago.
  • Announced that Mercedes-Benz is using NVIDIA Omniverse to create digital twins to help plan, design, build and operate its manufacturing and assembly facilities around the world.
  • Announced a new line of desktop workstations with NVIDIA RTX 6000 Ada Generation GPUs and NVIDIA ConnectX smart interface cards for training smaller AI models, fine-tuning models and running inference locally.
Automotive
  • Third-quarter revenue was $261 million, up 3% from the previous quarter and up 4% from a year ago.
  • Furthered its collaboration with Foxconn to develop next-generation electric vehicles for the global market, using the next-generation NVIDIA DRIVE Hyperion platform and NVIDIA DRIVE Thor system-on-a-chip.
CFO Commentary
Commentary on the quarter by Colette Kress, NVIDIA's executive vice president and chief financial officer, is available at https://investor.nvidia.com.

55 Comments on NVIDIA Announces Q3-FY24 Results; Earns 5 Times More Revenue from AI Chips than Gaming GPUs

#26
john_
Pumper: Those gross margins sure are gross.
I think they are just understated. I know that Nvidia is a publicly traded company and can't lie, but I doubt those figures are real. During the mining period, Nvidia was selling graphics cards at twice their price (I doubt they were sticking to MSRP when selling directly to mining farms) and was reporting only 3-5 points of margin above usual; now they are selling AI equipment and the margin has only gone up a few more points?
Doubt it. I believe it's in the triple digits.
#27
Bwaze
Assimilator: But this is now, and right now consumers are still quite happily paying for NVIDIA's GPUs. While Big Green could put all its eggs into the data centre market, there really is zero reason for it to do so. NVIDIA may be greedy, but they aren't greedy enough to throw away their massive investment into the consumer space just to chase the AI bubble that is ultimately going to pop.
Sure, but it doesn't mean Nvidia really has to offer anything inviting to gamers for now, just as happened during the crypto madness.

There are buyers, but let's not pretend gamers are buying the RTX 4090 at 10-20% over MSRP a year and a half after its release!
#28
R0H1T
Assimilator: The iPhone may be their headline product but the Mac line is and always will be a money-spinner.
Their biggest money spinner is services: the App Store, subscriptions and of course the Apple ecosystem. The same goes for Nvidia and CUDA, though it's arguable whether they will be extracting the same amount of profit in 10 years' time, unlike the inherent advantage of all things Mac!
#29
Denver
Assimilator: The NVIDIA haters are out in force as usual. Why is it so difficult for y'all to accept that a wildly successful and profitable company is wildly successful and profitable because it has product that the market wants to buy? Why do you have to invent ludicrous allegations of illegal behaviour?
Hate is a strong word; I see rational arguments from those who criticize Nvidia. Do you have the courage to back your opinions with financial stakes? Do you genuinely believe Nvidia is a trillion-dollar company?

I bet anyone that Nvidia will fall below Intel and AMD in market value within the next year; they are just easily replaceable in the main markets they operate in. :P
#30
MrDweezil
nguyen: Stadia was dead within a year, we are probably looking at different walls
It'll happen. Companies are pushing it prematurely, but it'll happen eventually.
#31
Assimilator
Bwaze: Sure, but it doesn't mean Nvidia really has to offer anything inviting to gamers for now.
The fact that gamers are buying NVIDIA cards trivially demonstrates this is false.
R0H1T: Their biggest money spinner is services
Yes, that is more correct than what I said.
Denver: Hate is a strong word, I see rational arguments from those who criticize Nvidia.
Show me these alleged rational arguments, because all I see are completely unsubstantiated conspiracy theories that these reported figures are only possible if NVIDIA is lying or breaking the law in some way.
Denver: Do you have the courage to back your opinions with financial stakes? Do you genuinely believe Nvidia is a trillion-dollar company?
It doesn't matter what I believe; all that matters is the evidence. And there has been zero evidence provided that NVIDIA is not the wildly profitable company it claims to be.
Denver: I bet anyone that Nvidia will fall below Intel and AMD in market value within the next year, they are just easily replaceable in the main markets they operate in.
So easily replaceable that they are continuing to gain market share over their competitors, in the markets that they operate in.
#32
nguyen
MrDweezil: It'll happen. Companies are pushing it prematurely, but it'll happen eventually.
They are different markets. A guy can have a PC, a laptop, a Steam Deck and a GeForce Now subscription all at the same time; owning one does not exclude him from owning the others.

The only way Nvidia stops selling gaming GPUs is if PC gaming dies out, which honestly will not be happening any time soon.
#33
thesmokingman
I'm surprised anyone is surprised. Wtf do y'all think a 70% profit margin and a captive market look like? I caught part of the earnings call and it was hilarious: Jensen proclaiming that making these cards is so damn hard, every aspect of it, hell, even the shipping is hard work. Pulling back, though, we can see that this market will shrink as AMD moves in with better-specced cards and the major players move to custom chips. One wonders how long this gravy train will run.
#34
TheoneandonlyMrK
Assimilator: In a decade, maybe. But this is now, and right now consumers are still quite happily paying for NVIDIA's GPUs. While Big Green could put all its eggs into the data centre market, there really is zero reason for it to do so. NVIDIA may be greedy, but they aren't greedy enough to throw away their massive investment into the consumer space just to chase the AI bubble that is ultimately going to pop.


The iPhone may be their headline product but the Mac line is and always will be a money-spinner.
Plus, where else would they beta test hardware and software on millions of consumer GPUs?
Consumers paid for and beta tested Nvidia's AI hardware for years, giving it the resources to develop this reality.

They're not giving that up.
#35
the54thvoid
Intoxicated Moderator
I can't wait until the AI investment reaps such rewards that I can enquire about whether or not I'll actually enjoy a new game, and after a quick discussion with a faceless digital intelligence, I'll know not to buy it. Jensen had better watch out; his cards might just deter us all from buying new games.

Clearly, tongue-in-cheek. But sort of relevant.
#36
kapone32
Let me see: a 4090 where I live costs a minimum of $2,300 ($2,600 with tax), while you could get a 7900 XT for $1,200. Nvidia will sell 20 in a day and AMD would have to sell 50 to make the same amount. That would require common sense, though. That 81% is probably concentrated in the USA and Australia, because the thing Nvidia is best at is marketing. They are so good that the day they announced the 4090 the narrative declared the 3090 dead, and the 3090 Ti was good at that too. Now the US is fearful of the Chinese using 4090s for machine learning on drones, and Nvidia ramps up production while making more money than is right, so desultory in their hubris that EVGA left them and people just said "oh well". Even the new 12VHPWR connector that is still burning GPUs cannot change the narrative, but I will keep hearing about how evil AMD is for not putting DLSS in Starfield. It doesn't matter, though, because AMD also has complete control of the handheld space, and when the Phoenix APUs launch on desktop you can say goodbye to the 1030, 1060, 1660, 3050 and 4050, as well as AMD's own 570 and 580 cards, but that will impact Nvidia's bottom line in no way.
#37
cvaldes
Denver: I bet anyone that Nvidia will fall below Intel and AMD in market value within the next year, they are just easily replaceable in the main markets they operate in. :p
Highly unlikely.

Market cap
NVDA $1.22T
AMD $200B
INTC $185B

Nvidia would have to lose about 85% of its value to drop below AMD if the latter stayed even. However if NVDA dropped that much it would drag down the entire semiconductor sector as well as make a major dent in broad market indexes like the S&P 500, Nasdaq Composite and NASDAQ-100.

Nvidia does not function in a vacuum.

As for putting my money where my mouth is, I have indirect positions in all three via a variety of index funds.

AI isn't going away; I will make money regardless of who has market dominance a year from now.

Remember that revenue equals shipped products and paid services. Nvidia's inbox is stacked with purchase orders, probably enough to keep TSMC busy for two or three years.

This isn’t Joe Gamer standing in front of a store shelf at Microcenter debating whether or not to buy a GeForce 4070.

Companies budget for these purchases in advance.
#38
dyonoctis
Daven: Gaming and professional graphics will move into data center as internet connection latency and bandwidth improves. Fabbing and assembling discrete graphics cards will become a waste of resources when all computing can become distributed and client devices become dumb terminals.

In that vast sea of big data, gaming will be one of many services offered albeit a very minor one in comparison.

Monitors will become VR/AR glasses. Input devices will just be your hands. A simple handheld device in your pocket will make the connection.
I think the hybrid approach will keep being a thing until 100% of the Earth's surface has perfect internet coverage with strong fallbacks in case something goes wrong. Ending up with a potato computer whenever something goes south, or whenever you need to work in a remote area, is going to be a big liability.

It's a bit like Toyota explaining that even if the West wants to go full electric, there are a lot of places where it's just not going to work. You need alternative technologies to cover each other's weaknesses.
#39
Assimilator
TheoneandonlyMrK: Plus, where else would they beta test hardware and software on millions of consumer GPUs?
Consumers paid for and beta tested Nvidia's AI hardware for years giving it the resources to develop this reality.

They're not giving that up.
"Beta test" what is this stupid shit? How are any of NVIDIA's products "beta"?
#40
TheoneandonlyMrK
Assimilator: "Beta test" what is this stupid shit? How are any of NVIDIA's products "beta"?
What.

Tensor cores: useful eventually, occasionally.

Ray tracing cores: useful eventually, and occasionally.

DLSS, all the way up to version blah blah blah.

Many other things, at times, IMHO.

So yeah, as an owner/beta tester: yes.
#41
mrnagant
Gaming revenue is pretty boring: $2.8B. It has been higher in the past and used to always be their #1 segment. Data center is popping: 80% of their Q3 revenue and projected to be about 75% of their Q4. Nvidia can basically stop caring about gaming in terms of cost. They can sell their cards for whatever they want and don't need to drop prices; if the cards don't sell, they will gladly shift that production over to the data center and make hundreds of thousands of dollars more per wafer. As for their top-end cards, they don't even need to sell them; they can just paper launch them and not allocate any chips for gaming.

In one regard it is really exciting, but in another it is kinda scary. We will get Blackwell, but it could be limited and expensive. AMD is skimping on RDNA4; AMD is going to focus on its CPUs and, I'd imagine, put its GPU resources behind CDNA. At the moment, gaming isn't the revenue juggernaut it used to be and obviously isn't where the money is. So I can see options being reduced (everyone), prices going up (Nvidia) and development slowing down (AMD).

Maybe if GPU cores go chiplet it won't be too bad.
#42
Denver
cvaldes: Highly unlikely.

Market cap
NVDA $1.22T
AMD $200B
INTC $185B

Nvidia would have to lose about 85% of its value to drop below AMD if the latter stayed even. However if NVDA dropped that much it would drag down the entire semiconductor sector as well as make a major dent in broad market indexes like the S&P 500, Nasdaq Composite and NASDAQ-100.

Nvidia does not function in a vacuum.

As for putting my money where my mouth is, I have indirect positions in all three via a variety of index funds.

AI isn’t going away, I will make money regardless who has market dominance a year from now.

Remember that revenues equals shipped products and paid services. Nvidia’s inbox is stacked with purchase orders, probably enough to keep TSMC busy for two, three years.

This isn’t Joe Gamer standing in front of a store shelf at Microcenter debating whether or not to buy a GeForce 4070.

Companies budget for these purchases in advance.
I am of the opinion that Nvidia is not worth 1 trillion dollars and is likely to fall significantly below its current valuation.

I choose not to engage in the seemingly endless debate of Nvidia vs AMD vs Intel. My conviction in my perspective led me to exit at $500. If you genuinely believe in the positive prospects, consider purchasing the stock at this price point. Opinion is subjective, reality is not, and it eventually imposes itself by subjugating individual opinions.

The future will show who was right. Good luck!
#43
cvaldes
Denver: I am of the opinion that Nvidia is not worth 1 trillion dollars and is likely to fall significantly below its current valuation.

I choose not to engage in the seemingly endless debate of Nvidia vs AMD vs Intel. My conviction in my perspective led me to exit at $500. If you genuinely believe in the positive prospects, consider purchasing the stock at this price point. Opinion is subjective, reality is not, and it eventually imposes itself by subjugating individual opinions.

The future will show who was right. Good luck!
Okay, you torched your own ass and you can't admit it.

First of all, I addressed your statement that NVDA would drop below AMD and INTC in valuation within a year. I showed market cap and you won't accept raw numbers. That's fine. But this is ON YOU.

Second, yes, you chose to engage in the endless debate of Nvidia vs. everyone else via your previous post. You mentioned no exit at $500.

And guess what? Unless your dollars are fully invested in something that returns better than NVDA, well, you lose. I know. I bailed out of NFLX after owning it for 18 months and it climbed through the stratosphere. Sure, I made money, but not as much as if I had kept it.

We all have to sleep, though. If you can't handle the suspense, just say so. No one can blame you for bailing out to get enough shut-eye. But don't mistake it for making more money.

Third, even if NVDA doesn't see the same price appreciation in the next twelve months as it did in the previous twelve, that is still a far cry from losing 85% of its market cap, isn't it?

My money is on NVDA's market cap still being higher than that of AMD or INTC in 12 months.

You proposed something else, and I might just bookmark this discussion to revisit in a year and see where we stand. Let me know if you'd like me to ping this thread a year from now.

Nice try at covering your sorry ass for what you posted. Since I quoted your post, you can't back out. Everyone reading this thread will understand what sort of odds you are up against.

Feel free to respond to this; you will only dig a deeper grave for yourself. Remember: you are on record as proposing that NVDA will have a lower market capitalization than AMD on November 22, 2024. Technically, you stated that NVDA would be lower than both AMD and INTC, but let's just start with AMD. We get plenty of garbage chat here at TPU, but this one is worth highlighting just for its complete ludicrousness.

Best of luck.
#44
wolf
Better Than Native
Assimilator: "Beta test" what is this stupid shit? How are any of NVIDIA's products "beta"?
It's a funny and, I agree, fairly ridiculous sentiment. Not that I don't see how they arrived at it, but it's still silly.

If we were going to take that route and follow that logic, I'd argue that FSR 1/2/3 and Anti-Lag+, and perhaps others, are beta features the users are testing, just on a smaller scale, because AMD lacks the vision to foresee (most of) these prospective innovations. Consumers paid for and beta tested AMD's reactionary hardware and software solutions for years, giving AMD the resources to just barely stay in the PC space. I've seen a fair few Radeons sold, or purchases delayed, on hopium that they can match Nvidia's hardware and features, so Nvidia's features can't be all that bad if AMD aspires to meet or exceed them.

Consumers can rarely tell or steer the market on how to innovate; they want more, faster, better, but largely of known quantities. A 5090 that has 48GB and is twice as fast as a 4090 in every metric? Sounds great, right? (Price excluded for the purpose of making the point.) A monitor that's higher resolution and faster refresh? Awesome! But the average Joe would be hard pressed to predict or ask for the newer features and capabilities GPUs have come out with over the years: RT, upscaling, mesh shading, tessellation and so on.
#45
FoulOnWhite
Good luck to them all; we are just the mice running around the maze while they drop tidbits in there for us. Don't complain when you have a 4090 in your PC.
#46
Assimilator
TheoneandonlyMrK: What.

Tensor cores: useful eventually, occasionally.

Ray tracing cores: useful eventually, and occasionally.

DLSS, all the way up to version blah blah blah.

Many other things, at times, IMHO.

So yeah, as an owner/beta tester: yes.
Every single one of the features you mentioned is used by gamers, improves or enhances gameplay for those gamers, and is constantly being improved at no cost to purchasers of the associated hardware. "Beta" doesn't mean "feature that you personally have an irrational hatred for".

The fact of the matter is that these features are the very example of "when life gives you lemons, make lemonade". NVIDIA chose to dedicate more die space on its GPUs to ML- and RT-accelerating hardware because it didn't want to endure the cost and complexity of producing different GPUs for consumer and professional workloads. That hardware took space away from raster hardware, which created the risk that their consumer products might no longer be able to compete on sheer performance. Instead of sitting back and hoping that wouldn't bite them, NVIDIA came up with software features that leverage that hardware to add value for consumers. And they've been so successful in that regard that those features have become the industry standard, such that every single one of their competitors now has its own version of them. That's not because those competitors slavishly copy whatever NVIDIA does; it's because consumers now expect those features as standard.

Basically, NVIDIA invented an entirely new category of consumer graphics features because they wanted to save money, and they've utilised those features so successfully that they've increased their market leadership. That's the very opposite of "beta"; that's "astounding success", and your irrational hatred doesn't change that; it just makes you look ridiculous.

I've said it before and I'll say it again - NVIDIA's self-proclaimed moniker of "World Leader in Visual Computing Technologies" is incredibly pompous but entirely justified. They lead, they succeed, and others follow.
#47
Denver
cvaldes: Okay, you torched your own ass and you can't admit it.

First of all, I addressed your statement that NVDA would drop below AMD and INTC in valuation within a year. I showed market cap and you won't accept raw numbers. That's fine. But this is ON YOU.

Second, yes, you chose to engage in the endless debate of Nvidia vs. everyone else via your previous post. You mentioned no exit at $500.

And guess what? Unless your dollars are fully invested in something that returns better than NVDA, well, you lose. I know. I bailed out of NFLX after owning it for 18 months and it climbed through the stratosphere. Sure, I made money but not the same amount if I kept it.

We all have to sleep though. If you can't handle the suspense, just say so. No one can blame you for bailing out to get enough shut eye. But don't misconstrue it with making more money.

Third, even if NVDA doesn't have the same price acceleration in the next twelve months as it has in the previous 12, it's still more than losing 85% of market cap value, isn't it?

My money is that that in 12 months NVDA's market cap will still be higher than that of AMD or INTC.

You proposed something else and I might just bookmark this discussion just to revisit it in a year to see where we stand. Let me know if you'd like me to ping this thread a year from now.

Good try trying to cover your sorry ass on what you posted. Since I included your post, you can't back out. Everyone reading this thread will understand what sort of odds you are up against.

Feel free to respond to this, you will only dig a deeper grave for yourself. Remember: you are on the record for proposing that NVDA will have a lower market capitalization on November 22, 2024 than AMD. Technically, you stated that NVDA would be lower than both AMD and INTC. But let's just start with AMD. We get plenty of garbage chat here at TPU but this one is worth highlighting just due to its complete ludicrousness.

Best of luck.
So much excitement coming from someone ready to lose money. Unlike you, I have all the serenity in the world.

Nvidia does not necessarily need to lose that much market value to fall below Intel and AMD; the two of them just need to grow while Nvidia continually weakens with the rise of competition and the battle to make AI cheaper.

Furthermore, I just don't invest in what I don't believe in; it doesn't matter if banks or analysts say otherwise. Whether I lose or win, it will be by following what makes sense to me. GG.
#48
ratirt
I would really like to see gaming units sold, not just revenue, strictly for comparison. You may have sold the same number of units, but if you jack up the price per unit, obviously your revenue will increase. I wonder what those stats look like.
#49
TheoneandonlyMrK
Assimilator: Every single one of the features you mentioned is used by gamers, improves or enhances gameplay for those gamers, and are constantly being improved at no cost to purchasers of the associated hardware. "Beta" doesn't mean "feature that you personally have an irrational hatred for".

The fact of the matter is that these features are the very example of "when life gives you lemons, make lemonade". NVIDIA chose to dedicate more die space on its GPUs to ML- and RT-accelerating hardware because they didn't want to have to endure the cost and complexity of producing different GPUs for consumer and professional workloads. This hardware took away from raster hardware, which created a problem that their consumer products might no longer be able to compete on sheer performance. Instead of sitting back and hoping that wouldn't bite them, NVIDIA came up with feature solutions to leverage that hardware to add value for consumers. And they've been so successful in that regard, that those features have become the industry standard such that every single one of their competitors now has their own version of said features. That's not because those competitors slavishly copy whatever NVIDIA does, it's because consumers now expect those features as standard.

Basically, NVIDIA invented an entire new category of consumer graphic features entirely because they wanted to save money, and they've utilised those features so successfully that they've increased their market leadership. That's the very opposite of "beta", that's "astounding success", and your irrational hatred doesn't change that; it just makes you look ridiculous.

I've said it before and I'll say it again - NVIDIA's self-proclaimed moniker of "World Leader in Visual Computing Technologies" is incredibly pompous but entirely justified. They lead, they succeed, and others follow.
I've owned a 2060 since the beginning.

Who do you think you are trolling? Nvidia trolled me.

I sat through months of those features going unused, because f-all actually used them. Beta.

I also tested most iterations of DLSS, because they can't make one right the first time, and guess what: none of them are at iteration one, because we beta tested them and found the issues for Nvidia.

I'm sorry this insults your master, but deal with it.
#50
Assimilator
Denver: Nvidia does not necessarily need to lose so much market value to fall below Intel and AMD, they just need to both grow while Nvidia continually weakens with the rise of competition and battles to make AI cheaper.
WTAF.

Neither Intel nor AMD has been able to compete effectively with NVIDIA in the ML space, and there's zero indication they'll be able to any time soon. So to claim that both of those companies will somehow be able to grow by a factor of more than three (which is what would need to happen for them to surpass NVIDIA's market cap) isn't just laughable; it's so absurd that your post is literally hurting my brain. And that more-than-three-times figure is only relevant if NVIDIA itself doesn't grow at all in the same period, which ain't gonna happen.

Now, I'd agree that NVIDIA is overvalued right now and that there's going to be a correction eventually. But even if their stock drops to the ~$200 it was at the beginning of this year, they'd have a market cap of ~$488 billion, which still massively exceeds AMD and Intel combined.

But go on, keep digging that hole.
ratirt: I would really like to see gaming units sold not just the revenue part. Strictly for comparison. You may have the same number of units sold but if you jack up the price per unit, obviously your revenue will increase. I wonder, how these stats look like.
How exactly does it matter how many units were sold? Even if you sell only one unit at 100000000% profit, you still made a lot of money.
TheoneandonlyMrK: I've owned a 2060 since the beginning.

Who do you think you are trolling? Nvidia trolled me.

I sat through months of those features going unused, because f-all actually used them. Beta.

I also tested most iterations of DLSS, because they can't make one right the first time, and guess what: none of them are at iteration one, because we beta tested them and found the issues for Nvidia.

I'm sorry this insults your master, but deal with it.
I'm sorry that I wasted my time arguing in good faith, when you evidently aren't capable of doing the same.