Thursday, July 9th 2020

NVIDIA Surpasses Intel in Market Cap Size

Yesterday, after the stock market closed, NVIDIA officially overtook Intel in market capitalization. In after-hours trading, NVIDIA (ticker: NVDA) stock stood at $411.20, putting the company's market cap at $251.31 billion. It marks a historic day for NVIDIA, which has historically been the smaller of the two companies; in the past, some even speculated that Intel could buy NVIDIA. Intel's (ticker: INTC) market cap now stands at $248.15 billion, slightly below NVIDIA's. Market cap does not tell the whole story, however: NVIDIA's stock is fueled by the hype around machine learning and AI, while Intel's valuation does not rest on any potential bubble.
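Market cap is just share price multiplied by shares outstanding, so the quoted figures can be sanity-checked in a few lines. A minimal sketch, assuming the share count is backed out of the article's own numbers rather than taken from official filings:

```python
# Market cap = share price x shares outstanding. The share count below is
# inferred from the article's quoted figures (cap / price), so treat it as
# an estimate rather than a number from official filings.
nvda_price = 411.20    # USD, NVDA after-hours price
nvda_cap = 251.31e9    # USD, quoted NVIDIA market cap
intc_cap = 248.15e9    # USD, quoted Intel market cap

implied_shares = nvda_cap / nvda_price   # ~611 million shares
lead = nvda_cap - intc_cap               # ~3.16 billion USD

print(f"Implied NVDA shares outstanding: {implied_shares / 1e6:.0f}M")
print(f"NVIDIA leads Intel by ${lead / 1e9:.2f}B "
      f"({(nvda_cap / intc_cap - 1) * 100:.1f}%)")
```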

If we compare the two companies' revenues, Intel performs far better: it posted $71.9 billion in revenue in 2019, compared to NVIDIA's $11.72 billion. NVIDIA has nonetheless done an impressive job, nearly doubling its revenue from $6.91 billion in 2017 to $11.72 billion in 2019, and market forecasts suggest the growth is not stopping. With the recent acquisition of Mellanox, the company now has much bigger opportunities for expansion and growth.
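To put "almost doubled" in annualized terms: $6.91 billion to $11.72 billion over two years works out to roughly 30% compound annual growth. A quick sketch of that arithmetic:

```python
# NVIDIA revenue growth 2017 -> 2019, using the figures quoted above.
rev_2017 = 6.91e9    # USD
rev_2019 = 11.72e9   # USD
years = 2

total_growth = rev_2019 / rev_2017        # ~1.70x over two years
cagr = total_growth ** (1 / years) - 1    # ~30% per year

print(f"Total growth: {total_growth:.2f}x, CAGR: {cagr * 100:.1f}%/yr")
```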

136 Comments on NVIDIA Surpasses Intel in Market Cap Size

#76
cucker tarlson
ZoneDymoMeanwhile Nvidia has....upped their prices by a ton
as opposed to amd, who sells 250 mm² dies at 400 usd, which they initially wanted to sell at 450 but the super lineup got in the way
give me a break
ZoneDymothe concept of True Audio, finally a company that values audio improvement (50% of the experience) over the always pushed graphics all the time.
I suppose they'd better get back to working on graphics tho.

and what is true audio?
ZoneDymoAMD tried multiple things that are good imo, think of Mantle, a close-to-the-metal API push that spawned Vulkan and pushed DirectX a bit in a similar direction
but nvidia gets no credit for dx12 ultimate, huh
ZoneDymoOr what about Freesync as opposed to the competition's "extra money pls" G-sync, that even pushed the competition to work on that same concept.
according to amd, the g-sync counterpart is called freesync premium
www.amd.com/en/technologies/free-sync
and still has no ulmb
#77
ZoneDymo
cucker tarlson1 as opposed to amd, who sells 250 mm² dies at 400 usd, which they initially wanted to sell at 450 but the super lineup got in the way
give me a break


2 I suppose they'd better get back to working on graphics tho.

3 and what is true audio?


4 but nvidia gets no credit for dx12 ultimate, huh


5 according to amd, the g-sync counterpart is called freesync premium
www.amd.com/en/technologies/free-sync
and still has no ulmb
1 Yes AMD is too expensive atm as well, which is sad, they are following Nvidia currently.
2 Ermm no? unrelated entirely
3 Google is your friend?
4 Nvidia literally has nothing to do with that sooo no
5 Yes and? still no extra money, just extra function labeled for the consumer to understand.
#79
mtcn77
cucker tarlsonaccording to amd, the g-sync counterpart is called freesync premium
www.amd.com/en/technologies/free-sync and still has no ulmb
I fail to see the point. ULMB is just Nvidia's version. OEMs make their own, too. What is so special about it?
For instance, the 'best' VA panel in terms of response times is the least expensive one in the budget monitor review linked below. It comes with FreeSync and its own strobing.
Closing thoughts - The VX2458~C was impressive for gaming for a VA panel. These are usually plagued by slow black > grey transitions and black smearing in content, but that had been cleared up nicely here. For a VA panel it's fast and reliable in the upper refresh rate range, delivering decent response times and frame rates alongside the high contrast and deep blacks that this technology offers. It's not very well suited to lower refresh rates though so be careful there.
www.tftcentral.co.uk/reviews/budget_gaming_monitors_2020.htm#viewsonic_vx2458-c
#80
ZoneDymo
FiendishIf you're upset PhysX never took off more, you might look AMD's way.
gizmodo.com/nvidia-helping-modders-port-physx-engine-to-ati-radeon-5023150
Explain?
Ageia was the company that made a revolutionary physics engine (one that sadly still surpasses what we do today), which Nvidia bought, made proprietary to their hardware, and in doing so just killed off.
No developer in their right mind would spend time and money on a game that would only work on Nvidia cards unless Nvidia compensated them for the revenue lost from other platforms, which Nvidia never did, so that never happened.
All PhysX became was a silly tacked-on gimmick in a handful of games like Borderlands 2, 99.9% the same game but with some orbs floating in the water....fantastic.
#81
cucker tarlson
ZoneDymoExplain?
Ageia was the company that made a revolutionary physics engine (one that sadly still surpasses what we do today), which Nvidia bought, made proprietary to their hardware, and in doing so just killed off.
No developer in their right mind would spend time and money on a game that would only work on Nvidia cards unless Nvidia compensated them for the revenue lost from other platforms, which Nvidia never did, so that never happened.
All PhysX became was a silly tacked-on gimmick in a handful of games like Borderlands 2, 99.9% the same game but with some orbs floating in the water....fantastic.
did you see control? absolutely amazing physx implementation.

and didn't ageia require a dedicated card?

mtcn77I fail to see the point. ULMB is just Nvidia's version.
so you DO see the point tho.
gsync has nvidia's version of strobing. freesync does not.
lfc is only present in fs premium
#82
john_
cucker tarlsonlfc is only present in fs premium
LFC is available even with the basic FreeSync tier. It just needs the upper limit of the FreeSync range to be at least 2.5 times the lower limit, so a simple and cheap monitor with a FreeSync range of 30 Hz to 75 Hz supports LFC.
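A minimal sketch of that eligibility rule (the 2.5x ratio is the commonly cited LFC threshold; the helper function is illustrative, not an official AMD API):

```python
# LFC (Low Framerate Compensation) requires the panel's maximum refresh rate
# to be at least 2.5x its minimum, so frames that fall below the minimum can
# be multiplied back into the variable refresh range. Illustrative helper.
def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.5) -> bool:
    return max_hz >= ratio * min_hz

print(supports_lfc(30, 75))    # True:  75 >= 2.5 * 30
print(supports_lfc(48, 100))   # False: 100 < 2.5 * 48
```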
#83
Fiendish
ZoneDymoExplain?
Ageia was the company that made a revolutionary physics engine (one that sadly still surpasses what we do today), which Nvidia bought, made proprietary to their hardware, and in doing so just killed off.
No developer in their right mind would spend time and money on a game that would only work on Nvidia cards unless Nvidia compensated them for the revenue lost from other platforms, which Nvidia never did, so that never happened.
All PhysX became was a silly tacked-on gimmick in a handful of games like Borderlands 2, 99.9% the same game but with some orbs floating in the water....fantastic.
Ageia's hardware accelerated PhysX was already exclusive to their proprietary hardware (Ageia's PPUs) before Nvidia's acquisition; after the acquisition, they merely ported the framework to run on Nvidia's GPUs as well. Nvidia both continued and expanded the development of PhysX, and if you read the article I posted, Nvidia was even all for a third party porting GPU PhysX to run on Radeons, but AMD refused to provide proper support.
#84
ZoneDymo
FiendishAgeia's hardware accelerated PhysX was already exclusive to their proprietary hardware (Ageia's PPUs) before Nvidia's acquisition; after the acquisition, they merely ported the framework to run on Nvidia's GPUs as well. Nvidia both continued and expanded the development of PhysX, and if you read the article I posted, Nvidia was even all for a third party porting GPU PhysX to run on Radeons, but AMD refused to provide proper support.
Yes a dedicated card from a 3rd party.
Nvidia dropped all support for those dedicated cards after buying Ageia and then made it exclusive to their cards.
They even actively blocked users from running an Nvidia card as a dedicated physX card alongside an AMD card.....

continued and expanded....yeah that's why no game is built on gpu hardware accelerated physics, right? it's only used as a cpu implementation, no more interesting than Havok, which we have had for years.

Sure Nvidia allowed for the competition to use it, more money pls first though.
That is what I have always said, Nvidia should have just made it open for everyone to use/implement and even contribute to, with the only request/requirement being that at the start of a game it would say "PhysX by Nvidia", and that would be their advertisement for being the core support for this development.

That is the problem, AMD tries to better everything for everyone, Nvidia tries to better everything exclusively for itself, probably why they are doing so well as a company, but why I don't support them myself.
cucker tarlsondid you see control? absolutely amazing physx implementation.

and didn't ageia require a dedicated card?
Control is literally an Nvidia tech demo....and if anything it's more about RTX than anything else, so no, I have not seen a single video pointing out this "amazing physx implementation"

Yes, Ageia required a dedicated card.
Man, imagine what could have been! a dedicated graphics, physics and sound card, how fantastically better could game experiences be today if it wasn't for Nvidia buying Ageia and killing it off, or Microsoft ending DirectSound with Vista.
#85
cucker tarlson
ZoneDymoI have not seen a single video pointing out this "amazing physx implementation"
lol, that settles it. I played it and saw it myself.
but you haven't heard about it, so I must've been wrong.
ZoneDymoMan, imagine what could have been! a dedicated graphics, physics and sound card, how fantastically better could game experiences be today if it wasn't for Nvidia buying Ageia and killing it off, or Microsoft ending DirectSound with Vista.
what :roll:
ZoneDymoThat is the problem, AMD tries to better everything for everyone, Nvidia tries to better everything exclusively for itself, probably why they are doing so well as a company, but why I don't support them myself.
lol, get a grip.
amd doesn't invest 1/10th of what nvidia does in pc gaming technologies
they lag behind more and more every gen and can't get their drivers up to snuff
#86
ZoneDymo
cucker tarlson1 lol, that settles it. I played it and saw it myself.
but you haven't heard about it, so I must've been wrong.

2 what :roll:


3 lol, get a grip.
amd doesn't invest 1/10th of what nvidia does in pc gaming technologies
they lag behind more and more every gen and can't get their drivers up to snuff
1 ermm no? you literally asked if I saw Control and I answered that I did and did not see any amazing physics....
Idk how your mind works with that response.

2 Not sure how that is hard to comprehend, you will have to be more specific with where I lost you.

3 Yes, because they also have less than 1/10th to spend, but with what they spend they do a lot more for everyone than Nvidia, who cares pretty much only about itself unless forced otherwise.
AMD drivers were fine for a long time; just with this last gen they messed things up. Nvidia has had massive driver issues as well in the last 2 years, and also plenty back when I last used Nvidia cards; it was a complete mess.
And lag behind more and more every gen? The HD5000 series was the better choice, the HD6000 series was the better choice, the HD7000 series was the better choice, the RX480 was a fantastic card for the price and often the better choice, Vega 56 was often the better choice, and right now the RX5700(XT) is a better choice than its Nvidia counterparts, soooo yeah no.

But if this is going to devolve into some silly fanboy nonsense (inb4 "you are the fanboy here buddy") then you might as well just stop reacting.
#87
cucker tarlson
youtube is your friend
lol, didn't see physics in control.
this game is all about environment destruction. dafuq.
and I wasn't talking about a century ago, i.e. the 5000 series.
ever since hawaii they got worse and worse. now all they got is an overpriced g106 counterpart that has no dx12 ultimate support like the next gen consoles. so yeah, "better choice"
#88
brutlern
Yeah, but nvidia makes GPUs while Intel makes CPUs, and people change their GPUs way more often than they change their CPUs, so it's simple math that nvidia would overtake intel at some point.
#89
mtcn77
cucker tarlsongsync has nvidia's version of strobing.
You mean like the way almost every gsync monitor has an inferior ulmb refresh rate (144Hz panel > 120Hz ulmb)? I couldn't check them all, help me here...
#90
cucker tarlson
mtcn77You mean like the way almost every gsync monitor has an inferior ulmb refresh rate (144Hz panel > 120Hz ulmb)? I couldn't check them all, help me here...
welcome to another episode of clueless gamer

strobing at 120 is already way better than 144/165. also - better than no strobing, would you believe that?
#91
mtcn77
cucker tarlsonulmb at 120 is already way better than 144/165. also - better than no ulmb,
What about 144 with mbr? ;) You are really desperate.

I kind of understand where you stand, but mbr monitors at least come with led overcharge. That makes them brighter for faster phasing. :rolleyes:
In order to partially compensate for the LEDs not being switched on for most of the time, the LED current in MBR mode is boosted to about 360% of the LED current in the flicker-free mode (as inferred from luminance measurements, so the actual differences in LED currents might be even higher because of a non-linear relation between LED current and emitted luminance).
display-corner.epfl.ch/index.php/BenQ_XL2540
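That ~360% figure lines up with simple duty-cycle arithmetic: to first order, average brightness is proportional to LED current times the fraction of each refresh period the backlight is actually lit, so the current must be boosted by roughly the inverse of that fraction. A sketch, where the ~28% duty cycle is an assumption inferred from the quoted 360% (the review itself notes the current/luminance relation is non-linear):

```python
# Strobed backlights (MBR/ULMB) light the LEDs for only part of each refresh
# period. To first order, average luminance ~ current x duty cycle, so keeping
# brightness constant requires boosting current by 1 / duty_cycle.
# The 28% duty cycle is inferred from the quoted ~360% boost, not a spec.
duty_cycle = 0.28      # assumed fraction of the frame the backlight is lit
boost = 1 / duty_cycle                    # required current multiplier
print(f"LED current in strobed mode: ~{boost * 100:.0f}% of flicker-free mode")
# -> ~357%, consistent with the ~360% measured on the BenQ XL2540
```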
#92
cucker tarlson
mtcn77What about 144 with mbr? ;) You are really desperate.

I kind of understand where you stand, but mbr monitors at least come with led overcharge. That makes them brighter for faster phasing. :rolleyes:

display-corner.epfl.ch/index.php/BenQ_XL2540
cute, show me a 1440p one, I might look into getting one
and that's benq's own standard, not amd's, so I guess I'm not the one who's gotten himself into a corner
#93
mtcn77
cucker tarlsoncute, show me a 1440p one, I might look into getting one
:confused: like I determine the oem picks and launch dates. Keep skewing the topic without any hard evidence. I'm not circumstantially grasping at straws like you have been. You seem to be avoiding some shortcomings of something, you know what.
Hardware manufacturers know their hardware the best. Change my mind.
#94
cucker tarlson
I would, if I knew what you meant when you write words.
#95
mtcn77
cucker tarlsonI would, if I knew what you meant when you write words.
Not words, sentences. Also, you keep adding edits; that doesn't do much towards helping your arguments.
cucker tarlsonand that's benq's own standard, not amd's, so I guess I'm not the one who's gotten himself into a corner
I'm not the one saying nvidia does it best. :cool: let's have it the way the market demands and oems deliver. I am really sorry you have issues separating my position from your own.
#96
Fiendish
ZoneDymoYes a dedicated card from a 3rd party.
Nvidia dropped all support for those dedicated cards after buying Ageia and then made it exclusive to their cards.
Support for PPUs wasn't dropped until years after the acquisition, and PPU sales were abysmal to begin with. You do know that Ageia sold themselves to Nvidia because they had no future, right?
They even actively blocked users from running an Nvidia card as a dedicated physX card alongside an AMD card.....
Nvidia actively blocking AMD happened after AMD (Godfrey Cheng specifically) publicly crapped on PhysX and made it clear AMD was going to back Havok against them; Havok was ironically owned by Intel at the time.
continued and expanded....yeah that's why no game is built on gpu hardware accelerated physics, right? it's only used as a cpu implementation, no more interesting than Havok, which we have had for years.
PhysX research papers regularly contribute to the industry as a whole; Matthias Muller from Ageia (and NovodeX before that) still works for Nvidia and releases publications almost every year. GPU PhysX pioneered GPU particles, which have been integrated into nearly every big physics engine by now. At Nvidia, GPU PhysX has developed into things like Flex and Apex.
Sure Nvidia allowed for the competition to use it, more money pls first though.
You realize that not only was Nvidia not charging AMD anything; because it was a third party that was doing the porting, Nvidia couldn't have charged AMD. Also, read the article: Nvidia was already providing support for Eran Badit's effort, so they had already put their money where their mouth was, so to speak.
That is what I have always said, Nvidia should have just made it open for everyone to use/implement and even contribute to, with the only request/requirement being that at the start of a game it would say "PhysX by Nvidia", and that would be their advertisement for being the core support for this development.
The PhysX framework was ported to GPUs via CUDA, so even if it were open, it would still have to be ported, and given that AMD wouldn't even support a third party (Eran Badit) that was going to port it for them free of charge, your assumption that this would have worked out is dubious.
That is the problem, AMD tries to better everything for everyone, Nvidia tries to better everything exclusively for itself, probably why they are doing so well as a company, but why I don't support them myself.
Considering that we only have boundary-pushing standards like DXR/VulkanRT and DirectML/VulkanML right now because of Nvidia's contributions, this seems like a strange take.
#97
eLJay88
Vya DomusYou know damn well that's a load of crap.
Sadly not really.
AMD uses a far superior production process (7nm), yet it can't compete with Nvidia on a 12nm node. It is not even in the same ballpark.

In addition, all of AMD's recent GPU architectures just seem crap stability-wise. Where an Nvidia card is easy plug-and-play, with an AMD card you have to undervolt, use a custom BIOS or jump through other hoops just to get a normally working card. This holds up for Navi and Vega; Fury and even Polaris seemed to have minor issues (huge compared to Nvidia's rock-solid stability). Even in the golden days for AMD GPUs (around the 5xxx/6xxx and 7xxx series), my good old HD 6850 was inferior to Nvidia's offerings in the stability department (notorious for black screens and 'display-adapter freezes' resulting in CTDs).
The last great AMD card was the 7970, which they rebranded into infinity.

Don't get me wrong, I would love to buy a solid AMD card, but they don't even offer high-end or enthusiast cards. At this point I have higher hopes of Intel having success in the dGPU market than of AMD, which makes me really sad.

Luckily Zen was a total hit.
#98
cucker tarlson
mtcn77Not words, sentences. Also, you keep adding edits; that doesn't do much towards helping your arguments.

I'm not the one saying nvidia does it best. :cool: let's have it the way the market demands and oems deliver. I am really sorry you have issues separating my position from your own.
no I don't add edits
and next time you barge into a conversation between two members, you may wanna check what they were talking about
#99
mtcn77
cucker tarlsonno I don't add edits
and next time you barge into a conversation between two members, you may wanna check what they were talking about
Nvidia is not in the monitor business. I wouldn't mind if they were since that would encourage them to compete on features rather than stifle the competition field.
#100
MrMeth
MetroidYeah, Intel could have bought Nvidia at that time, just like AMD acquired ATI, but unlike AMD, Intel opted not to, and now Intel must be crying out loud in some corner hehe
AMD actually did try to buy Nvidia back in the day; ATI was their second choice. The deal fell through because Jen-Hsun wanted to run the company and be on the board.

www.neowin.net/news/rumor-amd-tried-to-buy-nvidia-before-buying-ati/

Man, I'm so old and I remember all the hardware specs and news from decades ago but I don't remember what I did yesterday sigh ...