
NVIDIA Surpasses Intel in Market Cap Size

Clever company, but very, very dubious business practices got them to where they are.

In the early days of PC graphics accelerator cards they used some pretty underhanded tactics against their competitors to ensure they survived and others didn't.

Yeah, like the x86 compiler BS: if Intel's compiler detected any CPU other than "GenuineIntel", it would send the code down a slow, jumbled path to make Intel's CPUs seem faster than they really were, when AMD was actually faster at the time. And then there was Intel's OEM "rebates" debacle, bribing Dell and other OEMs not to sell desktop PCs (and probably laptops too) with AMD, Cyrix, or any other x86 CPU at that time. Screw Intel.

Intel is still a shady, anti-consumer CPU engineering company (among other things).

I'll buy an 8C/16T CPU from AMD even if its single-core IPC and clocks are lower, and even if it costs slightly more than Intel's equivalent 8C/16T chip, say Intel at $349-379 vs. AMD's 8C/16T 3800XT at $399 (Intel probably doesn't even have it cheaper). And I could still get a 3700X (Zen 2) at my local Microcenter for $269: a very decent 8C/16T CPU with PCIe 4.0 support, 3.6 GHz base and 4.3/4.4 GHz max single-core boost.
 
Yeah, Intel could have bought Nvidia at that time just like AMD acquired ATI, but unlike AMD, Intel opted not to, and now Intel must be crying out loud in some corner hehe
AMD actually did try to buy Nvidia back in the day; ATI was their second choice. The deal fell through because Jensen Huang wanted to run the company and be on the board.


Man, I'm so old that I remember all the hardware specs and news from decades ago, but I don't remember what I did yesterday, sigh...
 
AMD uses a far superior production process (7 nm) yet it can't compete with Nvidia on a 12 nm node. It is not even in the same ballpark.

I find it fascinating that you bring up process technology yet also conveniently omit the fact that Nvidia's highest-end offering has a staggering 80% more transistors. Basically, Nvidia has the performance lead right now because they made a chip twice as big; wow, imagine if it was in the same "ballpark".

Will fans of the green team ever understand how to properly compare hardware? Probably not, but one can only hope.
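For what it's worth, here is a quick back-of-the-envelope check of that 80% figure. The transistor counts below are the commonly quoted approximate numbers for TU102 (RTX 2080 Ti) and Navi 10 (RX 5700 XT), taken as rough assumptions rather than exact data:

```python
# Rough sanity check of the "80% more transistors" claim (figures approximate).
tu102_transistors = 18.6e9   # RTX 2080 Ti (TU102), ~18.6 billion transistors
navi10_transistors = 10.3e9  # RX 5700 XT (Navi 10), ~10.3 billion transistors

ratio = tu102_transistors / navi10_transistors
print(f"TU102 vs Navi 10: {ratio:.2f}x the transistors (~{(ratio - 1) * 100:.0f}% more)")
# -> roughly 1.81x, i.e. about 80% more transistors on the bigger chip
```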

my good old HD 6850 was inferior to Nvidia's offerings in the stability department

Yeah I bet, "rock solid stability" is definitely the first thing that pops into my head from that era, besides stuff like this: https://www.techpowerup.com/review/asus-geforce-gtx-590/26.html

I went to heat up the card and then *boom*, a sound like popcorn cracking, the system turned off and a burnt electronics smell started to fill up the room
 
I find it fascinating that you bring up process technology yet also conveniently omit the fact that Nvidia's highest-end offering has a staggering 80% more transistors. Basically, Nvidia has the performance lead right now because they made a chip twice as big; wow, imagine if it was in the same "ballpark".

Will fans of the green team ever understand how to properly compare hardware? Probably not, but one can only hope.
Would AMD make that 750 mm² chip possible with Navi when the 5700 XT is already close to 250 W?
 
Something else green team fans don't seem to understand is the relationship between power, die size scaling, frequency and voltages. Larger processors tend to be far more power efficient; it's the reason why a 2080 Ti, which has roughly 2.3 times the number of shaders a 2060 has, also doesn't need 2.3 times the power.



This phenomenon is a complete mystery to them.
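A rough illustration of that point, using the commonly listed reference specs as assumptions (the CUDA core counts and board TDPs below are approximate):

```python
# Shaders vs. rated power for a big and a small Turing card (approximate reference specs).
cards = {
    "RTX 2080 Ti": {"shaders": 4352, "tdp_w": 250},
    "RTX 2060":    {"shaders": 1920, "tdp_w": 160},
}

shader_ratio = cards["RTX 2080 Ti"]["shaders"] / cards["RTX 2060"]["shaders"]
power_ratio = cards["RTX 2080 Ti"]["tdp_w"] / cards["RTX 2060"]["tdp_w"]
print(f"Shader ratio: {shader_ratio:.2f}x")  # ~2.27x the shaders
print(f"TDP ratio:    {power_ratio:.2f}x")   # but only ~1.56x the rated power

# Shaders per watt: the wider chip comes out ahead because it can run its
# shaders at lower clocks and voltages (dynamic power scales roughly with f * V^2).
for name, c in cards.items():
    print(f"{name}: {c['shaders'] / c['tdp_w']:.1f} shaders per watt")
```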
 
Something else green team fans don't seem to understand is the relationship between power, die size scaling, frequency and voltages. Larger processors tend to be far more power efficient; it's the reason why a 2080 Ti, which has roughly 2.3 times the number of shaders a 2060 has, also doesn't need 2.3 times the power.
Old times, when the saying went that big GPUs don't need so much of everything. I don't know which since GPUReview shut down.
 
Something else green team fans don't seem to understand is the relationship between power, die size scaling, frequency and voltages. Larger processors tend to be far more power efficient; it's the reason why a 2080 Ti, which has roughly 2.3 times the number of shaders a 2060 has, also doesn't need 2.3 times the power.


This phenomenon is a complete mystery to them.
Is this the reason the RX 470 beats the Vega 64 in power efficiency? :roll:
And why would you not include one with the 5700 XT, I wonder...
 
Nvidia was even all for a third party porting GPU PhysX to run on Radeons, but AMD refused to provide proper support.

Nvidia wanted to create two types of gamers: those with an Nvidia GPU who would enjoy their games at full visual quality, and those who would have to settle for something visually inferior. We also saw how open Nvidia was in the following years with their GameWorks libraries: locked and proprietary. Also PhysX: those first years it was running extremely badly on the CPU, being totally unoptimized. If I remember correctly (after so many years), it was running on a single thread and using ancient x87 instructions. Its software version was meant to make the GPU version look like 10 times faster. So, why would AMD trust Nvidia and support a proprietary and locked standard? AMD supporting PhysX back then would have been a mistake. Make hardware PhysX a necessity in gaming and hope that a company like Nvidia will not stab you in the back? I think not.

Nvidia's true intentions were totally clear when it chose to lock PhysX even though it was probably totally independent of the primary GPU used. Probably because Ageia had developed it that way and Nvidia hadn't bothered to make it incompatible with anything other than Nvidia GPUs. So with a simple patch you could unlock PhysX and play those few games that supported hardware PhysX with an AMD primary GPU without any problems and with good framerates. I enjoyed Alice with a 4890 as the primary card and a 9600GT as the PhysX card. Super smooth, super fun. There was also a driver from Nvidia that accidentally came out without a PhysX lock. I think it was 256.xxx something.
Nvidia could have offered PhysX without support in those cases where the primary GPU was not an Nvidia one. They didn't.
 
And satan
Lol, I don't know what's funnier: the contents of this paragraph or the fact that it starts with "on topic".

I like the implication that profiting off any of those things is inherently bad. I don't feel it is if it's filling a need (like gaming during a pandemic), and at least for mining, AMD arguably profited more.
 
Also PhysX: those first years it was running extremely badly on the CPU, being totally unoptimized. If I remember correctly (after so many years), it was running on a single thread and using ancient x87 instructions. Its software version was meant to make the GPU version look like 10 times faster.
That was a bullshit myth.

So, why would AMD trust Nvidia and support a proprietary and locked standard?
You're ascribing information from 2010 to AMD's decision making process in 2008. In 2008 AMD had a lot more reason to distrust Intel than Nvidia and yet they had no problem supporting Intel's proprietary and locked standard (Havok).

AMD supporting PhysX back then would have been a mistake. Make hardware PhysX a necessity in gaming and hope that a company like Nvidia will not stab you in the back? I think not.

AMD didn't just "not support" PhysX, they also explicitly backed Havok against it. It was a business move through and through, one that was clearly made to hurt Nvidia and stifle their development, which it did... but it also deprived their customers of a potential feature and stifled the industry's development.

Nvidia's true intentions were totally clear when it chose to lock PhysX even though it was probably totally independent of the primary GPU used.
This came after AMD had already made their intentions clear that they wouldn't play ball.
 
You want me to go on a wild goose chase? Some Intel executive said that as a joke. I literally cannot find it at this time.
haha I like the expression :)
 
Yeah, he explains that they didn't have the time, resources, and knowledge to make it multithreaded and use the best options available. Your point? You have to realize that someone posting a huge explanation in public doesn't necessarily tell the whole truth.

You're ascribing information from 2010 to AMD's decision making process in 2008. In 2008 AMD had a lot more reason to distrust Intel than Nvidia and yet they had no problem supporting Intel's proprietary and locked standard (Havok).
Intel and AMD have been a duopoly in the x86 business for decades. Think Samsung and Apple: they fight in the courts and at the same time they do business together. Nvidia, on the other hand, was a newcomer with a vision where the GPU does the heaviest tasks in a PC while the CPU plays the role of a traffic policeman in the system. Nvidia was, and today clearly is, a common enemy to both of them.

AMD didn't just "not support" PhysX, they also explicitly backed Havok against it. It was a business move through and through, one that was clearly made to hurt Nvidia and stifle their development, which it did... but it also deprived their customers of a potential feature and stifled the industry's development.
Come on... a bullshit myth from Nvidia fans. We had physics effects long before Ageia. Nvidia took something that was free at the time, running on the CPU, and tried to make it a proprietary feature for its GPUs. The result was games with PhysX effects available only to those with an Nvidia GPU. The rest could "enjoy" a totally different game.

This came after AMD had already made their intentions clear that they wouldn't play ball.
Nvidia's lock had nothing to do with AMD's decision. If the GPU was not an Nvidia one, PhysX got disabled. I read about a case where someone with an Nvidia GPU was having problems enabling PhysX because he had a second USB monitor and the Nvidia driver was treating the USB display driver as a non-Nvidia graphics driver. Hilarious. Also, I believe Nvidia removed the lock a few years later; I think drivers after 2014 do not lock PhysX, but I might be wrong here.
In any case, Nvidia could have left the PhysX feature unlocked and simply thrown up a pop-up window during driver installation, informing users that there would be no customer support when a non-Nvidia GPU is used as primary, and that customer support and bug reports would be valid only for those using an Nvidia GPU as primary. Anyway, let me repeat something here: with a simple patch, PhysX was running without any problems with an AMD card as primary.
 
Yeah, he explains that they didn't have the time, resources, and knowledge to make it multithreaded and use the best options available. Your point? You have to realize that someone posting a huge explanation in public doesn't necessarily tell the whole truth.

Your claim, "Its software version was meant to make the GPU version look like 10 times faster", is bullshit. You're saying the codebase from before Nvidia's acquisition was designed to make GPUs look faster, despite not even having been ported to GPUs yet.

Intel and AMD have been a duopoly in the x86 business for decades. Think Samsung and Apple: they fight in the courts and at the same time they do business together. Nvidia, on the other hand, was a newcomer with a vision where the GPU does the heaviest tasks in a PC while the CPU plays the role of a traffic policeman in the system. Nvidia was, and today clearly is, a common enemy to both of them.

So AMD is fine with proprietary and locked standards as long as they dick over Nvidia. Gotcha.

Come on... a bullshit myth from Nvidia fans. We had physics effects long before Ageia. Nvidia took something that was free at the time, running on the CPU, and tried to make it a proprietary feature for its GPUs. The result was games with PhysX effects available only to those with an Nvidia GPU. The rest could "enjoy" a totally different game.

No they didn't... what is it with you guys and how quickly you resort to just blatant lying? Nvidia didn't change anything regarding the proprietary status of Ageia's technology, and it sure as hell wasn't any more "free" before than it was after. The effects that were ported to the GPU were the effects that would only run well on PPUs before. GPU PhysX effects could have been functional on Radeons if AMD had properly supported the porting; it's AMD's fault PhysX wasn't supported on their hardware in Alice and other games.

Nvidia's lock had nothing to do with AMD's decision. If the GPU was not an Nvidia one, PhysX got disabled. I read about a case where someone with an Nvidia GPU was having problems enabling PhysX because he had a second USB monitor and the Nvidia driver was treating the USB display driver as a non-Nvidia graphics driver. Hilarious. Also, I believe Nvidia removed the lock a few years later; I think drivers after 2014 do not lock PhysX, but I might be wrong here.
In any case, Nvidia could have left the PhysX feature unlocked and simply thrown up a pop-up window during driver installation, informing users that there would be no customer support when a non-Nvidia GPU is used as primary, and that customer support and bug reports would be valid only for those using an Nvidia GPU as primary. Anyway, let me repeat something here: with a simple patch, PhysX was running without any problems with an AMD card as primary.

Nvidia's lock came after AMD's decision to torpedo their efforts. You're saying Nvidia should have thrown AMD's customers a bone when even AMD wouldn't throw them one themselves and was simultaneously giving Nvidia the finger.
 
Your claim, "Its software version was meant to make the GPU version look like 10 times faster", is bullshit. You're saying the codebase from before Nvidia's acquisition was designed to make GPUs look faster, despite not even having been ported to GPUs yet.
All my posts here talk about Nvidia and PhysX. I don't care about the code before Nvidia; that's your conclusion because it suits you. But anyway, that link you posted proves the code was far from optimized. And Nvidia could have made it much more optimized, easily and fast. They are a software company, in case you don't know that. They had a ton of programmers and the experience to fix it, but they didn't.

Now, PhysX wasn't locked when it was meant to run on Ageia cards, before Nvidia took over. I would have objected to Ageia cards too, and so would you, if I had seen developers throwing all physics effects onto the Ageia card and forcing people to buy one more piece of hardware when there were already multicore CPUs to do the job.

By the way, saying ALL the time that the other person posts bullshit is a red flag. You are putting a red flag on yourself: that you are a total waste of time. You look like a brainless eight-year-old fanboy who just wants to win an argument when you keep saying that the other person posts bullshit. This is the simplest way to explain it to you.

So AMD is fine with proprietary and locked standards as long as they dick over Nvidia. Gotcha.
If this convenient explanation makes you happy, no problem. Why spoil your happiness?

No they didn't... what is it with you guys and how quickly you resort to just blatant lying? Nvidia didn't change anything regarding the proprietary status of Ageia's technology, and it sure as hell wasn't any more "free" before than it was after. The effects that were ported to the GPU were the effects that would only run well on PPUs before. GPU PhysX effects could have been functional on Radeons if AMD had properly supported the porting; it's AMD's fault PhysX wasn't supported on their hardware in Alice and other games.
You just don't want to understand. You have an image in your mind that Nvidia is a company run by saints who want to push technology and make people happy. Maybe in another reality. It's funny that GameWorks even hurt performance on older series of Nvidia cards, but hey, Nvidia would have treated AMD cards fairly with its locked and proprietary code. You reject reality and then you ask what it is with us? And who are we? Are we a group? Maybe a group of non-believers?

Nvidia's lock came after AMD's decision to torpedo their efforts. You're saying Nvidia should have thrown AMD's customers a bone when even AMD wouldn't throw them one themselves and was simultaneously giving Nvidia the finger.

Look at my system specs; my answer is there. I keep a simple GT 620 card in my system just so I can enjoy hardware PhysX effects in games like Batman, for example. When I enable software PhysX in a system with a 4th-gen quad-core i5 and an RX 580, the framerate goes down to single digits. That simple GT 620 is enough for fluid gaming. And no, I didn't have to install a patch, because as I said, Nvidia decided to remove the lock. Why did they remove the lock? Did they decide to support AMD cards by themselves? Throw a bone to AMD's customers? Maybe they finally came to an agreement with AMD? And by the way, why didn't they announce that lock removal? There was NO press release.
But things changed. The PhysX software became faster on the CPU; it had to, or Havok would have totally killed it, and only a couple of developers were choosing to take Nvidia's money and create a game with hardware PhysX, where gamers using Intel or AMD GPUs would have to settle for a game with only minimal physics effects. No developer could justify going hardware PhysX in a world with 4-, 6-, and 8-core CPUs. So Nvidia decided to offer a PhysX software engine that was usable. Its dream of making AMD GPUs look inferior through physics had failed.

As long as you reject reality, you will keep believing that we (the non-believers) post bullshit and lies.
 
It only sounds like speculation to those who don't know what Intel began to do in the early 2000s. They drove AMD out of the OEM and server business with billions in bribes; there was no way AMD would have survived that exodus up until today had they not bought ATI and started shipping APUs in consoles. At one point in time that was basically their only considerable source of income. Intel probably never predicted that AMD was going to buy ATI, nor what their intentions with it were.

This is an excellent point. AMD without the ATI purchase doesn't make it through the failure that was Bulldozer; the console contracts were the only thing keeping the lights on. What a scary world: no AMD, and $1k Intel quad-core CPUs.
 
All my posts here talk about Nvidia and PhysX. I don't care about the code before Nvidia; that's your conclusion because it suits you. But anyway, that link you posted proves the code was far from optimized. And Nvidia could have made it much more optimized, easily and fast. They are a software company, in case you don't know that. They had a ton of programmers and the experience to fix it, but they didn't.

Sigh... they did fix it, with PhysX 3. It was a significant overhaul and was completed as quickly as could be expected for something like that. Maybe you need to take another look at the timelines here.

Now, PhysX wasn't locked when it was meant to run on Ageia cards, before Nvidia took over. I would have objected to Ageia cards too, and so would you, if I had seen developers throwing all physics effects onto the Ageia card and forcing people to buy one more piece of hardware when there were already multicore CPUs to do the job.

You just described Ageia's pre-Nvidia business model verbatim...

By the way, saying ALL the time that the other person posts bullshit is a red flag. You are putting a red flag on yourself: that you are a total waste of time. You look like a brainless eight-year-old fanboy who just wants to win an argument when you keep saying that the other person posts bullshit. This is the simplest way to explain it to you.

You lied, objectively... why shouldn't you be called out?

You just don't want to understand. You have an image in your mind that Nvidia is a company run by saints who want to push technology and make people happy. Maybe in another reality. It's funny that GameWorks even hurt performance on older series of Nvidia cards, but hey, Nvidia would have treated AMD cards fairly with its locked and proprietary code.

WOW, so you're saying you know what's in my mind and you're going to tell me what I think and want... that's certainly not an insidious and disingenuous way to argue </s>

You reject reality and then you ask what it is with us? And who are we? Are we a group? Maybe a group of non-believers?

I guess you just jumped into the discussion without reading the other posts, that makes sense, that kind of lazy ignorance seems your speed.

Look at my system specs; my answer is there. I keep a simple GT 620 card in my system just so I can enjoy hardware PhysX effects in games like Batman, for example. When I enable software PhysX in a system with a 4th-gen quad-core i5 and an RX 580, the framerate goes down to single digits. That simple GT 620 is enough for fluid gaming. And no, I didn't have to install a patch, because as I said, Nvidia decided to remove the lock. Why did they remove the lock? Did they decide to support AMD cards by themselves? Throw a bone to AMD's customers? Maybe they finally came to an agreement with AMD? And by the way, why didn't they announce that lock removal? There was NO press release.
But things changed. The PhysX software became faster on the CPU; it had to, or Havok would have totally killed it, and only a couple of developers were choosing to take Nvidia's money and create a game with hardware PhysX, where gamers using Intel or AMD GPUs would have to settle for a game with only minimal physics effects. No developer could justify going hardware PhysX in a world with 4-, 6-, and 8-core CPUs. So Nvidia decided to offer a PhysX software engine that was usable. Its dream of making AMD GPUs look inferior through physics had failed.

As long as you reject reality, you will keep believing that we (the non-believers) post bullshit and lies.

Your notions are directly contradicted by the facts, how is calling that out rejecting reality?
 
Sigh... they did fix it, with PhysX 3. It was a significant overhaul and was completed as quickly as could be expected for something like that. Maybe you need to take another look at the timelines here.
Oh my. Here we go again.

Well, I checked the date. SDK 3.0 came out in June 2011. I guess programmers also need time to learn it and implement it, so when did games using it come out? Probably when it was clear that hardware PhysX wasn't meant to become a standard.

Try again.
You just described Ageia's pre-Nvidia business model verbatim...
Ageia didn't have the connections, money, or power to enforce that, so even if they had wanted to do that, they couldn't. Also, PPUs were not something people were rushing to buy, so developers wouldn't cripple the game for 99% of their customers just to make 1% happy. Nvidia was a totally different beast, and they did try to enforce physics on their GPUs. You are NOT reading, or you just pretend not to read, what I post.
You lied, objectively... why shouldn't you be called out?
It seems that you are a waste of time after all. What you don't like is not a lie. I could call you a liar too. But I am not five years old.
WOW, so you're saying you know what's in my mind and you're going to tell me what I think and want... that's certainly not an insidious and disingenuous way to argue </s>
Oh, come on. You keep posting like a five-year-old. I'm just too bored to quote the parts of your posts where you make assumptions about what I think, what I mean, and where I supposedly intentionally lie.
I guess you just jumped into the discussion without reading the other posts, that makes sense, that kind of lazy ignorance seems your speed.
You do reject reality. As for ignorance, it's your bliss.
Your notions are directly contradicted by the facts, how is calling that out rejecting reality?
So you had nothing to say here.

Well, nice wasting my time with you. Have a nice day.
 
Intel's income is higher than NV's revenue...

Am I missing something about the whole "AI business" or is it about rather straightforward number crunching?

ATI is now 7 years behind Nvidia
By which brain-damaged metric? Dear God...
 
Intel is a company that tries every trick in the book. They take their leisure putting their right foot forward until they've thrown the kitchen sink and the bathtub at the problem; that is about when they have decided on the question at hand.


That is some of the dumbest stuff I have ever read in my life.

Even worse is that they turned down Apple's pitch for the CPU in the first iPhone. Intel did not see the potential of smartphones at the time. ARM is now a major long term threat to their existence.

Intel has form when it comes to these sorts of missed opportunities.
This was one of the most arrogant, short-sighted decisions in the history of tech. Intel could have owned mobile computing.

AMD is no underdog. Check the dictionary for the meaning of the word:

[Attachment: dictionary definition of "underdog"]
Until AMD can make inroads into server rooms, where they have maybe 10% penetration, and with OEMs, where they have even less than that, they are still HUGE underdogs. Retail sales are like 1% of the overall CPU market, if that.
 
AMD arguably profited more.
I'd argue the retailers & miners profited a heck of a lot more; if AMD had started a mining farm back then, I bet they'd have made 10x the profit during the (peak) mining booms.
 
I'd argue the retailers & miners profited a heck of a lot more,

No. Just no. AMD's cash flow is much larger than that. Hell, Bitcoin's market cap isn't even really all that huge when you're talking corporation money. Let me put it this way: if Intel wanted to, they could just buy all the bitcoins and end it.

AMD probably could too, though it would likely end them.

The disparity between the big money like that and Bitcoin is actually quite large.
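To put very rough numbers on that sense of scale (the figures below are approximate mid-2020 ballpark assumptions, not exact data):

```python
# Order-of-magnitude comparison only; all figures are rough mid-2020 assumptions.
btc_price_usd = 9_200        # BTC traded around this level in July 2020
btc_supply = 18.4e6          # approximate circulating supply at the time
btc_market_cap = btc_price_usd * btc_supply   # ~ $170B

intel_market_cap = 250e9     # Intel's market cap was roughly in this range

print(f"Bitcoin market cap: ~${btc_market_cap / 1e9:.0f}B")
print(f"Intel market cap:   ~${intel_market_cap / 1e9:.0f}B")
# Market cap isn't cash on hand, and buying coins would move the price,
# so this only gives a feel for the orders of magnitude involved.
```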
 
You mean during the two mining peaks, when the worldwide crypto market cap soared 5x-10x in months? Some currencies saw even greater gains. I don't have the exact numbers from back then, but I do remember AMD's GPU division didn't report anywhere near the kind of profits you saw from crypto, and yes, I realize electricity costs play a major role in it. Of course, the smart(er) money probably invested in crypto rather than opening giant farms.
 