Wednesday, March 4th 2009
NVIDIA to Try and Develop x86 CPU in Two to Three Years
The graphics market already seems too small for NVIDIA, so the green corporation is looking further ahead to building an x86 processor in the near future. At the Morgan Stanley Technology Conference in San Francisco yesterday, the company revealed plans to enter the x86 processor market with an x86-compatible system-on-chip in the next two to three years. Michael Hara, NVIDIA's senior vice president of investor relations and communications, commented:
Source: Blog on EDN
"I think some time down the road it makes sense to take the same level of integration that we've done with Tegra ... Tegra is by any definition a complete computer on a chip, and the requirements of that market are such that you have to be very low power, very small, but highly efficient. So in that particular state it made a lot of sense to take that approach, and someday it's going to make sense to take the same approach in the x86 market as well."

He also said that NVIDIA's future x86 CPU wouldn't be appropriate for every segment of the market, especially the high end of the PC market, which includes gaming systems, graphics engineering stations, and many others. The x86 chip will mainly be targeted at smaller system-on-chip platforms. No other details were unveiled at the time of this publication. It's also very early to talk about something that's still on paper.
51 Comments on NVIDIA to Try and Develop x86 CPU in Two to Three Years
Also, this is not that big of a surprise. There have been rumors about this going back at least two to three years, to the autumn of 2006, when it was even rumored that nVidia might buy AMD. Instead, around that time nVidia hired a bunch of former Stexar (ex-Intel) x86 engineers. I'm guessing that's also when nVidia started tinkering with the x86 architecture, at least behind closed doors.
But I too agree that it will all be in vain due to x86 licensing. If game developers all jump onto the x86 ship, nVidia is pretty much DEAD. AMD/ATI have their hands on the license, and hence can make combined CPU/GPU cards, but nVidia will be left drowning in an abandoned architecture. :(
"I bet nVidia is sweating over their efforts in 3D gaming, only to have them swept away with the coming of Larrabee." Seriously? I hope you're being sarcastic.
The high-end market might be the main market here on TPU, but Intel is not stupid enough to believe that's what makes them rich! :) Do you really believe that the 40,590 members of TPU even own X-editions of Intel's line-up? Heck no. Intel gets rich on the CPUs that actually sell: Atoms, the e1/2/4/5/700-series CPUs, and some of the lower Q's. Period.
It is like saying that Dodge was only successful because of the Viper... it was the Neons that fattened the pockets. :D
PowerVR HSR FTW!
Bryan D.
I really don't know why they (nVIDIA) want to risk destroying their reputation by competing in x86 CPU development. (I'm not trying to offend anyone, but nVIDIA is doing a good job by just developing GPUs!) Because ATI is still my favorite. :rockout:
Anyway, if you read my post more carefully instead of trying to understand it from your ass, this same thing is explained in the post itself, where I say "high-end" (with quotes): $100-500 CPUs, $30-100 chipsets... I think it's clear that I'm talking about "high-end" CPUs as opposed to cheap embedded solutions like Atom, Nano, etc. The fact that I didn't talk about $300-1500 CPUs (half the people on TPU are in that range...) already contradicts your stupid reply. I can stand criticism, but not from someone who didn't bother to read and understand my post. NOPE.
Really, I don't know how else to explain to you that, from a modern market point of view, even a cheap $50 e1/2/4 CPU is already high-end compared to a $10 Atom. (Atom is way overpriced right now; competition and simple market trends will bring it down there soon.)
On a $50 CPU Intel can keep $10 for themselves; on a $500 one they can keep $100-200 (does it matter if they sell 50-fold fewer of them? NO); on a $10 CPU they can't keep shit in comparison, and as I said, the market is not growing at a pace fast enough to offset that fact.
Nettops are going to cannibalize the market that feeds Intel. I repeat: Intel owns the market already, and a change doesn't benefit them, just as the previous trend change towards cheaper CPUs didn't benefit them either. On the other hand, a new market trend favors almost every other manufacturer, because it's a means of escape from a market where they cannot compete to one where the field is level for everybody. AMD, VIA, and Nvidia (if they finally build a CPU) can't compete in the desktop market (AMD does its best, but barely), and nettops, ULPCs, and the like could save them. Intel is scared by the fact that, with the current trend, "only" enthusiasts will buy their desktop CPUs in the not-so-distant future, and that includes $50 CPUs. How did you put it? Ah, PERIOD.
I want nVidia CPUs, especially if they can be overclocked!
On Nvidia: everyone is freaking out over nothing. If and when Nvidia gets into the CPU market, it will be cell phones, smartphones, things like that; in 20 or so years we'll probably see an Nvidia CPU.
OK. For the nth time. And translated for the "THE FRIGGIN ATOM IS ALL THAT IS USED CURRENTLY" crowd:
- Every nettop sold with an Atom is one less e7200 + G31 mobo sold by Intel.
Which at the same time can be translated as:
- Every $30 given to Intel is one less $150 given to Intel.
Do any of you guys understand this simple thing yet? Because I have no other damn way of saying it...
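Fine, one more way of saying it then, this time in code. A quick Python sketch of the substitution math; note the $30 and $150 figures are just the rough examples from the post above, not real Intel financials:

[code]
# Sketch of the cannibalization argument above. The figures are the
# post's own rough examples (Atom nettop vs. e7200 + G31), not real data.

ATOM_REVENUE = 30      # post's rough figure for Intel's take on an Atom nettop
DESKTOP_REVENUE = 150  # post's rough figure for an e7200 CPU + G31 chipset

def forgone_revenue(nettops_sold: int) -> int:
    """Revenue Intel gives up if each nettop replaces one desktop sale."""
    return nettops_sold * (DESKTOP_REVENUE - ATOM_REVENUE)

# A million buyers choosing Atom nettops over desktops means $120 less
# per buyer, i.e. $120M in forgone revenue:
print(forgone_revenue(1_000_000))  # 120000000
[/code]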
1. Try to gain a license for x86.
---a. Success! Start making money competing in both markets! GO TO 2.
---b. FAIL! File an anti-competition complaint. GO TO 3.
2. Make money and compete directly on both fronts with their rival AMD. END.
3. Complaint filed in the courts.
---a. Success! Start competing with AMD on both fronts. END.
---b. FAIL! AMD 5xxx series GPU > NV GTX 300 series. CEO commits suicide, NV files bankruptcy.
I work in the US Army's 5th Special Forces Group at Ft. Campbell, Kentucky. We have contracts with both Dell and Intel worth far more than consumer market demand, and we overpay, drastically. Intel likes dabbling in the consumer market, but it is contracts like these that make them money. Our contracts with AMD are only for server processors, and only dual-cores at that, which hold only ~10% of the share.
I wonder if Nvidia is still considering ION, but with the VIA Nano instead.
EDIT: And don't underestimate Nvidia in that market you mentioned. Maybe not in the army, but in medical environments and in topography CUDA has made a very good impression, doing things Intel can only dream of, at least for now. CUDA is already being deployed there, and as for the army, a GPGPU solution could do a lot of things better than a CPU: GPS image tracking or pattern finding, for example. What you describe is the CURRENT reality, but Intel is scared of the future. Things are bright for them now, and they want them to stay that way forever. Intel right now is the child that has never seen bad days (or has long forgotten them) and is scared of what might lie ahead.
My god. While that is true to a point, and was more true some time ago than it is today, it definitely is not true to the extent that prices suggest. The cheapest and the most expensive Quads (about $150 and $1500) cost exactly the same to develop (it's the same chip on the same wafer) and almost exactly the same to test; the excuse for selling some of them high is that they are selected chips that can clock higher, have somewhat better properties, etc. / end of theory
Now take Core 2: it just happens to OC like a charm, to almost 3.8 GHz regardless of its original clock. Could it have been clocked at 3 GHz in the first place? Of course, my friend. Do they make a profit selling them at $130? Of course, my friend. Then could Intel sell $150 Quads clocked at 3 GHz+ and still make a profit? Of course, my friend. It's common business, but in no way is it because they are harder to make. :laugh:
* Higher-end CPUs DO tend to OC better, but is it truly because they are selected pieces, or because they have a higher multiplier? I just ask. :rolleyes: Yeah, agreed. And because of that, what happens when that IP starts losing its value, as is the case with x86 CPUs? :cry:
EDIT: You forgot to list "they force the adoption of that thing" and "they ensure exclusivity by not licensing to companies that can do it better" in that last sentence, though.
Overall, everything you said was only half true, like the idea that you make money on volume. You make money on volume and on margins. No margin = no profits. High-end parts have a much higher margin (like 10 or 20 to one), so even if by volume the low end is 10 times bigger, in profits it's only about twice as big. And the most profitable segment is always the mainstream anyway, and it's that market segment which is really at risk. Ion, while it could be considered super-low-end, does threaten some of Intel's mainstream solutions, because its graphics are better than Intel's best. If it were a success, that could lead to Nvidia moving that scheme to higher levels and would, in fact, erase the need for ANY mainstream CPU. Intel would then be left with only the enthusiast segment (which will never disappear, and no one, not even Nvidia, said it would) and the new low segment, which would be the Atom (and successors) and not the $50-100 CPUs of today. Not the bright future Intel wants, that's for sure. That's the main reason they are trying their best to block Ion without making too much noise.
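To put toy numbers on the margin-versus-volume point, here's a quick Python sketch. The 15:1 margin ratio and 10:1 volume ratio are picked from the illustrative ranges in the post, not from real market data:

[code]
# Toy comparison of segment profit = per-unit margin x unit volume.
# Ratios are illustrative figures from the post, not real market data.

LOW_END_MARGIN, LOW_END_VOLUME = 10, 10_000_000    # ~$10/unit, high volume
HIGH_END_MARGIN, HIGH_END_VOLUME = 150, 1_000_000  # ~15x margin, 10x less volume

low_end_profit = LOW_END_MARGIN * LOW_END_VOLUME      # $100M
high_end_profit = HIGH_END_MARGIN * HIGH_END_VOLUME   # $150M

# Despite 10x the volume, the low end doesn't automatically dominate
# profits; the margin ratio decides which segment wins.
print(low_end_profit, high_end_profit)  # 100000000 150000000
[/code]

With these ratios the two segments end up the same order of magnitude, which is exactly the point: volume alone doesn't decide profits.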
I think Darkmatter needs to take a deep breath, and just step away from "Darkmatter land" for a moment.
If you really believe Intel only relies on its high-end CPUs to drive its profits, then go ahead and believe what you want. But the vast majority of corporate environments, home businesses, educational institutions, and personal computers that I have seen with my own eyes do not use high-end parts; these are what make up Intel's profits.
Well, I am finished here. If you want to continue to spit and scream at your monitor, go ahead. :)
Darkmatter, go see some family and loved ones.
Bryan D.
I mean, I've been giving numbers, and I've been giving the names of the CPUs I considered high-end in this discussion (ah, and only within the context of this discussion; I'm still not crazy, thanks). Namely the e7200 and G31. :banghead:
And BTW, I'm very proud of the companies, colleges, and government in Spain right now, because they have much better computers than what you are suggesting, my friend. Better than the ones I have listed above...
EDIT: Ah, I didn't read that the first time; it's pretty clear what kind of thing follows "I am finished here." But it was so funny that I had to reply to it. :laugh: You are naive at best if you think anything said here or on any forum can rattle me. Just because I use the language I use doesn't mean I'm worked up or anything. It means, well, that I am Spanish... maybe? More so... from Bilbao? (Don't worry, if any Spanish people come in, they will understand why this matters :D)