Wednesday, April 1st 2009
Intel Losing CPU Market-Share to AMD
With the introduction of the K8 architecture years ago, AMD made significant inroads into Intel's CPU market share. That growth ceased with Intel's introduction of the competing Core microarchitecture, after which AMD slid into deep financial trouble. The company recently spun off its manufacturing division to form GlobalFoundries, with investment from the Advanced Technology Investment Company.
With the introduction of the 45 nm Phenom II series of processors, however, demand for AMD has grown sharply, led by the Phenom II X3 700 series triple-core and Phenom II X4 920 quad-core desktop processors. The surge in demand follows recent price cuts by the company. Motherboard vendors forecast the overall global market share of AMD desktop processors to grow by 30 percent in Q2 2009. With its current market share conservatively estimated at around 20 percent, that growth would lift the figure to about 26 percent. The company plans to further expand its desktop CPU lineup with an entry-level desktop platform before September.
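For clarity, the forecast 30 percent is relative growth in share, not percentage points; taking the roughly 20 percent baseline as given, the quoted figure follows from:

$0.20 \times (1 + 0.30) = 0.26 \approx 26\%$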
Source:
DigiTimes
115 Comments on Intel Losing CPU Market-Share to AMD
I thought all CEOs did was play golf and put stamps on documents. How naive of me!
j/k, of course. I know better than that.
Still, it's quite surprising that once Hector was gone, AMD started to get better. Either Hector was bad mojo, or... you know, he was just bad at what he was supposed to do.
How he got his fame, that's beyond me.
It is nice to see AMD back in the "game" and applying pressure to Intel :toast:
There are plenty of links about it, and the ones that go into depth explain it quite well: 10.1 removes the need for extra rendering passes for some effects, the same effects that gave the performance boost to ATI cards.
So you can read up on this and get the FACTS, not the excuses Ubi used to placate NVIDIA.
techreport.com/discussions.x/14707 - basically it was removed to erase the advantage ATI had shown thanks to their cards supporting 10.1, when NOTHING NVIDIA had, or even has today, can support 10.1 (true DX10).
10.1 in Assassin's Creed is actually legitimate, because in DX10.1 you can simply reuse the depth buffer instead of doing a second pass.
Again, something NVIDIA cards can't do, because NVIDIA didn't want to support true DX10 (hence MS cutting the DX10 spec and having to bring out DX10.1 later).
ATI, on the other hand, had true DX10 (now called 10.1) support with the HD 2000 cards, but... well, NVIDIA didn't want to follow MS's spec, and cried enough that MS backed down and removed the features NVIDIA couldn't/wouldn't support.
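To make the "one fewer pass" point concrete, here is a minimal C++ sketch of the frame flow being described. The helpers (render_geometry_pass, rerender_depth_to_texture, run_post_effects) are hypothetical stand-ins, not real Direct3D calls; the only point is where the extra depth pass sits in a DX10.0 renderer and why it can be dropped on DX10.1, which lets shaders read the existing depth buffer directly.

```cpp
// Illustrative sketch only: the helpers below are hypothetical, they just
// model the render flow being discussed, not the actual Direct3D 10/10.1 API.
#include <cstdio>

struct DepthBuffer {};  // stands in for the depth target of the main pass

DepthBuffer render_geometry_pass()      { return {}; }  // draws the scene, fills depth
DepthBuffer rerender_depth_to_texture() { return {}; }  // extra pass: writes depth into a readable texture
void run_post_effects(const DepthBuffer&) {}            // effects that need to sample scene depth

void frame_dx10_0() {
    DepthBuffer depth = render_geometry_pass();
    (void)depth;  // can't be sampled later (notably once multisampled), so...
    DepthBuffer readable = rerender_depth_to_texture();  // ...a second depth pass is needed
    run_post_effects(readable);
}

void frame_dx10_1() {
    DepthBuffer depth = render_geometry_pass();
    run_post_effects(depth);  // 10.1 allows the same depth buffer to be reused directly
}

int main() {
    frame_dx10_0();
    frame_dx10_1();
    std::puts("the 10.1 path skips the extra depth pass");
    return 0;
}
```

That also matches the observation later in the thread that the difference shows up once AA is enabled, since reading a multisampled depth buffer in shaders is exactly what DX10.1 added over DX10.0.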
Mind you, I'm on an 8800 GTS 512... so don't say I'm an NVIDIA hater. I love this card, but I don't love the actions of the company behind it.
www.pcgameshardware.de/aid,645430/Ubisoft-No-Direct-3D-101-support-for-Assassins-Creed-planned/Assassins-Creed/News/ - if that doesn't look like somebody making excuses for patching out something that offers a benefit to the "other team", I don't know what you've been smoking...
1) Most games are still designed for DirectX 9.0c so they don't lose the enormous customer potential of Windows 98 through Windows XP.
2) DirectX 10 support is usually coded as an alternate in software (it's easier on the hardware when run on Vista). That is, it is more or less the same as DirectX 9.0c. Very, very few developers go out of their way to focus on DirectX 10 support (i.e., games released exclusively for DirectX 10).
3) DirectX 10, being mostly useless from the sales and development standpoint, carries over to DirectX 10.1; however, even fewer people have DirectX 10.1 support than DirectX 10.
4) Ubisoft developed the game with DirectX 10.1 in mind. First, they saw that NVIDIA announced they had no plans to support DirectX 10.1. Then they ran into problems themselves with the DirectX 10.1 code path in the game after it was launched. They decided that about 1 out of every 10 cards playing the game could handle DirectX 10.1 and decided it would cost too much to fix the botched code in comparison to just removing it altogether.
And that's pretty much it. It wasn't worth fixing so they removed it. NVIDIA's dominance and them saying they won't support DirectX 10.1 may have something to do with deciding it wasn't worth fixing but, as with most publishers, it comes down to cost. The cost to fix it exceeded the amount they were willing to pay so they just got rid of it.
Since Ubi put DX10.1 in Tom Clancy's HAWX
I have personally seen the game on ATI hardware vs. my 8800 GTS; it looks and runs better on a 3850/3870, or even a 2900 XT, than it does for me on my 8800 GTS 512 MB (755/1900/2200) once AA is enabled.
The R600 and higher are TRUE DX10 (what's now called 10.1) cards; the 4000-series cards add back some features of DX9 cards (hardware AA support instead of doing it all in shaders).
Had NVIDIA not refused to support true DX10 and convinced MS to dumb it down, they would have benefited from one less rendering pass being needed. But NV refuses to support 10.1, and when it showed a benefit for ATI in a game NV supported (whether with cash, advertising, or hardware), NV was PISSED and got Ubi to remove it...
It's not a conspiracy theory, it's just business, and NV doing what I would call a dirty trick on the public at large, even their own customers.
Hell, the G80, G92, GT200, and we still don't see DX10.1 out of NVIDIA. They COULD do it, but it would take more work than just re-using stuff they already have :/
Plain and simple, it's a conspiracy theory made up by ATI fanboys to make themselves feel better. Nv never paid off the multitude of other vendors whose TWIMTBP titles ran better on ATI hardware.
It's all a bunch of BS.
MS, I'm sure, hoped that by cutting back DX10 and allowing the G80 to be a true DX10 card (by changing what DX10 really was), they would be able to draw more people to Vista and DX10. It didn't work, mostly due to bad press and the fact that pre-SP1 Vista was a buggy pain in the ass to deal with.
You can compare the image quality of DX9, 10, and 10.1 in Assassin's Creed yourself and see that there's no problem. You can read the DX10.1 spec and see that what they referred to (the "missing rendering pass") is also a SPECIFIC FEATURE of DX10.1 that makes it more efficient than DX10, by allowing the depth buffer to be re-used instead of needing a second rendering pass.
Again, if you look at the statements Ubi made when interviewed about it, they don't hold up; they are vague or use double-talk to avoid telling people what the real reason is.
To me it comes off as them saying whatever they have to in order to justify removing something that works fine for ATI owners.
It doesn't affect me directly, as through this whole time I have had a G92 card, yet you say I'm an ATI fanboy because I don't just accept the excuses Ubi and NVIDIA put out for their actions.
Like NVIDIA saying they didn't put 10.1 support in the GTX 260 and GTX 280 cards because "nobody's using 10.1" - then why even bother supporting DX10 at all? NOBODY is truly using DX10, because it would cut off too large a portion of the market: the people running 2K/XP with DX9 hardware. They could have just made a really bitchin' DX9 card, since nobody's really using 10... but that would look really insane... (hell, it looks insane to me that they put out extremely high-priced cards with no DX10.1).
But hey, you must be right, NVIDIA can do no wrong after all... :rolleyes:
Personally, I have seen the stuff NV has pulled over the years, and despite really liking my current card and being impressed by NVIDIA's current driver development, I don't think they are what you seem to think they are. They are not flawless, and they are not above bribery and other dirty tricks to keep their lead in benchmarks.
I guess you also think the Doom 3 "conspiracy" was thought up by ATI fanboys?
To refresh your memory: NVIDIA and id worked together and intentionally put in code that ran like SHIT on ATI hardware. They used texture lookups instead of shader code; NVIDIA hardware did texture lookups insanely well back then, while ATI's hardware did shader work insanely well. By editing one file and replacing the texture-lookup code with equivalent shader code, ATI cards became FASTER than NVIDIA cards with no quality difference (but those changes also slowed NVIDIA cards down even more than texture lookups slowed ATI cards down).
In the end ATI put a fix in their drivers to get around the "problem". Clearly, if you look at what was done, it wouldn't have been hard to put both paths in the game and have it auto-detect ATI vs. NVIDIA and use the proper path for each card, but they didn't...
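For anyone who never saw the tweak: the shipped shader evaluated its specular falloff through a precomputed lookup texture, and the community fix swapped that lookup for the equivalent math in the shader. Below is a minimal C++ sketch of the two paths; the table size and specular exponent are illustrative assumptions, not id's actual values, and plain CPU code stands in for the shader here.

```cpp
// Sketch of "lookup table vs. doing the math" - assumed values, not id's code.
#include <array>
#include <cmath>
#include <cstdio>

constexpr int   kTableSize = 256;   // assumed resolution of the lookup texture
constexpr float kExponent  = 16.0f; // assumed specular exponent

// Precompute pow(x, kExponent), standing in for the specular lookup texture.
std::array<float, kTableSize> build_specular_table() {
    std::array<float, kTableSize> t{};
    for (int i = 0; i < kTableSize; ++i)
        t[i] = std::pow(float(i) / float(kTableSize - 1), kExponent);
    return t;
}

// "Texture lookup" path: nearest-sample the precomputed table.
float specular_lookup(const std::array<float, kTableSize>& table, float n_dot_h) {
    int idx = int(n_dot_h * (kTableSize - 1) + 0.5f);
    return table[idx];
}

// "Shader math" path: evaluate the same curve directly.
float specular_math(float n_dot_h) {
    return std::pow(n_dot_h, kExponent);
}

int main() {
    const auto table = build_specular_table();
    const float n_dot_h = 0.9f;  // example N.H value
    std::printf("lookup: %.5f  math: %.5f\n",
                specular_lookup(table, n_dot_h), specular_math(n_dot_h));
    return 0;
}
```

Which path is faster depends entirely on the hardware: a chip with cheap texture fetches favors the table, while a chip with strong math throughput favors the pow(), which is the whole argument here.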
This stuff has happened many times over the years.
Tiger Woods' first golf game, for example, wouldn't run in 3D mode on non-NVIDIA cards; you could trick it into running in full 3D mode with all features by using an app to change the device ID to that of an NVIDIA card.
And that was an early TWIMTBP title, and they have continued to do that kind of stuff over the years. Hey, it's a good marketing move if you don't get caught, as they did with AC, Doom 3, and Tiger Woods (just three examples).
I mean, if you can keep your performance higher than the competitor's for the first months of benchmarking, you're set; if you can keep it going longer, you're golden.
If you get caught, you just get the company to say the game/app needs patching because of flawed code, or some other excuse.
And Ubi never said a render pass was missing, like the DX10.1 feature you are referring to. They said their implementation was buggy. If you want to take that as a conspiracy against ATI by NV and Ubi, be my guest.
And none of what you are saying has any solid backing in terms of evidence. No proof of motive exists. No, NV is not an angel of a company, nor is MS, Intel, or AMD. They are all guilty of something shady at any given point in time, but just because a game carries the TWIMTBP tag does not mean the developer is doing anything to hurt ATI. Yes, they optimize for NV, because NV provides them the means to do so, but they don't sabotage ATI like so many want to believe.
People that only buy AMD (for whatever reason) finally found a great competitive processor with Phenom 2.
So the "AMD only" group wasn't very motivated until the Phenom 2, for an upgrade. Most upgraded from the dismal original phenom or the good old ground breaking X2 939 or AM2.
I buy AMD and Intel. Why would you limit yourself to only one or the other. Its not a sports team.... its a processor.
PS I am just saying that Intel was ahead of the game by alot from the Core2 launch until Phenom 2 finally caught up but is still behind Core i7.
many times you see "unexplainable" perf issues with one or the other companys hardware for no apparent reasion, i mean HL2 vs doom3, well ati is just better at d3d and also the game/engine was optimized at least at the time for ati, BUT it also had rendering path optimizations that helped some nvidia cards run better as well, Doom3 had a spicific peice of coding that ran VERY poorly on ati hardware, somebody found the fix and posted it(then ati's driver dept figuared out how to fix it in drivers with that info)
Id is one of those companys I use to have nothing but respect for, they use to be very even handed, they would add optimizations for most common readely avalable hardware, 3dfx,ati,nvidia,hell even powervr got support in quake1 and 2, then came doom3........
there are things I will accept as optimizations and things I wont accept as purely being optimizations, doom3 is one title that was clearly coded with extream bias to nvidia(it would have been easy to have put both code paths in) AC, well from what i read myself its very clear that nV presured Ubi to "fix" their problem, the easiest fix was to just dissable/remove dx10.1 and say it was flawed/borked........
And again, still no proof exists that Ubi pulled 10.1 as a favor to nV.
I know that affected 3xxx series ATi, but maybe 4xxx fixed the mistake?
And the optimizations for Doom 3 took a user very little time to figure out. If you would like, I could link the post on MegaGames about it...
www.megagames.com/news/html/pc/doom3enhancetheexperiencept2.shtml - there are more advanced versions, but the MegaGames one is easy to find, which is why I use it :)
Fact is, as you can see, the changes were EASY to make and made a HUGE difference in performance for ATI cards, but id didn't include such shader/math-based code, because NVIDIA cards did texture lookups faster than they did math (at the time).
Yeah, and that quote in no way goes against what I said about them optimizing for nV, but not sabotaging ATI. No matter how you look at it, it's not sabotage to NOT program for something's strong points. There is no conspiracy.