Wednesday, April 1st 2009

Intel Losing CPU Market-Share to AMD

With the introduction of the K8 architecture years ago, AMD made significant gains in CPU market-share at Intel's expense. That growth ceased with Intel's introduction of the competing Core microarchitecture, after which AMD slid into deep financial trouble. The company recently spun off its manufacturing division to form Globalfoundries, with investment from the Advanced Technology Investment Company.

With the introduction of the 45 nm Phenom II series of processors, however, demand for AMD has grown sharply, led by the Phenom II X3 700 series triple-core and the Phenom II X4 920 quad-core desktop processors. The surge in demand follows recent price-cuts by the company. Motherboard vendors forecast the overall global market-share for AMD desktop processors to grow by 30 percent in Q2 2009. Starting from a conservative estimate of around 20 percent today, that growth would lift the figure to roughly 26 percent. The company plans to further expand its desktop CPU lineup with the introduction of an entry-level desktop platform before September.
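As a quick check of that projection, here is a minimal sketch of the arithmetic, taking the 20 percent base and 30 percent growth figures quoted above as the assumptions:

#include <cstdio>

int main()
{
    const double current_share   = 0.20; // conservative estimate of AMD's current desktop CPU share
    const double forecast_growth = 0.30; // motherboard vendors' forecast growth for Q2 2009
    // A 30 percent rise on a 20 percent base works out to roughly 26 percent.
    std::printf("Projected share: %.0f%%\n", current_share * (1.0 + forecast_growth) * 100.0);
    return 0;
}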
Source: DigiTimes

115 Comments on Intel Losing CPU Market-Share to AMD

#101
FryingWeesel
Wile E: And I guess innocent until proven guilty means very little to you?

Yeah, and that quote in no way goes against what I said about them optimizing for nV, but not sabotaging ATI. No matter how you look at it, it's not sabotage to NOT program for something's strong points. There is no conspiracy.
Dunno where this innocent till proven guilty crap comes from; surely not the US legal system. I have enough experience with that to tell you it's guilty till proven innocent.

If you read all of the quote, basically they did something they knew would run poorly on ATI cards when they could have just included both. Optimizing for one by doing something that will hamper performance on another is, in my eyes, bullshit.
#102
Wile E
Power User
FryingWeesel: Dunno where this innocent till proven guilty crap comes from; surely not the US legal system. I have enough experience with that to tell you it's guilty till proven innocent.

If you read all of the quote, basically they did something they knew would run poorly on ATI cards when they could have just included both. Optimizing for one by doing something that will hamper performance on another is, in my eyes, bullshit.
Where does it say they left it out on purpose? And by adding that code, they would also have to add some sort of detection and switch routine to the code. I don't see how that is sabotage.
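For context, here is a minimal sketch of the kind of detection-and-switch routine being described, assuming Windows with the DirectX SDK and linking against d3d10_1.lib. The RenderPath enum and DetectRenderPath function are hypothetical names used only for illustration; D3D10CreateDevice1 and its feature-level constants are the actual Direct3D 10.1 entry points.

// Sketch only: probe for DirectX 10.1 hardware, fall back to DX10 otherwise.
#include <windows.h>
#include <d3d10_1.h>

enum RenderPath { PATH_DX10, PATH_DX10_1 };

// Attempt to create a feature-level-10.1 device; success means the 10.1 path can be used.
RenderPath DetectRenderPath()
{
    ID3D10Device1 *device = NULL;
    HRESULT hr = D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL, 0,
                                    D3D10_FEATURE_LEVEL_10_1, D3D10_1_SDK_VERSION,
                                    &device);
    if (SUCCEEDED(hr) && device != NULL)
    {
        device->Release();   // the probe device is no longer needed
        return PATH_DX10_1;  // hardware and driver expose 10.1: take the 10.1 render path
    }
    return PATH_DX10;        // otherwise fall back to the plain DX10 path
}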
#103
DaedalusHelios
That's like saying that games using Havok are fighting PhysX development, and therefore Nvidia.

If the programmers don't have the time or money to spend on optimizing, you can throw them some money to get it done. If ATI had helped with funding too, I am sure both would have been running on par.

I don't see why it has to be a conspiracy. ;)
#104
eidairaman1
The Exiled Airman
I've known TWIMTBP titles to actually play extremely well on ATI parts.
Wile E: I didn't call you a fanboy. I said fanboys made it up. Did you make it up?

And Ubi never said a render pass was missing, like the DX10.1 feature you are referring to. They said their implementation is buggy. If you want to take that as a conspiracy against ATI by nV and Ubi, be my guest.

And none of what you are saying has any solid backing in terms of evidence. No proof of motive exists. No, NV is not an angel of a company, nor is MS, Intel, or AMD. They are all guilty of something shady at any given point in time, but just because a game has the TWIMTBP tag on it does not mean that the developer is doing anything to hurt ATI. Yes, they optimize for nV, because nV provides them the means to do so, but they don't sabotage ATI like so many want to believe.
#105
a_ump
Okay, I wasn't sure what to think, and IMO you can call the Doom 3 coding whatever you want: great support for Nvidia and/or conspiracy against ATI. Either way, they could have easily fixed ATI's support in that game, but no matter now.

However, I found this article about the DX10.1 removal in Assassin's Creed interesting.

techreport.com/discussions.x/14707

So they responded that there were no image quality differences with DX10.1 compared to DX10, only performance improvements for compliant hardware. Then they state that they didn't want there to be a bad gaming experience. Why would increased performance lower the gaming experience? Just sounds like bullshit to me. We see benchmarks of great performance with ATI in a game that is TWIMTBP, then it's removed, and no thought has been given since to reinstating DX10.1.
#106
FordGT90Concept
"I go fast!1!11!1!"
a_ump: Then they state that they didn't want there to be a bad gaming experience. Why would increased performance lower the gaming experience?
Because the code for DirectX 10.1 was using a separate rendering path, so fixing bugs in the DirectX 10/9 rendering path could easily cause complications in the DirectX 10.1 code. It's easier just to remove the DirectX 10.1 render path and focus on improving the DirectX 10/9 path. Ya know, fix it for the masses, not the few.
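To illustrate that maintenance argument, here is a hedged sketch; the function and type names below are hypothetical and not taken from the game's actual code. Once the DX10/9 and DX10.1 passes live in separate functions, a fix applied to one has to be ported and re-tested in the other.

// Hypothetical sketch of two divergent render paths; not actual game code.
#include <cstdio>

struct FrameContext { int frame; };  // placeholder per-frame state

void RenderFrame_DX10(FrameContext &ctx)
{
    // DX10/DX9-style path used by the vast majority of cards at the time.
    // Any bug fix applied here does not automatically exist in the 10.1 path below.
    std::printf("frame %d: DX10 path\n", ctx.frame);
}

void RenderFrame_DX10_1(FrameContext &ctx)
{
    // Separate DX10.1 path for the small number of 10.1-capable cards.
    // The same fix has to be re-implemented and re-validated here.
    std::printf("frame %d: DX10.1 path\n", ctx.frame);
}

void RenderFrame(bool hasDX10_1, FrameContext &ctx)
{
    if (hasDX10_1)
        RenderFrame_DX10_1(ctx);  // the path removed by the patch
    else
        RenderFrame_DX10(ctx);    // the path kept and maintained for everyone
}

int main()
{
    FrameContext ctx = { 0 };
    RenderFrame(false, ctx);  // most users
    RenderFrame(true, ctx);   // DX10.1-capable cards before the patch
    return 0;
}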
#107
Braveheart
*looks at Q9450 with a weird, undecided face*
#108
TheMailMan78
Big Member
FordGT90Concept: Because the code for DirectX 10.1 was using a separate rendering path, so fixing bugs in the DirectX 10/9 rendering path could easily cause complications in the DirectX 10.1 code. It's easier just to remove the DirectX 10.1 render path and focus on improving the DirectX 10/9 path. Ya know, fix it for the masses, not the few.
Says the person with the Nvidia card.
#109
Wile E
Power User
TheMailMan78: Says the person with the Nvidia card.
I say it too. ;)
#110
TheMailMan78
Big Member
Why use an "inferior" path? I mean 10.1 runs better.
#111
Wile E
Power User
TheMailMan78: Why use an "inferior" path? I mean 10.1 runs better.
For a much smaller number of people. Would've meant extra dev time and money to perfect. It runs in DX10 on all of the modern cards; it only runs in DX10.1 on a small percentage. Some of their later games have it. It just wasn't a priority to add it back into AC.
#112
FryingWeesel
Just don't update AC; you can play the game with 10.1 if you don't update it, and it works perfectly.
#113
TheMailMan78
Big Member
Wile E: For a much smaller number of people. Would've meant extra dev time and money to perfect. It runs in DX10 on all of the modern cards; it only runs in DX10.1 on a small percentage. Some of their later games have it. It just wasn't a priority to add it back into AC.
So people with "bleeding edge" 10.1 get the shaft because Nvidia didn't develop accordingly?
#114
Wile E
Power User
TheMailMan78: So people with "bleeding edge" 10.1 get the shaft because Nvidia didn't develop accordingly?
nVidia didn't develop the title. Ubi did. Just like most of the major dev houses, they feel it's a waste to spend too much dev time on patching games unless they really need it. AC doesn't need 10.1 to run, so they felt their dev money was better spent elsewhere, especially considering that all modern cards can already run it in DX10 anyway. Why implement and debug DX10.1 for a minority of gamers, on a game that is perfectly stable without it, when they can use that dev time for newer titles?

Some of their newer titles do have 10.1 back in them.
#115
a_ump
I believe he was referring to Nvidia failing to implement DX10.1 on their GPUs, not the game itself. I agree, lol. Why can't a very successful company like Nvidia implement DX10.1 when ATI could? I do realize that ATI released their first DX10/10.1 card six months after the G80, but damn, Nvidia's had plenty of time to get DX10.1. It just amazes me how much pull Nvidia has over so many other large companies like Microsoft. I realize that it made sense for Microsoft to lower the DX10 requirements at first because Nvidia had the only DX10-compatible cards, but after SP1 it should have been strictly DX10.1. That could have turned the tables in an instant, and performance in games such as Crysis and STALKER might be much better.